Google has everything it needs to take on ChatGPT – here’s what it’s already shown

ChatGPT’s ability to respond to questions in a conversational and direct manner has led some to declare that chat AI will kill the traditional search engine. Google is taking this seriously, and it should be more than competitive – as it has already shown. The open question is the user experience.

Questions and answers

Fundamentally, Google’s mission to “organize the world’s information and make it universally accessible and useful” can be divided into two components.

Users ask questions and Google provides answers. Queries – first keywords, then naturally phrased questions – were originally typed into a box and later spoken. Answers started as links to websites that might contain relevant information, but they have evolved as well.

Google started providing instant answers to simpler questions that are more or less factual, using information from databases, lists, and, often, Wikipedia. This shift to direct answers coincided with smartphones and their comparatively smaller screens becoming the primary device. Then came wearables and other audio-first devices such as smart speakers and smart displays.

Other questions can’t be answered so easily, but Google still tries with something called a featured snippet: a direct quote from a website that it believes answers your question. In recent years, Google has been criticized for these snippets from all sides. Sometimes it quotes an obviously wrong source, while the owners of quoted content accuse Google of stealing clicks by keeping users on the results page.

These more complex questions are exactly what ChatGPT excels at, generating an answer to a wide range of queries rather than sending you somewhere else. Early adopters have embraced this, and some believe the future of search means getting direct answers every time, with a back-and-forth that lets you ask follow-ups. ChatGPT can even ask clarifying questions about your query as needed. It can also debug code, write articles (down to specific paragraphs), summarize, annotate, and much more.

What Google has

LaMDA

Google has been working on the same language model technology that underpins ChatGPT for some time, albeit in a less flashy way. It has given its work on natural language understanding (NLU) and large language models top billing at I/O for two developer conferences in a row now.

LaMDA (Language Model for Dialogue Applications) is Google’s “most advanced conversational AI to date.” It was revealed at I/O 2021 as being able to “talk about any topic,” with the caveat that it was still in the research and development phase. Google’s examples of conversations with Pluto and a paper airplane were meant to show how LaMDA “captures many of the nuances that characterize open-ended conversations,” including sensible, specific responses that encourage more back-and-forth.

Other qualities Google is striving for include “interestingness” – whether responses are insightful, unexpected, or witty – and “factuality,” or sticking to the facts.

A year later, LaMDA 2 was announced, and Google began allowing public testing of three specific LaMDA demos through the AI Test Kitchen app.

MUM

Alongside LaMDA, Google has highlighted multimodal models that “allow people to naturally ask questions across different types of information” with MUM (Multitask Unified Model). Notably, Google offered an example query that today’s search engine can’t answer, but that this new technology can handle:

I climbed Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?

MUM would understand that you are comparing two mountains, and that the timeframe you gave falls during Mt. Fuji’s rainy season, which calls for waterproof gear. Articles written in Japanese could surface where there is more local information, while the most impressive example involved Google Lens:

So now imagine taking a picture of your hiking boots and asking, “Can I use them for hiking Mount Fuji?” MUM would understand the content of the image and the intent of your question, tell you that your hiking boots would work just fine, and then point you to a list of recommended gear and a Mt. Fuji blog.

That was still an exploratory query, but more concretely, Google announced that MUM was coming to Lens so that you can take a picture of a broken bike part (whose name you don’t know) and get instructions on how to fix it.

PaLM

If MUM lets questions be asked across a variety of media and LaMDA can hold a conversation, then PaLM (Pathways Language Model) is what can answer questions. It was announced in April and received an onstage mention at I/O. PaLM can do the following:

Question answering, semantic parsing, proverbs, arithmetic, code completion, general knowledge, reading comprehension, summarization, logical inference chains, common-sense reasoning, pattern recognition, translation, dialogue, joke explanations, physics QA, and language understanding.

It is powered by a next-generation AI architecture called Pathways, which can “train a single model to do thousands or millions of things,” in contrast to today’s approach of training highly individualized models.

Down to the products

When Google announced LaMDA in 2021, Sundar Pichai said that its “natural conversational capabilities have the potential to make information and computing more accessible and easier to use.”

Google Assistant, Search, and Workspace were specifically name-checked as products Google hoped to “incorporat[e]” better conversational features into. Google could also offer “capabilities for developers and enterprise customers.”

In this post-ChatGPT world, more than a few people have commented that direct answers could hurt Google’s ad-driven business model, the reasoning being that people will no longer need to click on links if they get the answer outright. In the examples Google has shown, there is no indication that it wants to stop linking out to content.

There are significant safety and accuracy concerns, which Google has consistently emphasized when demoing this technology. The fact that these models can make things up seems to be the biggest bottleneck, more than anything else.

Meanwhile, it’s not clear if people want every interaction with a search engine to be a conversation. However, Google has internally acknowledged that the conversational approach “really meets a need that people seem to have.”

Google is said to have declared a “code red” over ChatGPT, reassigning various teams to work on competing AI products and features. Another showcase of this technology at I/O 2023 is more than likely, but whether that means LaMDA, MUM, and PaLM will be more prominently integrated into Google’s biggest products remains to be seen.

Back in May, Pichai reiterated how “conversational and natural language processing are two powerful ways to make computers more accessible to everyone.” Of everything the company has previewed, the ultimate goal is to make Google Search able to answer questions like a human can.

Unsurprisingly, Google has the technology to get there, but the company’s eternal challenge is turning research and development into actual products, and caution is not unwise for a search engine that the world expects to be consistently correct.
