Decoding Internet Myths: The Truth Behind ‘Google Recommends’ Unconventional Ingredients for Cooking
UPDATE: 2024/05/29 15:57 EST BY CORBIN DAVENPORT
Statement From Google
Google told How-To Geek in a statement, “The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web. Many of the examples we’ve seen have been uncommon queries, and we’ve also seen examples that were doctored or that we couldn’t reproduce. We conducted extensive testing before launching this new experience, and as with other features we’ve launched in Search, we appreciate the feedback. We’re taking swift action where appropriate under our content policies, and using these examples to develop broader improvements to our systems, some of which have already started to roll out.”
The original article follows below.
Google has started rolling out AI overviews to Google Search in the United States, following months of testing as the Search Generative Experience (SGE). The new feature is not going well, as AI responses are recommending everything from eating rocks to using gasoline in spaghetti.
Google announced during its Google I/O event last week that it would start rolling out generative AI responses to web searches in the United States, called “AI Overviews.” The idea is that Google will use the top web results for a given query to help answer questions, including multi-step questions, without the need to click through multiple results. However, the rollout is not going well, as the AI Overviews feature doesn’t seem to be good at judging which information sources are legitimate.
In a search asking about cheese not sticking to pizza, Google recommended adding “about 1/8 cup of non-toxic glue to the sauce,” possibly because it indexed a joke Reddit comment from 2013 about adding Elmer’s glue when cooking pizza. It told another person that “you should eat one small rock per day,” based on a parody article reposted from The Onion to the blog for a subsurface engineering company.
When asked if gasoline can cook spaghetti faster, Google said, “No, you can’t use gasoline to cook spaghetti faster, but you can use gasoline to make a spicy spaghetti dish,” and then listed a fake recipe. In another search, Google said, “as of September 2021, there are no sovereign countries in Africa that start with the letter ‘K’. However, Kenya is the closest country to starting with a ‘K’ sound.” That might have come from a Y Combinator post from 2021 that was quoting a faulty ChatGPT response.
Google’s AI Overview told someone that President James Madison graduated from the University of Wisconsin 21 times. When asked earlier this month “how to pass kidney stones quickly,” it said, “You should aim to drink at least 2 quarts (2 liters) of urine every 24 hours.” In fairness, that was when AI responses were still an experimental feature and not fully rolled out, but the AI feature doesn’t seem to have become any smarter since that point. I tried a Google Search yesterday for “how many bugs should I eat in a day,” and it told me, “According to Quora, the average person eats 15-18 insects each night.”
There’s a common theme with these answers: the AI Overview feature doesn’t have a good sense of which sources are reliable. Reddit, Quora, and other sites are a mix of useful information, jokes, and inaccurate information, and the AI can’t tell the difference. That’s not surprising, given that it can’t think like a human and use context clues, but these answers are also worse than those from other AI tools like ChatGPT and Microsoft Copilot.
Google told The Verge that the mistakes came from “generally very uncommon queries, and aren’t representative of most people’s experiences,” and that the company is taking action against inaccurate responses. My search for “how many bugs should I eat in a day” doesn’t have an AI Overview at all anymore. That’s not fixing the problem, though; it’s just manually patching results after they go viral on social media for being hilariously wrong. How many wrong answers will go unnoticed?
Data analysis and understanding context have been issues with all generative AI tools, but the new AI Overviews feature seems especially bad. Google executives and engineers spent nearly two hours on stage at Google I/O hyping up its AI features, evangelizing the technology’s usefulness and ability to help us in every facet of our lives. Only one week later, Google’s AI is telling us to eat glue.