Google’s AI created a glue-eating stir, then learned to recommend glue again from the ensuing coverage


Summary

  • Gemini has been a hot mess since it started powering AI Overviews in Google Search, with inaccurate and dangerous results popping up.
  • User queries sometimes trigger nonsensical responses like adding glue to pizza, despite Google’s efforts to fix it.
  • Even with Google’s attempts to restrict certain content, AI Overviews can still surface inaccurate information: the infamous pizza glue advice is now repeating itself because the AI is ingesting news coverage of the first incident.




Google’s Gemini is a solid family of AI models that powers almost all AI aspects of Google’s phones and applications, but it has been a hot mess since it started powering the AI Overviews feature in Google Search.

The feature, which had been in beta for months before rolling out broadly in May, was announced at Google I/O 2024. Yes, the event where the tech giant said “AI” more than 120 times on stage. Overviews are essentially concise summaries of the information users are searching for, compressed into an easy-to-digest card. One week in, Google announced that sponsored ads would soon appear in AI Overviews. Users weren’t particularly happy about ads in Overviews, but all hell broke loose after a subsequent blunder.


Related

Google is in damage control mode following AI Overviews fiasco

Some of the examples found online were ‘doctored,’ Google says

The AI Overviews feature started showing users inaccurate and sometimes dangerous results: suggesting that astronauts had found cats on the Moon, advising users to eat rocks, and recommending that they add glue to their pizza to make the cheese stick, alongside a range of other results that Google later said might have been doctored.

The tech giant soon went into damage control mode and removed several of these inaccurate and dangerous results. It subsequently published a blog post detailing the incident and what went wrong. According to Google, the AI feature was pulling information for some user queries from satirical articles and Reddit comments, which resulted in dangerous recommendations like adding glue to pizza. The tech giant added that it has since built mechanisms to detect “nonsensical queries” so that AI Overviews is not triggered in those instances, and that it has restricted the inclusion of satirical and humorous content in Overviews.



Despite the measures, Google is still stuck in the pizza glue loop

Google’s AI Overviews still recommends adding glue to pizza, citing a credible journalist’s article about the feature’s initial blunder.

Source: Colin McMillen

As former Google employee Colin McMillen found, AI Overviews is still talking about adding glue to pizza. When asked “how much glue to add to pizza,” the feature did not replicate its old mistake outright; instead, it went looking for information in credible places. The correct answer here, as Overviews should know, is none. However, the feature picked up a recent article about Google’s initial pizza glue blunder by Business Insider’s Katie Notopoulos, in which she actually makes and taste-tests a pizza with 1/8 cup of non-toxic glue. In a sense, Overviews adhered to Google’s recent adjustments: it sourced its ‘accurate’ information from a credible outlet.


It appears that the tech giant is “taking swift action” yet again, as we weren’t able to recreate McMillen’s results. As of yesterday, though, The Verge confirmed that the obviously dangerous results were still up.

AI Overviews are currently only available in the US (sigh of Canadian relief). If you’re in the US, there are a few workarounds you can use to avoid seeing AI Overviews; one is sketched below.
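One widely shared workaround at the time of writing (our addition here, not something McMillen’s findings cover) relies on Google’s “Web” results filter, which is triggered by the udm=14 URL parameter and returns plain links without an AI Overview. Assuming Google keeps the parameter around, you can set your browser’s custom search engine to a URL template like this:

https://www.google.com/search?udm=14&q=%s

The %s is the placeholder most browsers substitute with your typed query, so every search goes straight to the Web-only results page.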

Related

How to turn off Google’s AI Overview feature

There is no direct way, but you can use these workarounds to get rid of AI Overviews from search results



