Google Admits Its AI Overview Search Feature Is Glitchy
When bizarre and misleading answers to search queries generated by Google’s new AI Overview search feature spread on social media last week, the company issued statements that generally downplayed the notion that the technology was problematic. Late on Thursday, the company’s head of search, Liz Reid, acknowledged that the errors highlighted areas for improvement, writing: “We want to explain what happened and the steps we have taken.”
Reid’s post directly addressed two of the most viral and wildly inaccurate AI Overview results: one in which Google’s algorithm endorsed eating rocks because doing so “might be good for you,” and another that suggested using non-toxic glue to thicken pizza sauce.
Eating rocks is not a topic many people write or ask about online, so there aren’t many sources for a search engine to draw on. According to Reid, the AI tool found an article from The Onion, a satirical website, that had been reposted by a software company, and it misinterpreted the information as factual.
Regarding Google telling users to put glue on pizza, Reid attributed the error to a failure to detect humor. “We have seen AI Overviews with satirical or troll-y content from discussion forums,” she wrote. “Forums are often a great source of first-hand, factual information, but in some cases they can lead to less helpful advice, such as using glue to get cheese to stick to pizza.”
It’s probably best not to follow any AI-generated dinner menu without reading it carefully first.
Reid also suggested that judging the quality of Google’s new search feature based on viral screenshots would be unfair. She claims the company did extensive testing before launch, and that its data shows people appreciate AI Overviews, including indications that users are more likely to stay on pages discovered that way.
Why, then, the embarrassing failures? Reid characterized the mistakes that gained attention as the result of an internet-wide audit that was not always well intentioned. “There’s nothing like having millions of people using this feature with many new searches,” she wrote. “We’ve also seen new searches that don’t make sense, seemingly aimed at generating erroneous results.”
Google claims that some widely distributed screenshots of erroneous AI Overviews are fake, which appears to be true based on WIRED’s own testing. For example, a user on X posted a screenshot appearing to show an AI Overview answering the question “Can a cockroach live in your penis?” with an enthusiastic confirmation from the search engine that this is normal. The post has been viewed more than 5 million times. Upon closer inspection, however, the format of the screenshot does not match how AI Overviews are actually presented to users, and WIRED could not recreate anything close to that result.
And it’s not just users on social media who have been fooled by misleading screenshots of fake AI Overviews. The New York Times issued a correction to its report on the feature, clarifying that AI Overview never suggested users should jump off the Golden Gate Bridge if they were depressed; that was just a dark meme on social media. “Others have implied that we returned dangerous results for topics like leaving dogs in cars, smoking while pregnant, and depression,” Reid wrote on Thursday. “Those AI Overviews never appeared.”
However, Reid’s post also makes clear that all was not well with Google’s big new search upgrade. She wrote that the company has made “more than a dozen technical improvements” to the AI Overview system.
Only four are described: better detection of “nonsensical queries” that don’t merit an AI Overview; making the feature less reliant on user-generated content from sites like Reddit; offering AI Overviews less often in situations where users haven’t found them helpful; and strengthening the guardrails that disable AI summaries on important topics such as health.
Reid’s blog post makes no mention of any significant rollback of AI Overviews. Google said it will continue to monitor feedback from users and adjust the feature as needed.