
Google Search is making things up

Smartphone with Google search. IMAGO/Filippo Carlot via Reuters Connect
Google has begun adding artificial intelligence-generated answers when users type questions into its search engine. Many users have found that the AI-generated answers range from simply bizarre to flat-out wrong. The search engine’s AI Overviews feature has told users to put glue on pizza to keep the cheese from falling off, that elephants have only two feet, and that people should eat one rock per day for nutritional value. It even told me that, in fact, dogs have played in the National Football League.

Google has defended its new feature, saying that these strange answers are isolated incidents. “The vast majority of AI overviews provide high-quality information, with links to dig deeper on the web,” the tech giant told the BBC. The Verge reported that Google is manually removing embarrassing search results after users post what they find on social media.

This is Google’s second major faux pas in its quest to bring AI to the masses. In February, after it released its Gemini AI system, its image generator kept over-indexing for diverse images of individuals — even when doing so was wildly inappropriate. It spit out Black and Asian Nazi soldiers and Native Americans dressed in Viking garb.

The fact that Google is willing to introduce AI into its cash cow of a search engine signals it is serious about integrating the technology into everything it does. It’s even decided to introduce advertising into these AI Overviews. But the company is quickly finding out that when AI systems hallucinate, not only can that spread misinformation — but it can also make your product a public laughingstock.
