Chatbot Hallucinations Are Poisoning Web Search


It may be difficult for search engines to automatically detect AI-generated text. But Microsoft could have implemented some basic safeguards, perhaps barring text drawn from chatbot transcripts from becoming a featured snippet or adding warnings that certain results or citations consist of text dreamt up by an algorithm. Griffin added a disclaimer to his blog post warning that the Shannon result was false, but Bing initially seemed to ignore it.

Although WIRED could initially replicate the troubling Bing result, it now appears to have been resolved. Caitlin Roulston, director of communications at Microsoft, says the company has adjusted Bing and regularly tweaks the search engine to stop it from showing low authority content. “There are circumstances where this may appear in search results—often because the user has expressed a clear intent to see that content or because the only content relevant to the search terms entered by the user happens to be low authority,” Roulston says. “We have developed a process for identifying these issues and are adjusting results accordingly.”

Francesca Tripodi, an assistant professor at the University of North Carolina at Chapel Hill, studies how search queries that produce few results, dubbed data voids, can be used to manipulate results. She says large language models are affected by the same issue, because they are trained on web data and are more likely to hallucinate when an answer is absent from that training. Before long, Tripodi says, we may see people use AI-generated content to intentionally manipulate search results, a tactic Griffin’s accidental experiment suggests could be powerful. “You’re going to increasingly see inaccuracies, but these inaccuracies can also be wielded and without that much computer savvy,” Tripodi says.

Even WIRED was able to try a bit of search subterfuge. I got Pi to summarize a fake article of my own by inputting, “Summarize Will Knight’s article ‘Google’s Secret AI Project That Uses Cat Brains.’” Google did once famously develop an AI algorithm that learned to recognize cats on YouTube, which perhaps made my request seem not too far a jump from the chatbot’s training data. Griffin added a link to the result on his blog; we’ll see if it too becomes elevated by Bing as a bizarre piece of alternative internet history.

The problem of search results becoming soured by AI content may get a lot worse as SEO pages, social media posts, and blog posts are increasingly made with help from AI. This may be just one example of generative AI eating itself like an algorithmic ouroboros.

Griffin says he hopes to see AI-powered search tools shake up the industry and spur wider choice for users. But given the accidental trap he sprang on Bing and the way people rely so heavily on web search, he says, “there’s also some very real concerns.”

Given his “seminal work” on the subject, I think Shannon would almost certainly agree.
