This is why LLMs need to pay for content — they need to get the answers RIGHT!!!!
✔️ New research from the European Broadcasting Union (EBU) and the BBC found that leading AI assistants misrepresent news content in almost half of their responses.
✔️ The study analyzed 3,000 responses from AI assistants like ChatGPT, Copilot, Gemini, and Perplexity across 14 languages.
✔️ Overall, 45% of AI responses had significant issues, and 81% had some form of problem.
✔️ A third of AI responses showed serious sourcing errors, with Gemini exhibiting significant issues in 72% of its responses.
✔️ Accuracy issues, including outdated information, were found in 20% of the responses.
✔️ The EBU warned that unreliable AI news could erode public trust and deter democratic participation.
✔️ The report urges AI companies to improve their assistants' accuracy in responding to news-related queries.
We are doing validation testing, and the errors are tedious to track but occur at a surprisingly high rate.
By the time Google wakes up to this reality there will be no quality content on the web that is not behind a login...
Very salient information.
It’s great we are all now blocking the bots that offer no compensation… but now we need to turn the corner and get realtime information into these platforms with fair economics for content owners. It was bad enough when Facebook became the #1 source of news for most Americans…
Thank you for highlighting this massive effort Matthew Scott Goldstein (msg) … I would love to see some publishers with direct integrations into these platforms (namely, ChatGPT) being more transparent about the accuracy of their outputs. Or are they bound by confidentiality?
Reporter for generative-ai-newsroom.com
If you are interested in how the BBC conducted the original study published in February, I talked to the data scientist: How to Check if AI Assistants Are Distorting Your News Stories https://generative-ai-newsroom.com/how-to-check-if-ai-assistants-are-distorting-your-news-stories-1fd7d2002a1c