Mad_Punda.de@feddit.de to Technology@lemmy.world • CEO of Google Says It Has No Solution for Its AI Providing Wildly Incorrect Information (English)
521 · 7 months ago

These hallucinations are an "inherent feature" of AI large language models (LLMs), which is what drives AI Overviews, and this feature "is still an unsolved problem."
Then what made you think it’s a good idea to include that in your product now?!
It’s not missing from the discussion; the HH publisher literally mentioned the sales numbers and that it’s a solo dev. I’m confused about what you mean.