The article mentions that more research is needed to confirm whether the effect is long-lasting, but personally I'm happy someone may have found a good, practical use case for LLMs.
Have you even read the article?
Does the article say the headline is wrong? Or does it claim conspiracy theorists listen to facts based on a handful of willing participants who changed their minds when shown facts and reports? Because willing participants aren't the crux of the crazy conspiracy theorist problem.
Try again when the chatbot has talked to the likes of Graham Hancock or the hardcore MAGA death cult. Facts don't matter to them.
Just look at this guy, who straight up pretends that no one has tried to talk to them before.
It does talk about the Gish gallop at the very end, and claims that the chatbot can keep presenting counterarguments, but it doesn't actually say that this has worked.
No, just the headline. The article was good?