The article mentions more research is needed to confirm if the effect is long-lasting, but personally I’m happy someone may have found a good, practical usecase for LLMs.
As long as they’re not hallucinating, which anyone (including conspiracy theorists) can prompt them to do. Then they turn into conspiracy-confirming machines.
Or a Truth Social LLM gets released, and then the entire thing is a hallucination.