AI girlfriend bots are already flooding OpenAI’s GPT store::OpenAI’s store rules are already being broken, illustrating that regulating GPTs could be hard to control
…I may be too green to see something here, but wouldn't simply saving the month, year, topic, mood and quote be enough? If the AI needs everything formatted as a certain kind of input, run it through an API. Teach the AI to save only the moments where the user uses agitated language or something, and to periodically check whether the current convo allows for a throwback, for example by topic, with a more advanced query when the user asks if the AI remembers something.
Then sell all this data for fat profit.
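A rough sketch of what that naive log could look like, assuming a Python-ish setup (the class names, the mood filter and the example values are all made up for illustration, not anyone's actual implementation):

```python
from dataclasses import dataclass

@dataclass
class MemoryEntry:
    month: int
    year: int
    topic: str
    mood: str
    quote: str

class NaiveMemoryLog:
    """The 'just save month, year, topic, mood and quote' idea."""

    def __init__(self) -> None:
        self.entries: list[MemoryEntry] = []

    def maybe_save(self, month: int, year: int, topic: str, mood: str, quote: str) -> None:
        # Only keep emotionally charged moments, as suggested above.
        if mood in {"agitated", "excited", "sad"}:
            self.entries.append(MemoryEntry(month, year, topic, mood, quote))

    def recall(self, topic: str) -> list[MemoryEntry]:
        # 'Throwback' check: anything saved under the current topic?
        return [e for e in self.entries if e.topic == topic]

log = NaiveMemoryLog()
log.maybe_save(3, 2023, "movies", "excited", "Interstellar")
print(log.recall("movies"))  # finds the quote, but nothing about what it meant
```

The lookup works, but as the reply below points out, the saved quote carries no meaning on its own.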
So imagine a convo:

User: Just watched Interstellar. That ending wrecked me, in a good way.

One year later:

User: Hey, do you remember that movie I couldn't stop talking about?

Now the AI can find the message that said 'Interstellar' in the history, but without any context. To know you were talking about the movie it would have to analyze the entire conversation again. And the emotional charge of the message can also change instantly:

User: I rewatched Interstellar and honestly the ending doesn't hold up at all.
What would the AI 'remember'? It would require some higher level of understanding of the conversation, and the 'memories' would have to be updated all the time. It's just not possible to replicate with a simple log.
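To make the 'updated all the time' part concrete, here's a toy sketch of what revising a stored memory would involve. The Memory shape and the keyword check are invented stand-ins; in a real system the revision step would itself have to be another model call, which is exactly the expensive 'higher level of understanding' a plain log doesn't give you:

```python
from dataclasses import dataclass

@dataclass
class Memory:
    topic: str
    mood: str
    summary: str  # a digested statement, not a raw quote

def revise(old: Memory, new_message: str) -> Memory:
    # Toy stand-in for re-interpretation: one new message can invert the
    # stored emotional charge of an old one, so the memory must be
    # re-derived, not just appended to.
    if "hated" in new_message or "doesn't hold up" in new_message:
        return Memory(old.topic, "negative", f"user soured on: {old.summary}")
    return old

m = Memory("movies", "excited", "user loved Interstellar")
m = revise(m, "I rewatched Interstellar and honestly the ending doesn't hold up at all")
print(m.mood, "-", m.summary)  # negative - user soured on: user loved Interstellar
```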
Thanks for the examples. Yeah, that really ain't so simple after all… and hard af to foolproof. :/
It’s when people dive into this sort of memory stuff that I always remember: “oh yeah, this is why people call it a stochastic parrot.”
LLMs can do a lot. But without memory, they run into walls fast.