• RememberTheApollo_@lemmy.world · 1 year ago

    Odd.

I can’t see a conversation with a computer as a real conversation. I grew up with computers from the Atari era and played around with several publicly accessible programs that you could “chat” with.

They all suck. Doesn’t matter if it’s a “help” program, a phone menu, website help, or even playing around with chatGPT…they’re not human. They don’t respond correctly, they get too general or generic in their answers, they repeat themselves; there are just too many giveaways that you’re not having a real conversation, just getting responses from a system that’s trying to pick the most likely reply that fits the pattern.

    So how are people having “conversations” with a non-living entity?

• Hobo@lemmy.world · 1 year ago

It’s escapism, I think. At least that’s part of it. Having a machine that won’t judge you, will serve as a perfect echo chamber, and will immediately tell you AN answer can be very appealing to some. I don’t have any data or any study to back it up, just my experience of seeing it happen.

I have a friend who I feel like I’ve kind of lost to chatgpt. I think he’s a bit unhappy with where he is in life. He got the good-paying job, the house in the suburbs, the wife, and the 2.5 kids, but never thought about what was next. Now he’s just a bit lost, I think, and has somehow convinced himself that people aren’t as good as chatting with a bot.

      It’s weird now. He spends long nights and weekends talking to a machine. He’s constructed elaborate fictional worlds within his chatgpt history. I’ve grown increasingly concerned about him, and his wife clearly is struggling with it. He’s obviously depressed but instead of seeking help or attempting to figure himself out, he turned to a non-feeling, non-judgmental, emotionless tool for answers.

It’s a struggle to talk to him now. It’s like talking to a cryptobro at peak BTC mania. The only thing he wants to talk about is LLMs. Trying to suggest that maybe spending all your time talking to a machine is a bit unhealthy invokes his ire, and he’ll avoid you for several days. Like a heroin addict struggling with addiction, even pointing out the obvious flaws in what he’s doing just makes him distance himself further from you.

I’m not young, and not exactly old either, but I’ve known him for 25 years of my adult life. We met in college and have been friends ever since. I know many won’t quite understand, but knowing someone that long, and remaining close, talking every few days, is quite rare. At this point he is my longest-held friendship, and I feel like I’m losing him to a robot. I’ve lost other friends to addiction in my life, and to say that this has been similar is understating it. I don’t know what to do for him. I don’t know if there’s really anything I CAN do for him. How do you help someone who doesn’t even think they have a problem?

I guess my point is, if you find someone who is just depressed enough, just stuck enough, with a particular proclivity towards computers/the internet, then you have a perfect candidate for falling down the LLM rabbit hole. It offers them an out from feeling like they’re being judged. They feel like the insanity it spits out is more sane than how they feel now. They think they’re getting somewhere, or at least escaping their current situation. Escapism is very appealing when everything else seems pointless and sort of gray, I think. So that’s at least one type of person who can fall down the chatgpt/LLM rabbit hole. I’m sure there are others out there too, with their own unique motivations and reasons for latching onto LLMs.

• okmko@lemmy.world · 1 year ago

        Wow, thank you for sharing your experience.

How are you not voted higher? People on Lemmy complain about not having long-form content that offers a unique perspective like early Reddit did, but you’ve written exactly that.

• RememberTheApollo_@lemmy.world · 1 year ago

Guess that should have crossed my mind. People marrying human-like dolls and all that. One gets so far down the hole of whatever mental issues are plaguing the mind, and something inanimate that only reflects what you want to see becomes the preferable reality.

• mbp@lemmy.sdf.org · 1 year ago

Awesome perspective! I worked with and around seriously depressed possession hoarders for about a year, and the majority were the type to call you randomly just to chat about something or other. That’s exactly the priming situation that could slide into abusing LLM tech if given easy access to it. This was before the days of chatgpt, but I do worry that some of my old clients are falling into this same situation, with far less nuance than your friend.

• RememberTheApollo_@lemmy.world · 1 year ago

Until someone (or something?) else comes along, we have only ourselves to judge reality. Maybe AI will decide we aren’t real at some point…