• hoot@lemmy.ca · 9 months ago

      I am concerned to think of all the terrible and just plain wrong information you have been given.

        • RidcullyTheBrown@lemmy.world · 9 months ago

          Most of the time, information that you’re doing something wrong should be enough to prompt you to dig deeper into the matter. It’s not the job of perfect strangers to educate you.

            • RidcullyTheBrown@lemmy.world · 9 months ago

              You should tattoo “I’m wrong but I don’t want to educate myself” on your forehead so people know not to waste their time.

              • unconfirmedsourcesDOTgov@lemmy.sdf.org · 9 months ago

                Lemmy has a tagging system. I definitely recommend tagging this user so you get a warning that they might be wrong but don’t want to educate themselves. Then you can just ignore them and move on with your day.

    • FluffyPotato@lemm.ee · 9 months ago

      The problem with using it as a search engine is that when it doesn’t know the answer, it commonly makes things up. I tried using it for work, but it got details wrong often enough to make it useless.

        • FluffyPotato@lemm.ee · 9 months ago

          You could in the past. About 6 years ago or so, the top 3 results were almost always correct.

          Currently you can’t, because of AI-generated content that gets things wrong in the same way that using an AI as a search engine does.