Data poisoning: how artists are sabotaging AI to take revenge on image generators

As AI developers indiscriminately suck up online content to train their models, artists are seeking ways to fight back.

  • RagingRobot@lemmy.world · 11 months ago

    So it sounds like they alter the image data to make this work: the picture still looks the same to a person, only the underlying data is different. Couldn't the AI companies just take screenshots of the images to get around this?
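    As a rough illustration of what that "screenshot" idea amounts to in practice, here is a minimal sketch that resamples an image to a typical display width and re-encodes it as JPEG, which replaces the exact pixel values a perturbation lives in. This is an assumption-laden example, not anything from the article: Pillow as the library, the 1280 px width, the JPEG quality, and the file names are all placeholders.

    ```python
    # Sketch only: approximate a "screenshot" by resampling to display size and
    # re-encoding. Library choice, width, quality, and paths are assumptions.
    from PIL import Image

    def rerender_like_screenshot(src_path: str, dst_path: str, display_width: int = 1280) -> None:
        """Resample to display resolution and re-encode, replacing the original pixel data."""
        img = Image.open(src_path).convert("RGB")
        scale = display_width / img.width
        new_size = (display_width, max(1, round(img.height * scale)))
        img.resize(new_size, Image.LANCZOS).save(dst_path, "JPEG", quality=90)

    # Placeholder paths for illustration.
    rerender_like_screenshot("poisoned_art.png", "rerendered.jpg")
    ```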

    • The Uncanny Observer@lemmy.dbzer0.com · 11 months ago

      Not even that: they can run the training dataset through a bulk image processor to undo it (roughly the kind of pass sketched below), because the way these perturbations work makes them trivial to reverse. Anybody at home could undo it with GIMP in a second or two.

      In other words, this is snake oil.
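
      A minimal sketch of the kind of bulk pass described above, assuming Pillow: a mild Gaussian blur plus a lossy re-encode over a folder of images. The blur radius, JPEG quality, and folder names are illustrative assumptions, and whether such a pass actually defeats any particular poisoning scheme is an empirical question the comment asserts rather than demonstrates.

      ```python
      # Sketch only: "wash" every image in a folder with a light blur and a
      # lossy re-encode. Radius, quality, and paths are placeholder assumptions.
      from pathlib import Path
      from PIL import Image, ImageFilter

      def wash_dataset(src_dir: str, dst_dir: str, radius: float = 0.8, quality: int = 85) -> None:
          """Blur slightly and re-encode each image, discarding fine pixel-level detail."""
          out = Path(dst_dir)
          out.mkdir(parents=True, exist_ok=True)
          for path in sorted(Path(src_dir).iterdir()):
              if path.suffix.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
                  continue
              img = Image.open(path).convert("RGB")
              img.filter(ImageFilter.GaussianBlur(radius)).save(
                  out / (path.stem + ".jpg"), "JPEG", quality=quality
              )

      # Placeholder folder names for illustration.
      wash_dataset("scraped_images", "scraped_images_washed")
      ```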