• Clbull@lemmy.world

    Isn’t CSAM classed as images and videos that depict child sexual abuse? Last time I checked, written descriptions alone did not count — unless they were being forced to look at AI-generated images of such acts?

    • Strawberry@lemmy.blahaj.zone

      That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.

      This is the quote in question. They’re talking about images.