Isn’t CSAM classed as images and videos that depict child sexual abuse? Last time I checked, written descriptions alone did not count, unless the workers were being forced to look at AI-generated image prompts of such acts?
That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.
This is the quote in question. They’re talking about images.