It’s messy legislation all around. When does it become porn vs art vs just erotic or satirical? How do you prove it was a deep fake and not a lookalike? If I use a porn actress to make a deep fake is that also illegal or is it about how the original source content was intended to be used/consumed?
I’m not saying that we should just ignore these issues, but I don’t think any of this will be handled well by any government.
That’s easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.
Y’all need to be more cynical. These lobby groups do not make arguments because they believe in them, but because it gets them what they want.
I was responding to an above comment. The guy who was arrested in the OP’s article was posting clips from movies (so not deepfakes).
That being said, for deepfakes, you’d need the original video to prove it was deepfaked. Additionally, you’d then probably need to prove they used a real person to make the deep fake. Nowadays it’s easy to make “fake” people using AI. Not sure where the law sits on creating deepfakes of fake people who resemble other people.
I didn’t make the point clear. The original scenes themselves, as released by the studio, may qualify as “deepfakes”. A little bit of digital post-processing can be enough to qualify them under the proposed bills. Then sharing them becomes criminal, fair use be damned.
Same as anything else: if it causes someone harm (in America, financial harm counts), it gets regulated.
There are exceptions that allow people to disregard laws as well. It’s legal to execute a death row prisoner.
Actually I was thinking about this some more and I think there is a much deeper issue.
With the advent of generative AI, photographs can no longer be relied upon as documentary evidence.
There’s the old saying, ‘pics or it didn’t happen’, which flipped around means sharing pics means it did happen.
But if anyone can generate a photo realistic image from a few lines of text, then pictures don’t actually prove anything unless you have some bulletproof way to tell which pictures are real and which are generated by AI.
And that’s the real point of a lot of these laws: to try and shove the genie back in the bottle. You can ban deepfake porn and order anyone who makes it to be drawn and quartered, you can make an AI watermark its output, but at the end of the day the genie is out of the bottle, because someone somewhere will write an AI that skips the watermark and passes the photos off as real.
I’m open to any possible solution, but I’m not sure there is one. I think this genie may be out of the bottle for good, or at least I’m not seeing any way that it isn’t. And if that’s the case, perhaps the only response that doesn’t shred civil liberties is to preemptively declare defeat, acknowledge that photographs are no longer proof of anything, and deal with that as a society.
One solution that’s been proposed is to cryptographically sign content. This way someone can prove they “made” the content. It doesn’t prove the content is real, but it means you can verify the originator.
However, at the end of the day, you’re still stuck with needing to decide who you trust.
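The signing idea can be sketched in a few lines. Real provenance schemes (C2PA-style content credentials, for instance) use asymmetric signatures so anyone can verify without holding the secret key; the HMAC below is only a dependency-free stand-in to show the bind-then-verify flow, and the key and photo bytes are made up for the example.

```python
import hashlib
import hmac
import secrets

def sign_content(key: bytes, content: bytes) -> str:
    """Return a hex tag binding the content to the key holder."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_content(key: bytes, content: bytes, tag: str) -> bool:
    """Check the tag; constant-time compare avoids timing leaks."""
    expected = sign_content(key, content)
    return hmac.compare_digest(expected, tag)

# Hypothetical creator key and image payload.
creator_key = secrets.token_bytes(32)
photo = b"...raw image bytes..."

tag = sign_content(creator_key, photo)
print(verify_content(creator_key, photo, tag))         # True
print(verify_content(creator_key, photo + b"x", tag))  # False: any edit breaks the tag
```

Note that this proves only who signed the bytes, not that the scene they depict ever happened, which is exactly the trust problem described above.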
Probably the best idea yet. It’s definitely not foolproof though. Best you could do is put a security chip in the camera that digitally signs the pictures, but that is imperfect because eventually someone will extract the key or figure out how to get the camera to sign pictures of their choosing that weren’t taken by the camera.
A creator level key is more likely, so you choose who you trust.
But most of the pictures that would be taken as proof of anything probably won’t be signed by one of those.
I’m fine with photos not proving anything.
Let statists cry about that one. Cry, little statists, that you can’t inflict pain in that justified way you love so much.
I’m not fine with that, as it will have wide-ranging repercussions on society at large that aren’t all good.
But I fully accept it as the cold hard reality that WILL happen now that the genie’s out of the bottle, and the reality that any ham-fisted legal attempt to rebottle the genie will be far worse for society and only delay the inevitable acceptance that photographs are no longer proof.
And as such, I (and most other adults mature enough to accept a less-than-preferred reality as reality) stand with you and give the statists the middle finger, along with everyone else who thinks you can legislate any genie back into its bottle. In the 1990s it was the ‘protect kids from Internet porn’ people, in the 2000s it was the ‘protect kids from violent video games’ and ‘stop Internet piracy’ people, I guess today it’s the ‘stop generative AI’ people. They are all children who think crying to Daddy will remake the ways of the world. It won’t.
I am infinitely more worried about the backlash and the enclosers than the tech itself.
That’s the appropriate reaction to many of these so-called threats to society. Internet chat rooms, generative AI, drugs, opioids, guns, pornography, trashy TV, you name it. I think it’s been pretty well demonstrated throughout history that the majority of the time some ‘threat to public safety’ comes out and a well-meaning group tries to get the government to shove the genie back in the bottle, the cure ends up being worse than the disease. And it’s a lot easier to set up bureaucracy than to dismantle it.
The sad thing is, whatever regulation they set up will be pointless. Someone will download an open source model and run it locally with the watermark code removed. Or some other nation will realize that hobbling their AI industry with stupid regulations won’t help them get ahead in the world and they will become a source for non-watermarked output and watermark free models.
So we hobble ourselves with some ridiculous AI enforcement bureaucracy, and it will do precisely zero good because the people who would do bad things will just do them on offshore servers or in their basement.
It applies everywhere else too. I’m all for ending the opioid crisis, but the current attempt to end opioids entirely is not the solution. A good friend of mine takes a lot of opioids, prescribed by a doctor, for a serious pain condition resulting from a car accident. This person’s back and neck are full of metal pins and screws and plates and whatnot.
For this person, opioids like OxyContin are the difference between being in constant pain and being able to do things like work out at the gym and enjoy life.
But because of the well-meaning war on opioids, this person and their doctor are persecuted. Pharmacies don’t want to deal with OxyContin, and the doctor is getting constant flak from insurance and the DEA for prescribing too much of it.
I mean really, a pain management doctor prescribes a lot of pain medication. That’s definitely something fishy that we should turn the screws on him for…
It’s really infuriating. In my opinion, the only two people who should decide what drugs get taken are a person and their doctor. For anyone else to try and intrude on that is a violation of that person’s rights.