New Zealand MP Laura McClure bravely exposed the dangers of deepfake technology by holding up a manipulated nude image of herself in Parliament, highlighting the need for legislative change to protect victims of deepfake abuse.
People have been doing this without AI for a very long time, even before computers were a thing. All AI did was make it easy.
Genie is out of the bottle. All we can do now is make it illegal to possess or distribute, the same way we handle CSAM.
I am not saying this shouldn’t be illegal, as distributing it hurts the victim, but CSAM is proof that a child has been abused. I think those are two very different things.
There’s also AI-generated CSAM now, and it can be generated via prompt loopholes. Still fucking sick if you ask me…
Definitely fucking sick, but “fucking sick” is no way to run a society. The problem with child porn is that it can only be made by sexually abusing children, so without that factor you have to ask: does AI-generated child porn embolden or mollify pedophiles? Answering that question requires rigorous scientific research, not knee-jerk reactions.
Anything AI-generated relies on training data of the real thing, so there’s no way to use a generator to “ethically” produce images of something unethical; the output is still based on the unethical imagery. There’s no pathway out of the original abuse.
I would disagree. In the same vein, Nazi and Imperial Japanese scientific experiments were pored over and used to further our understanding of human anatomy and the limits of the body, as well as a host of other things. The original experiments were horrific in the extreme, but they happened, and simply destroying that data would have helped no one.
Those children have already been abused. That material already exists. Would it not make sense to use it to train a program that fulfills the desires of those who would do that sort of thing, so that others do not need to be abused to produce it? It wouldn’t end the abuse outright, but it seems like it would help.