A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because it plainly and openly shows one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.
The Lemmy circlejerk is real, but excusing deepfake porn is pretty off-brand for us. I’m glad the comments on this post are uniformly negative.
https://sh.itjust.works/comment/10397565
https://kbin.social/m/[email protected]/t/927248/-/comment/5921190 just accept it as a new normal, it’s fine. Can’t possibly have any recourse, just accept it, women of the world, it’s the new normal!
Okay, there are a couple of douche canoes, but generally speaking, I think we’re okay on this one.
It is massively upvoted (for Lemmy).