• AnarchistArtificer@slrpnk.net · 1 day ago

    Oftentimes, when these deepfake nudes are generated, the most significant real-world harm comes from what happens when they get circulated. I know a teacher at a school where this was an issue, and the girl who was the victim was actually interviewed by the police, because if it had been a genuine image, she could’ve been charged with creating CSAM.

    The image had been shared around the school, and the girl in question felt humiliated, even though it wasn’t her real body. If everyone thinks it’s you in the image, it’s hard to fight that rumour mill. As to why she cared, well, even if you, as an individual, try really hard not to care, it turns out that a lot of people do care. A lot of people called her a slut for taking such provocative images of herself, even though that’s not actually what happened.

    This goes beyond the deepfake side of things. I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job. The problem is that the individual whose nudes (or faked nudes) are shared isn’t always the one with the biggest problem with them being seen naked.

    You’re free to think about people naked as much as you like. Hell, if you wanted to generate deepfake nudes, that’d be unethical as hell in my view, but there’s little that could be done to stop you. Do whatever you like in the privacy of your own mind, but if people are getting weirded out, that suggests it didn’t stay contained within one person’s mind.

    • lmmarsano@lemmynsfw.com · 5 hours ago

      > A lot of people called her a slut for taking such provocative images of herself, even though that’s not actually what happened.

      I detect an obvious, unethical solution.

      > I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job.

      Seems like a failure of society & maybe an opportunity to shake down a former employer for a lawsuit payout.

    • gandalf_der_12te@discuss.tchncs.de · 18 hours ago

      > The image had been shared around the school, and the girl in question felt humiliated, even though it wasn’t her real body. If everyone thinks it’s you in the image, it’s hard to fight that rumour mill. As to why she cared, well, even if you, as an individual, try really hard not to care, it turns out that a lot of people do care. A lot of people called her a slut for taking such provocative images of herself, even though that’s not actually what happened.

      > This goes beyond the deepfake side of things. I know someone whose ex distributed nudes that she had sent to him (revenge porn, basically), and it led to her being fired from her job. The problem is that the individual whose nudes (or faked nudes) are shared isn’t always the one with the biggest problem with them being seen naked.

      Okay, then the logical step is to educate the population about the possibility that nude images are AI-generated.

      • nwtreeoctopus@sh.itjust.works · 17 hours ago

        Sure. We already do that to some extent. But we do the same basic thing around not believing rumors, and that doesn’t obviate the harm they do.

        Putting aside the issue of how many people want the rumor to be true or the deepfakes to be real, people expending effort to say or produce something harmful or uncomfortable is hurtful to the subject/victim. The very idea that people could believe it is hurtful.

        This is all exacerbated with young people, because their brains are wired to care more about peer socialization and perception than adults’ brains are.

        Even things we know aren’t true damage our reputations and perceptions. I know JD Vance didn’t fuck a couch, but it’s one of the first things that comes to mind when he’s mentioned.

        Education about the reality of AI-generated nudes isn’t a bad thing (and, like, every teen already knows this is a thing anyway), but it doesn’t stop the harm to the subject from being associated with the material.

    • moopet@sh.itjust.works · 17 hours ago

      I think this is going to change radically in the near future, when people switch to assuming everything they see isn’t real unless there’s solid evidence. At the moment, a lot of the population assumes AI images are real, and that’s going to flip at some point.