Last month, a detective in a small town outside of Lancaster, Pennsylvania, invited dozens of high school girls and their parents to the police station to undertake a difficult task: one by one, the girls were asked to confirm that they were depicted in hundreds of AI-generated deepfake pornographic images seized by law enforcement.

In a series of back-to-back private meetings, Detective Laurel Bair of the Susquehanna Regional Police Department slid each image out from under the folder’s cover, so only the girl’s face was shown, unless the families specifically requested to see the entire uncensored image.

“It made me a lot more upset after I saw the pictures because it made them so much more real for me,” one Lancaster victim, now 16, told Forbes. “They’re very graphic and they’re very realistic,” her mother said. “There’s no way someone who didn’t know her wouldn’t think: ‘that’s her naked,’ and that’s the scary part.” There were more than 30 images of her daughter.

The photos were part of a cache of images allegedly taken from 60 girls’ public social media accounts by two teenage boys, who then created 347 AI-generated deepfake pornographic images and videos, according to the Lancaster County District Attorney’s Office. The two boys have now been criminally charged with 59 counts of “sexual abuse of children” and 59 counts of “possession of child pornography,” among other charges, including “possession of obscene materials depicting a minor.”

  • sndmn@lemmy.ca (+50/-14) · 15 days ago

    Charging a child with possession of child porn seems wrong, especially when it’s AI-generated imagery and not actual CSAM.

    • mosiacmango@lemm.ee (+61/-4) · 15 days ago

      At the same time, allowing 60 children to have child porn generated with their likenesses and then passed around to other children and the internet at large also seems wrong, even if it’s generated CSAM instead of literal CSAM.

      I would personally consider generating porn of a child and then giving it to their peers an act of child sexual assault, even when committed by other children. That needs a legal response.

      • orclev@lemmy.world (+41/-6) · 15 days ago

        I think the argument in this case isn’t that a crime wasn’t committed, but rather that charging a minor with CSAM possession is inappropriate (particularly when the images are fake). Perhaps a different law needs to be made for these highly specific cases, as the existing CSAM laws typically carry very hefty sentences that don’t seem entirely appropriate in a case like this.

        • mosiacmango@lemm.ee (+16/-9) · 15 days ago

          Are they fake? They are the faces of real children in sexual and pornographic images.

          I agree there should be more specific laws, but this still seems to fall under the current ones to me. These are not fully artificial CSAM, which is fucked up but has no living victim. These are sexual pictures of real children that just have most of the sexual part generated. That’s much, much closer to full-on CSAM than the above, and falls under the “spirit” of the law, which is to punish people who abuse children for sex. That is what these other children did to these 60 girls.

          • orclev@lemmy.world (+27/-8) · 14 days ago

            You have to admit there is a pretty fundamental difference between manipulating an otherwise legal image to look like a minor in a sexual act vs an actual photo of that same minor engaged in a sexual act. While both might be considered a crime, the damage to the victim is of a fundamentally different nature. I think there’s a strong argument that the former bears a closer relationship to slander than it does to rape.

            • mosiacmango@lemm.ee (+22/-3) · 14 days ago

              I agree the two are different, but not as different as you seem to think. None of these girls were raped, but this is still sexual abuse, especially because these images were shared.

              Sexual abuse is complex, and it far surpasses “slander,” especially in one’s formative years. This act of sexual abuse is going to change how 60 girls and soon-to-be women respond to sex, likely for the rest of their lives. These images may follow them forever, causing heartache, job loss, on and on, and the damage will be done because a form of CSAM of them is out in the world.

              That is not a light matter to be sidelined to a “hand slap” level of offense. I think the fact that the perpetrators were also children should weigh heavily in their defense, but otherwise this needs to be treated as the sexually damaging event it is.

              • Pheonixdown@lemm.ee (+11/-2) · 14 days ago

                Isn’t all of that still kind of true regardless of the age of the subjects? If they were 18 or 30 it wouldn’t magically be better.

                Revenge porn might be a closer analogue. CSAM laws feel like they’ll get loopholed somehow; I don’t know, maybe you could just ask the AI to age the person up or whatever and get away with it.

              • randompasta@lemmy.today (+9/-1) · 14 days ago

                It does seem like there needs to be a new law specifically addressing this. In the past someone could have cut out the head of a 17-year-old and pasted it onto a Playboy model. That’s an obvious fake, and I don’t think it is the same as what’s happened here, but to a degree there are similarities. Does the ability to detect a fake matter? I don’t know. There are applications that can determine, with some level of confidence, whether a picture is AI-generated. Does that mean only human opinion matters? Again, I don’t know. Certainly there was no abuse at the time the image was taken, so there is a difference between this and CP.

              • orclev@lemmy.world (+6/-2) · 14 days ago

                I don’t think this is “hand slap” level, but it also isn’t multiple-decades-behind-bars level, which is what they would be looking at for that quantity of CSAM, particularly for a couple of horny teenagers who likely weren’t even sure what they were doing was illegal. I do think you’re somewhat exaggerating the harm in this case, as fundamentally what was done isn’t much different from cutting out photos of these girls’ heads and pasting them into a porn magazine. It’s certainly fancier and more convincing, but at the end of the day that’s what happened: their faces got superimposed onto the bodies of porn stars. That likely bothered these girls in the same way the thought of some random creep jerking off to their original photos would, and if the images were widely circulated it could cause some issues down the line (heartache certainly, but job loss certainly not). But if this bothered them enough to alter the way they feel about sex for the rest of their lives, there were already significant mental issues at play.

                I honestly don’t know exactly what an appropriate level of punishment would be. My gut says something like 6 months to a year in juvenile detention plus some years of probation. I think a significant amount of weight needs to be given to the fact that these were a couple of teenagers doing something that wasn’t obviously illegal. They cannot and should not be held to the same standards as adults, for the same reason statutory rape is a thing: they’re incapable of reasoning about their actions to the same degree as an adult is.

              • Serinus@lemmy.world (+5/-8) · 14 days ago

                “This act of sexual abuse is going to change how 60 girls and soon-to-be women respond to sex, likely for the rest of their lives. These images may follow them forever.”

                No, it’s not. No, it shouldn’t.

                First, it’s so, so much easier to deal with when you have the response of “that’s not me”. Second, it’s current AI. How real do these things even look?

                These girls were not sexually abused. Sexual harassment is a more appropriate crime. Maybe libel. Maybe a new crime that we can call “sexual libel” or something.

                • Coskii@lemmy.blahaj.zone (+2/-4) · 14 days ago

                  Current AI for generating sexual images is on the real side of the uncanny valley at this point. If you’re really looking you might be able to tell, but I don’t think most people looking for porn are going to scrutinize anything too closely in the first place… So real enough.

                  However, I don’t see how 60 images of what’s effectively a face plastered on a miscellaneous body doing something sexual would follow anyone anywhere. Anyone who knows of the images and outs themselves has just admitted to possessing child porn…

                  Most people don’t have such unique facial features that would be something that could even follow them in the first place.

                  As for the criminal aspect of it, that’s a societal thing to figure out, so here they go figuring it out.

    • orclev@lemmy.world (+18) · 15 days ago

      This has always been one of the problems with CSAM laws. There have been a number of cases now where minors were charged with CSAM possession for either naked pictures of themselves, or (consensual) pictures of their girlfriend/boyfriend who was also a minor. There’s also the broader discussion about what exactly qualifies as CSAM, with some jurisdictions going for a more maximalist approach that considers things like drawings (even highly unrealistic or stylized ones) of people or even fictional characters to be CSAM. Some jurisdictions don’t even require the photo or drawing to depict the minor naked or engaging in a sexual act; they instead define it as pornography if the person in possession of it gets some kind of sexual gratification from it. So, for instance, a photo of a minor that’s fully clothed and just standing there could actually be considered CSAM.

      The problem is that it’s hard to draw hard lines about what does or doesn’t qualify without leaving loopholes that can be exploited. This is why many jurisdictions opt for a maximalist approach and then leave it to the discretion of police and prosecutors what they do or do not pursue, but of course that has the flaw that it’s entirely arbitrary and leaves a lot of power in the hands of prosecutors and police for something widely regarded as an extremely serious crime.

      • shani66@ani.social (+2/-4) · 14 days ago

        Yes. Let’s not pretend children aren’t people too: they are going to take pictures of themselves or their partners, and that is both normal and, right now, illegal.

    • HellsBelle@sh.itjust.works (+7/-10) · 14 days ago

      So you would rather they get off unscathed instead?

      In a similar deepfake porn case at a school in New Jersey, no charges were brought, and the alleged perpetrator does not appear to have suffered any academic or legal penalties. Dorotha Mani, the mother of one of the New Jersey victims, outlined her frustration with the leadership at her daughter’s school in approximately three pages of written testimony to Congress published in March. In that document, she described Westfield High School’s tepid response as “not only disheartening but also dangerous, as it fosters an environment where female students are left to feel victimized while male students escape necessary accountability.”

      • Chozo@fedia.io (+16/-2) · 14 days ago

        “So you would rather they get off unscathed instead?”

        Why would this be the only other option?

  • Ogmios@sh.itjust.works (+21/-1) · 15 days ago

    I’m a bit confused about the process here. Do the victims really have any special ability to identify fake pictures of themselves that any other reasonable person couldn’t connect with them? It seems needlessly traumatizing to me.