• dev_null@lemmy.ml
    5 months ago

    Sure, but isn’t the perpetrator the company that trained the model without their permission? If a doctor saves someone’s life using knowledge based on Nazi medical experiments, then surely the doctor isn’t responsible for the crimes?

    • PotatoKat@lemmy.world
      5 months ago

      So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

      Your analogy doesn’t match the premise. (Again, assuming there is no CSAM in the training data, which is unlikely.) The training data is not the problem; it is how the data is used. Using those same pictures to generate photos of medieval kids eating ice cream with their family is fine. Using them to make CSAM is not.

      It would be more like the doctor using the Nazi experiments to do some other fucked up experiments.

      (Also you posted your response like 5 times)

      • dev_null@lemmy.ml
        5 months ago

        Sorry, my app glitched out and posted my comment multiple times, which got me banned for spamming… Now that I’ve been unbanned I can reply.

        So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

        In this scenario, no, because the crime was in how someone used the car, not in the creation of the car. The guy in this story did commit a crime, but for other reasons. I’m just saying that if you are claiming the children in the training data are victims of some crime, then that crime was committed when the model was trained. They obviously didn’t agree to their photos being used that way, and most likely didn’t agree to their photos being used for AI training at all. So by the time this guy came around, they were already victims, and they would still be victims even if he hadn’t done anything.

        • PotatoKat@lemmy.world
          5 months ago

          I would argue that the person using the model for that purpose is further victimizing the children. It’s kinda like revenge porn: the worst perpetrator is the person who uploaded the content, but every person viewing it from there furthers the victimization. It is mentally damaging for the victim of revenge porn to know that their intimate videos are being seen and sought out.