• CarbonIceDragon@pawb.social
    3 days ago

    That logic would make sense if we were talking about, say, someone writing, drawing, or animating a fictitious depiction of rape. To my understanding, though, the controversy here is AI being used to produce sexualized images of real people, or to transform images of them into that, which isn't quite the same thing as a depiction of a fictional character (or, for that matter, a portrayal of a fictional act by consenting actors). The reason it's getting called sexual abuse content isn't so much that the images are pictures of sexual abuse, but rather the notion that creating the images is itself a form of sexual abuse, because the people depicted did not consent to be portrayed that way.

    • kbal@fedia.io
      2 days ago

      To be more precise, the reason it's getting called "sexual abuse" is that this framing has proven effective at stirring people's emotions and forestalling any annoyingly controversial philosophical nitpicking about whether creating images resembling a real person counts as abuse of that person even if they know nothing about it. Since this form of it is a new thing, we don't have a better name for it yet.