• ozymandias@lemmy.dbzer0.com · 1 day ago

    they’re not really tailored to that… also, it wasn’t hard to photoshop naked pictures even that long ago.
    but now these “tools” are neural net models… there are thousands of them hosted on dozens of source code repositories… and like op said, you can run them on any high-end gaming gpu.
    you can’t outlaw source code like that.
    you could sue this one app maker and try to require that they prove consent and detect underage photos… totally a good idea, but it would do little to stop it…
    they’ll just use a different app.
    i think they could prosecute the people actually making and distributing the pictures, though.

    • Don Piano@feddit.org · 19 hours ago

      I can easily get a rock from the woods, but if I use one to break someone’s fingers, I should be prosecuted.

      Except yeah, in this case we should go after the entire chain: AI trainers, hosters, users, and their supplementary industries.

      • ozymandias@lemmy.dbzer0.com · 18 hours ago

        not trainers…
        people who host services that advertise “make any photo naked!” with little to no safeguards against non-consensual or underage images, yes…
        i mean, if they train the model ON child porn, then yes… but otherwise it’s not the model trainer’s fault.