• Phoenixz@lemmy.ca · 2 days ago

    I fully understand this girl and I wish her well, but I’m afraid this is a genie that will never go back into its bottle.

    • kittenzrulz123@lemmy.dbzer0.com · 2 hours ago

      We can push this so far underground that the only people using it are 4chan creeps on the dark web. We can’t destroy AI, but we can push it to the fringes.

    • Bane_Killgrind@lemmy.dbzer0.com · 2 days ago (edited)

      Push it onto the credit card processors, web host operators, domain registrars, etc.: that money is proceeds of crime.

      Edit: they are getting money from somewhere to run the computers these models run on.

      • WolfLink@sh.itjust.works · 5 hours ago (edited)

        I don’t want credit card processors being the judge of what I can spend my money on, or domain registrars being the judge of what websites I can visit.

        The person who committed the crime should be taken to court, and the intermediaries should be able to stay neutral.

      • Clent@lemmy.dbzer0.com · 2 days ago

        There is no way back on that.

        I can run these models on my local machine (see the sketch at the end of this comment). It’s not even a complex model.

        This lawsuit is targeting the profiteers because that’s the only reasonable recourse for an individual.

        The criminal side of things is something a prosecutor needs to handle. Making this a priority becomes a political situation because it requires specific resources.
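
        For scale, here is a minimal sketch of what “running it locally” can look like, assuming the Hugging Face diffusers and torch packages and an ordinary consumer GPU; the checkpoint name is just one example of many openly hosted weights:

        ```python
        # a minimal sketch: plain text-to-image generation on a consumer GPU,
        # assuming `torch` and `diffusers` are installed (~8 GB VRAM is enough
        # at half precision); the checkpoint is one widely mirrored example
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",
            torch_dtype=torch.float16,  # half precision to fit consumer VRAM
        ).to("cuda")

        image = pipe("a watercolor lighthouse at dusk").images[0]
        image.save("out.png")
        ```

        A dozen lines and no special hardware. That is the scale of the thing any lawsuit would have to contain.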

        • Son_of_Macha@lemmy.cafe · 16 hours ago

          Maybe we need to start pointing out that it didn’t make anyone naked; it just fits a naked body it saw in training under the person’s head. It’s Photoshop, but faster.

          • Clent@lemmy.dbzer0.com · 16 hours ago

            Not exactly. Head swaps have been a thing for a while.

            These models match the body shape. They are essentially peeling back the layers of clothing; the thinner those layers, the more accurate the result can be.

        • Bane_Killgrind@lemmy.dbzer0.com · 2 days ago

          Right, and the people disseminating and hosting tools tailored to criminal harassment should be held accountable, along with the people hosting the resulting images. All of them have revenue streams that can and should be disrupted.

          • ozymandias@lemmy.dbzer0.com · 1 day ago

            they’re not really tailored to that… also, it wasn’t hard to photoshop naked pictures even before AI.
            but now these “tools” are neural net models… there are thousands of them hosted on dozens of source code repositories… and like OP said, you can run them on any high-end gaming GPU.
            you can’t outlaw source code like that.
            you could sue this one app maker and try to require that they prove consent and detect underage photos (a sketch of what that gate might look like is below)… totally a good idea, but it would do little to stop it…
            they’ll just use a different app
            i think they could prosecute the other people making and distributing the pictures though.
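
            a rough sketch of what that consent/age gate could look like… the function names here are hypothetical stand-ins for checks a service would have to build or buy, not any real api:

            ```python
            # hypothetical sketch of a pre-processing gate; `estimate_is_minor`
            # and `has_signed_consent` are stand-in names (assumptions), not a
            # real library's API
            from dataclasses import dataclass

            @dataclass
            class Upload:
                image_bytes: bytes
                consent_token: str | None  # signed proof from the person depicted

            def estimate_is_minor(image_bytes: bytes) -> bool:
                # stand-in for an age-estimation classifier (assumption)
                return False  # placeholder: a real gate would call a vetted model

            def has_signed_consent(token: str | None) -> bool:
                # stand-in for verifying a signed consent record (assumption)
                return token is not None

            def allow_processing(upload: Upload) -> bool:
                # refuse the job unless both checks pass
                if estimate_is_minor(upload.image_bytes):
                    return False
                return has_signed_consent(upload.consent_token)
            ```

            and a gate like this only binds the one app that implements it… the weights themselves are already everywhere.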

            • Don Piano@feddit.org · 18 hours ago

              I can easily get a rock from the woods, but if I use one to break someone’s fingers, I should be prosecuted.

              Except yeah, in this case we should go after the entire chain: AI trainers, hosts, users, and their supporting industries.

              • ozymandias@lemmy.dbzer0.com · 17 hours ago

                not trainers…
                people who host services that advertise “make any photo naked!” and have little to no safeguards against non-consensual or underage images, yes…
                i mean, if they train the model ON child porn, then yes… but otherwise it’s not the model trainer’s fault.