• Atomic@sh.itjust.works · 18 hours ago

    That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

    Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

    There ARE victims, lots of them.

    • unexposedhazard@discuss.tchncs.de · 17 hours ago (edited)

      That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

      Non-consensual porn victimises the person being depicted

      This is still true if the porn in question is machine-generated

    • Lvxferre [he/him]@mander.xyz · 17 hours ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

      Learn to read, dammit. I’m saying that this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

      Is this clear now?

      • Atomic@sh.itjust.works · 16 hours ago

        Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

        The real thing to talk about is the presence or absence of a victim.

        Which has never been an issue. It has never mattered for CSAM whether it’s fictional or not; it’s the depiction itself that is illegal.

        • Lvxferre [he/him]@mander.xyz · 5 hours ago (edited)

          Yes, it certainly comes across as you arguing for the opposite

          No, it does not. Stop being a liar.

          Or, even better: do yourself a favour and go offline. Permanently. There’s already enough muppets like you: assumptive pieces of shit lacking basic reading comprehension, but still eager to screech at others — not because of what the others actually said, but because of what they assumed over it. You’re dead weight in any serious discussion, probably in some unserious ones too, and odds are you know it.

          Also, I’m not wasting any more of my time on you; go be functionally illiterate elsewhere.