• Lvxferre [he/him]@mander.xyz · 3 hours ago (edited)

    IMO, commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as needed. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated and the sexual acts depicted never happened, like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And this applies to children and adults alike. The only difference is that adults can consent to having their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, and thus always victimises the children in question.

    Now, someone else mentioned that Bart’s dick appears in The Simpsons Movie. The key difference is that Bart is not a child; he is not even a person to begin with, but a fictional character. There’s no victim.


    EDIT: I’m going to abridge what I said above, in a way that even my dog would understand:

    What Grok is doing is harmful and there are victims of it, regardless of some “ackshyually this is not CSAM lol lmao”. And yet you guys keep babbling about definitions?

    Everything else I said here was contextualising and detailing the above.

    Is this clear now? Or will I get yet another lying piece of shit (like @[email protected] and @[email protected]) going out of their way to misinterpret what I said?

    (I don’t even have a dog.)

    • Atomic@sh.itjust.works · 20 hours ago

      That is a lot of text for someone who couldn’t even be bothered to read the first paragraph of the article.

      > Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 19 hours ago (edited)

        That is a lot of text for someone who couldn’t even be bothered to read a comment properly.

        > Non-consensual porn victimises the person being depicted

        > This is still true if the porn in question is machine-generated

      • Lvxferre [he/him]@mander.xyz · 18 hours ago

        > That is a lot of text for someone who couldn’t even be bothered to read the first paragraph of the article.

        > Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        > There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 18 hours ago

          Yes, it certainly comes across as you arguing for the opposite, since above you reiterated:

          > The real thing to talk about is the presence or absence of a victim.

          Which has never been the issue. It has never mattered for CSAM whether it’s fictional or not; it’s the depiction itself that is illegal.

          • dantel@programming.dev · 14 minutes ago

            Is it so hard to admit that you misunderstood the comment, ffs? It’s painfully obvious to everyone.

          • Lvxferre [he/him]@mander.xyz · 6 hours ago (edited)

            > Yes, it certainly comes across as you arguing for the opposite

            No, it does not. Stop being a liar.

            Or, even better: do yourself a favour and go offline. Permanently. There are already enough muppets like you: assumptive pieces of shit lacking basic reading comprehension, but still eager to screech at others, not because of what the others actually said, but because of what they assumed about it. You’re dead weight in any serious discussion, probably in some unserious ones too, and odds are you know it.

            Also, I’m not wasting any more time on you; go be functionally illiterate elsewhere.