• KelvarCherry@lemmy.blahaj.zone · 5 hours ago

    Did Covid-19 make everyone lose their minds? This isn’t even about being cruel or egotistical; it’s just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… over deepfake porn, including of minors??? From the Fortnite company guy???

    • pulsewidth@lemmy.world · 4 hours ago

      Unironically, this behaviour just gives “pivoting to a run for office as a Republican” vibes nowadays.

      It’s no longer even ‘weird behaviour’ for a US CEO.

  • SpaceCowboy@lemmy.ca · 5 hours ago

    Who else just did a search on the Epstein files for “Tim Sweeney”?

    I didn’t find anything on jmail, but there’s still a lot that hasn’t been released, and a lot of stuff is still redacted.

  • MrSulu@lemmy.ml · 6 hours ago (edited)

    Pedo and fascist defender Tim Sweeney. Burn his business down by withdrawing your patronage, money, time, etc. Boards can find CEOs who are not supporters of pedos and fascists.

  • Sunsofold@lemmings.world · 8 hours ago

    I’m no fan of banning this or that particular platform (it’s like trying to get rid of cheeseburgers by banning McDonald’s; the burgers are still available from all the other burger chains, and everyone who used the one will just switch to the others), but this is a hilariously wrong way to get to the right answer.

  • Grass@sh.itjust.works · 8 hours ago

    This is almost as sus as the specific preferred-age-range terminology for pedophiles that comes up now and again in the most uncomfortable of scenarios.

  • MehBlah@lemmy.world · 5 hours ago

    This guy needs to be gimped and fucked by zombie Epstein, with Vance warming up in a recliner waiting on his turn. Can Twitter make that happen?

  • criss_cross@lemmy.world · 10 hours ago

    The only “charitable” take I can give this is that he’s been fighting Apple and Google over store fees and the like, and he feels that if he concedes Apple/Google can do this, then they should be able to restrict EGS as well.

    I don’t know why AI-generated CSAM is the hill you’d make this point on, though.

  • RememberTheApollo_@lemmy.world · 13 hours ago (edited)

    Yeah. I’m as tired of the argument that pretty much anything goes under free speech as I am of the “everything is a slippery slope when we make laws to keep people from doing harmful shit” one.

    I mean, what’s the required damage before people put a stop to inciting speech and objectively harmful lies? Or to making CSAM of kids using a platform like X? Germany had to kill a few million people before deciding that maybe displaying Nazi symbols and speech wasn’t a good idea.

    So we have a platform being used to make CSAM. What’s it going to take before someone says this is a bad idea and shouldn’t be done? How many kids will commit suicide after being taunted and shamed over their images being used? How many is “enough”?

    There should be immediate action to end the means to use these tools to make porn. There’s plenty of porn available on the internet already, and making it from user-submitted images on a major public platform is a horrible idea, but too many people make up all kinds of reasons why we can’t do that… economic, censorship, whatever.

  • Lvxferre [he/him]@mander.xyz · 16 hours ago (edited)

    IMO commenters here discussing the definition of CSAM are missing the point. Definitions are working tools; it’s fine to change them as you need. The real thing to talk about is the presence or absence of a victim.

    Non-consensual porn victimises the person being depicted, because it violates the person’s rights over their own body — including its image. Plus it’s ripe material for harassment.

    This is still true if the porn in question is machine-generated, and the sexual acts being depicted did not happen. Like the sort of thing Grok is able to generate. This is what Timothy Sweeney (as usual, completely detached from reality) is missing.

    And it applies to children and adults alike. The only difference is that adults can still consent to having their image shared as porn; children cannot. As such, porn depicting children will always be non-consensual, thus always victimising the children in question.

    Now, someone else mentioned that Bart’s dick appears in the Simpsons movie. The key difference is that Bart is not a child; he’s not even a person to begin with, but a fictional character. There’s no victim.

    • Atomic@sh.itjust.works · 16 hours ago

      That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

      Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

      There ARE victims, lots of them.

      • unexposedhazard@discuss.tchncs.de · 15 hours ago (edited)

        That is a lot of text for someone that couldn’t even be bothered to read a comment properly.

        Non-consensual porn victimises the person being depicted

        This is still true if the porn in question is machine-generated

      • Lvxferre [he/him]@mander.xyz · 15 hours ago

        That is a lot of text for someone that couldn’t even be bothered to read the first paragraph of the article.

        Grok has the ability to take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.

        There ARE victims, lots of them.

        You’re only rewording what I said in the third paragraph, while implying I said the opposite. And bullshitting/assuming/lying that I didn’t read the text. (I did.)

        Learn to read, dammit. I’m saying this shit Grok is doing is harmful, and that people ITT arguing “is this CSAM?” are missing the bloody point.

        Is this clear now?

        • Atomic@sh.itjust.works · 14 hours ago

          Yes, it certainly comes across as you arguing for the opposite, since you reiterated above:

          The real thing to talk about is the presence or absence of a victim.

          Which has never been an issue. It has never mattered in CSAM if it’s fictional or not. It’s the depiction that is illegal.

          • Lvxferre [he/him]@mander.xyz · 3 hours ago (edited)

            Yes, it certainly comes across as you arguing for the opposite

            No, it does not. Stop being a liar.

            Or, even better: do yourself a favour and go offline. Permanently. There are already enough muppets like you: assumptive pieces of shit lacking basic reading comprehension, but still eager to screech at others, not because of what the others actually said, but because of what they assumed about it. You’re dead weight in any serious discussion, probably in some unserious ones too, and odds are you know it.

            Also, I’m not wasting any further time on you; go be functionally illiterate elsewhere.

    • brachiosaurus@mander.xyz · 17 hours ago

      It’s called being so effective at marketing, and spending so much money on it, that people believe you can do no wrong.