Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • MJKee9@lemmy.world · +12/-3 · 11 months ago

    That’s not their point and you know it. Get your bad-faith debating tactics out of here.

    She isn’t living “every woman’s nightmare,” because a woman without Taylor’s wealth and influence might actually suffer significant consequences. For Taylor, it’s just a weird Tuesday. For an average small-town lady, it might mean loss of a job, loss of a partner, estrangement from family and friends… That’s a nightmare.

    • GilgameshCatBeard@lemmy.ca · +3/-7 · 11 months ago

      So she’s less a victim because she’s wealthy? My god you people can justify anything, can’t you?

      • Tangent5280@lemmy.world · +10/-2 · 11 months ago

        That is exactly it. She will suffer less than someone else this might have happened to, and if you define victimhood on a spectrum, she’s less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

        • GilgameshCatBeard@lemmy.ca · +3/-12 · 11 months ago

          Gross dude. Very gross. Blocking you now, since someone who thinks the wealthy can’t be victimized can’t possibly have anything of value to contribute.

          Do better.

      • MJKee9@lemmy.world · +7 · 11 months ago

        You just keep shifting your argument to create some sort of sympathy, I guess. No one is saying a rich person isn’t a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different from being a victim in a working-class context. If you disagree with that, then you’re either being intellectually dishonest or living in a dream world.

        Even the law agrees. It’s a lot harder to win a defamation lawsuit as a celebrity than as a normal person. As a public figure, you typically have to show actual malice. Frankly, that’s the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

        • GilgameshCatBeard@lemmy.ca · +1 · 11 months ago

          So, creating nude AI deepfakes isn’t a crime? Then there are no victims at all. What’s everyone talking about then?

          • MJKee9@lemmy.world · +1 · 11 months ago

            It can’t be a crime unless there’s a criminal statute that applies. See if you can find one that applies.

              • MJKee9@lemmy.world · +1/-1 · 11 months ago

                Your response doesn’t logically respond to my comment. It attempts to reframe the argument by setting up a strawman, and it shows that you either fail to understand the difference between civil and criminal law in the United States or are choosing to ignore it because it doesn’t support your new, reframed argument.