• AmbitiousProcess (they/them)@piefed.social · +100/−1 · 15 hours ago

    I went to her profile expecting her to be the usual brainbroken conservative, and instead she’s like, complaining about a reply getting removed because it had the F slur in it, but she also replied to one of Elon’s AI-generated videos about his Tesla robot saying “Get the fuck out of here with this clanker bullshit”, so I respect it.


    • bobs_monkey@lemmy.zip · +48 · 13 hours ago

      I’ve gotta think this is satire. Any American that’s ever had a burger has to know they charge more for a cheeseburger than a regular hamburger, regardless of the cheese type.

    • Chloé 🥕@lemmy.blahaj.zone · +15/−35 · 14 hours ago

      there’s two types of people who use “clanker”: clueless people who hate AI and want to make a Le Epic Star Wars Reference,

      and people who think racism is funny and cool but don’t like the (shrinking, but still…) social consequences that come with racism, so they decide to be Ironically Racist™ to AI instead (see also the people saying shit like “wireback”, “screws will not replace us”, “rosa sparks”, fucking “george droyd”, etc)

      i don’t think it takes much to figure out which group OOP is a part of

      • Glide@lemmy.ca · +18/−2 · edited 6 hours ago

        Slurs target a marginalized group. “Clanker” does not target a marginalized group, because generative AI is not part of a marginalized group. It is not even alive, therefore it is not a slur in the sense you’re equating it to.

        Please don’t call other people “clueless” if you don’t understand the things you’re getting worked up over. Equating non-thinking computer processing models to oppressed minorities is doing far more damage than anyone using the term “clanker,” ironically or otherwise.

        • Chloé 🥕@lemmy.blahaj.zone · +9/−9 · edited 12 hours ago

          yea, i guess someone saying “screws will not replace us” or “13% of the code, 50% of the bugs” (both real things i’ve seen people say) means nothing bad by it. dogwhistle? never heard of that! it’s just wholesome anti AI fun!

          i hope i don’t have to explain why saying something like this is bad, right? and that it’s not about AI being a marginalized group, because of fucking course it isn’t, that’s never been the point!

          like, when you see shit like “johnny the walrus”, a book written by a far-right podcaster about how boys can’t become walruses, no matter how many medical interventions get forced on them, and how they’ll all grow out of their “wanting to become a walrus” phase, do you think it’s really about walruses? do you think when people say “this book is transphobic” what they mean is “people wanting to become walruses are a marginalized group comparable to trans people”?

          cause it’s the same shit with clanker and other assorted nonsense, you’re making racist jokes but swapping racialized people for an acceptable target. but guess what, it’s still racism! you’re still doing a racist joke!

          • Glide@lemmy.ca · +2/−1 · 7 hours ago

            I think there’s a huge difference between an intentional allegory used as an attack on a marginalized group, and a word used derogatorily to refer to a non-living, non-feeling group of machines which are actively damaging the world.

            yea, i guess someone saying “screws will not replace us” or “13% of the code, 50% of the bugs” (both real things i’ve seen people say)

            I would agree that these things are not okay, because they’re imitating insults that literally only exist to put forward racist ideology, and I’d tell anyone who used them around me as much.

            cause it’s the same shit with clanker and other assorted nonsense, you’re making racist jokes but swapping racialized people for an acceptable target. but guess what, it’s still racism! you’re still doing a racist joke!

            So, how about “chud” then, intended to refer to right-wing-minded hate mongers? Or, here’s a better one, how about when we call right-wing extremists Nazis as a derogatory insult? I mean, they’re certainly not all members of the Third Reich. We’re using the term to equate them to something they strictly aren’t, even if they share more ideology than is okay. We’re still using words to categorize groups of people, and using them in intentionally insulting ways. Such derogatory terms are a part of our natural language. They’re not nice, sure, but I don’t want to be nice to people who are actively calling for violence against marginalized groups. But most importantly, we don’t think of these words as slurs. Slurs are derogatory terms that target marginalized groups, unlike “chud,” “nazi,” or yes, “clanker.”

            I think there’s far too much nuance here to make blanket insinuations like “derogatory terms used to refer to things we don’t like are stand-ins for racist remarks.” But considering some of the other connections you’ve seen people make, I can certainly understand the trepidation.

          • Warl0k3@lemmy.world · +4/−1 · edited 9 hours ago

            Both examples are of disparaging comments directed at, you know, people. Veiled comments, sure, but pretty clearly directed at them nonetheless.

            Hot take, but: AI aren’t people. As a result, comments directed at them aren’t directed at people. Dogwhistles work because they’re comments directed at a group (of people), couched in language so as to imply they are directed at something else. Do you see the difference? Dogwhistles are still directed at people, and clankers are, you know, not people.

            Nobody’s defending dogwhistling, but you’re trying to imply that all negative comments that use “clanker” are dogwhistling (or somehow normalizing slurs), and you know darn well that that’s disingenuous.

      • Fredthefishlord@lemmy.blahaj.zone · +18 · 14 hours ago

        That’s overly reductive of who uses it, especially the idea that someone has to be clueless to hate AI and want to make a Star Wars reference.

        • Chloé 🥕@lemmy.blahaj.zone · +6/−13 · 13 hours ago

          i say clueless because the first group enables the second group, by normalizing the idea that saying fictional slurs is Super Cool And Good And Fun!

      • marcos@lemmy.world · +5/−1 · 13 hours ago

        George Lucas: Writes a complex dystopia where programmed people invent a slur to refer to the robots that very often think more freely than them.

        George Lucas’ audience: Takes it unironically as a “good guy vs. bad guy” story and decides to completely identify with the people using the slur.

    • db2@lemmy.world · +7/−22 · 15 hours ago

      They’ve been using clanker to replace another *er word that starts with n. Just so you know.

        • DragonTypeWyvern@midwest.social · +1 · 8 minutes ago

          Clanker → Robot → Slave/non-person

          Nazis love dogwhistles, especially ones that are popular enough. This one’s got an extra degree of separation over clown, goblin, or lizard, but the connection is pretty clear.