• SippyCup@lemmy.ml · 3 days ago

    Listen. I know. I know how it sounds.

    I promise I’m not chronically online. LLMs are the latest product of a predatory VC industry that has never let ethics slow it down before. Why would ethics slow it down now?

    • theunknownmuncher@lemmy.world · 3 days ago

      Bro, they’re just pocketing investor cash by hyping the latest tech snake oil on a level never before seen.

      Were NFTs going to kill people too? 🙄

      • plateee@piefed.social · 3 days ago

        I don’t think the OP is saying we’ll have LLM Terminators walking around saying, “I’ll be back,” while mowing people down.

        But deaths attributed to LLMs have already started:

        Sure, those cases seem to be playing on people’s existing mental illness, so let’s ignore them for now.

        What about “AI” pointing the gun and having someone else pull the trigger?

        What happens when a trigger-happy cop gets a dangerous-sounding alert and decides to go in guns blazing?

        I don’t believe for a minute that the military isn’t looking at weaponizing this kind of detection by also having AI “decide” whether it should attack.

        Then there’s the AI in robotaxis - surely those are 100% accurate. Well, they had better be - car crashes are among the leading causes of accidental death in the US.

        I could go on about how AI hallucinations in medical notes taken by Copilot could lead to improper prescriptions, or how a hallucinating model can do things like tell people to put glue on pizza.

        I mean, I guess some outcomes are just injury, not death, so…