Bro, they’re just pocketing investor cash by hyping the latest tech snake oil on a level never before seen.
Were NFTs going to kill people too? 🙄
I don’t think the OP is saying we’ll have LLM Terminators walking around saying, “I’ll be back!” while mowing down people.
But deaths attributed to LLMs have already started:
Sure, this seems to be mostly preying on people who were already mentally ill, so let’s set those cases aside for now.
What about “AI” pointing the gun and having someone else pull the trigger?
What happens when a trigger happy cop gets a dangerous sounding alert and decides to go in guns blazing?
I don’t believe for a minute that the military isn’t looking at weaponizing this detection by also having AI “decide” if it should attack.
Then there’s the AI in robotaxis - surely those are 100% accurate. Well, they had better be - car crashes are among the leading causes of accidental death in the US.
I could go on about how AI hallucinating in medical notes taken by copilot could lead to improper prescriptions, or how AI hallucinating could do things like tell people to put glue on pizza.
I mean, I guess some outcomes are just injury not death so…
https://lemmy.world/post/40041447
Every member of staff at the Department of War has been told to start relying on AI. A system that consistently hallucinates. Saying it’s going to get people killed is not hyperbole, as much as I wish it was. :/