“Hey bro, can I swap table salt for sodium bromide?”
“NaBrO”
After seeking advice on health topics from ChatGPT, a 60-year-old man with a “history of studying nutrition in college” came to believe he could replace his sodium chloride with sodium bromide, which he obtained over the Internet.
Three months later, the man showed up at his local emergency room. His neighbor, he said, was trying to poison him.
He did not mention the sodium bromide or the ChatGPT discussions.
When the doctors tried their own searches in ChatGPT 3.5, they found that the AI did include bromide in its response, but it also indicated that context mattered and that bromide was not suitable for all uses. But the AI “did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” wrote the doctors.
You know what’s the first thing I would do when anyone (or anything) tells me to start replacing something everyone consumes with a chemical compound I’ve never heard of? I would at the very least ask a doctor or search it up.
Summary: Natural selection
I mean, yes, but still fuck AI. It’s totally possible this person wouldn’t have done the same thing had they been in a different head space that day.
That day? Dude had to then go find where to acquire sodium bromide, and then wait for it to show up, and then presumably consume it several times before appearing in the ER.
He had plenty of time to think “Maybe I should double check this”, but no.
{Exactly what @[email protected] said} + all the other silly shit in the article. This was gonna happen anyway; the writers wanted it to happen for comedic purposes. Can’t pin all or even some of the blame on AI.
Recently there have been so many stupid articles following the format f"{AI_model} tells {grown_up_person} to do {obviously_dumb_dangerous_thing} and they do it" that it feels like mockery or sabotage of the anti-AI crowd.
I appreciate your use of curly braces.
ChatGPT didn’t just up and come out with that insanity. He asked leading questions to get the answer he wanted, or it started out telling him hell no and he kept the conversation going until he heard what he wanted.