From the “shaking my head” file:
I read this week that a 60-year-old man spent three weeks being treated at a hospital after replacing table salt with sodium bromide on the advice of a popular artificial intelligence chatbot.
Seriously?
The story, from NBC News, said three physicians published a report on the matter in the Annals of Internal Medicine earlier this month. According to the report, the man had no prior psychiatric history when he arrived at the hospital “expressing concern that his neighbor was poisoning him.”
It would be easy to dismiss the guy as clueless, but in an age when people so readily accept whatever appears on the screens in front of them, we have to consider his mistake a symptom of a societal problem rather than a cause.
To be fair, there are signs he marched to the beat of a different drum before his medical scare.
Again, according to the NBC story, the man said he had been distilling his own water at home and seemed “paranoid” about water he was offered.
Doctors ran tests and, after a lab report and a consultation with poison control, diagnosed bromism, a toxic syndrome caused by high levels of bromide, the report said.
“In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability,” the case report said.
Fortunately, they were able to save him from himself.
Once his condition improved, the man said he had taken it upon himself to conduct a “personal experiment” to eliminate table salt from his diet after reading about its negative health effects. The report said he did this after consulting ChatGPT, the artificial intelligence chatbot. He said the substitution went on for three months.
The impulse behind this is understandable. Some health professionals have demonized the use of salt, and for some people it does present real problems, such as aggravating hypertension.
But why would anyone blindly accept an answer coughed up by a technology known to be buggy, without verifying its accuracy and safety?
The three physicians, all from the University of Washington, noted in the report that they did not have access to the patient’s conversation logs with ChatGPT. So they asked ChatGPT 3.5 on their own about replacements for chloride, a component of table salt (sodium chloride).
According to the report, the response they received included bromide.
“Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the report said.
That’s where an application of human intelligence would have come in handy. A reasonable person would have looked up bromide to determine if it was, indeed, a suitable substitute for salt.
A representative for OpenAI, the company that created ChatGPT, did not immediately respond to NBC’s request for comment. But in a statement to Fox News, the company noted that its terms of service say the bot is not to be used in the treatment of any health condition, according to the NBC story.
“We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance,” the statement said.
On its face, that seems a bit cowardly. But on reflection, I’d say it’s fair. People need to use some discernment in most areas of life. Why would this be different?
Just in case you’re considering increasing your bromide intake, please consider this information from the NBC story:
Bromide toxicity was a far more common syndrome in the early 1900s, the report said, as bromide was present in a number of over-the-counter medications. It was believed to contribute to 8% of psychiatric admissions at the time, according to the report.
It’s a rare syndrome but cases have re-emerged recently “as bromide-containing substances have become more readily available with widespread use of the internet,” the report said.
Just as sodium bromide is not a substitute for salt, artificial intelligence is not a substitute for human intelligence. Not yet, anyway.