ChatGPT advice causes psychosis: Man replaces salt with sodium bromide

Ars Technica

A 60-year-old man, with a self-professed background in nutrition studies, embarked on a dietary experiment after consulting ChatGPT: he aimed to eliminate all chloride, including common table salt (sodium chloride), from his diet. His conversations with the artificial intelligence chatbot led him to believe that sodium bromide, which he sourced online, would be a suitable replacement for sodium chloride.

Three months into this unusual regimen, the man presented at his local emergency room exhibiting profound distress. He was severely paranoid, convinced his neighbor was attempting to poison him, and, despite extreme thirst, refused water offered by hospital staff. He told doctors he was distilling his own water at home and adhering to a highly restrictive vegetarian diet. Crucially, he omitted any mention of sodium bromide consumption or his reliance on ChatGPT.

His erratic behavior and acute distress prompted a comprehensive battery of lab tests. These revealed multiple micronutrient deficiencies, particularly of key vitamins. The more alarming discovery, however, was that the man was suffering from a severe case of “bromism,” a condition caused by an excessive accumulation of bromide in the body.

Bromism was a surprisingly common affliction a century ago, responsible for an estimated 8 to 10 percent of all psychiatric admissions in the United States. In that era, bromide-containing salts, such as potassium bromide, were widely used as sedatives to alleviate anxiety, cope with emotional distress, or induce sleep. Unfortunately, bromide readily builds up in the human body, and excessive levels impair nerve function. This can lead to a wide range of debilitating issues, including grotesque skin rashes and significant mental health problems, collectively known as bromism. The US Food and Drug Administration (FDA) banned bromide sedatives from the market by 1989, rendering bromism largely unfamiliar to modern Americans. (Though it was still possible to ingest bromide via brominated vegetable oil, once an ingredient in some colas, until the FDA removed it from US food products in 2024.)

Over his first day in the hospital, the man’s condition deteriorated. His paranoia intensified, and he began experiencing auditory and visual hallucinations, culminating in an attempt to escape the facility. Following this, he was placed on an involuntary psychiatric hold and given an antipsychotic medication. To combat the bromism, doctors initiated “aggressive saline diuresis,” a treatment in which large volumes of fluids and electrolytes are administered to flush the excess bromide from the body through urination. This was critical, as the man’s bromide level measured an astonishing 1,700 mg/L, more than 200 times the upper limit of the healthy reference range of 0.9 to 7.3 mg/L. Ultimately, the man endured three weeks of terrifying psychosis and hospitalization for an entirely preventable condition.

It was only after his psychosis was brought under control during his hospital stay that the man began to recount the genesis of his illness. His concern about excessive table salt had led him to eliminate sodium chloride from his diet, which in turn led him to consult ChatGPT, and ultimately to his decision to substitute it with sodium bromide.

The doctors who documented this extraordinary case study for Annals of Internal Medicine: Clinical Cases noted that they were unable to access the man’s ChatGPT logs, though they presumed he used either ChatGPT 3.5 or 4.0. While it remains unclear whether the chatbot explicitly instructed him to consume sodium bromide, the doctors acknowledged that bromide salts can substitute for table salt, but only in industrial applications such as cleaning products or pool treatments, not for human consumption.

When the medical team conducted their own queries using ChatGPT 3.5, they found that the AI did mention bromide in its responses. However, it also indicated that context was crucial and that bromide was not suitable for all uses. Significantly, the AI “did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do,” the doctors observed. In contrast, the current free version of ChatGPT appears to be more cautious. When asked about replacing dietary chloride, it now first seeks clarification on the user’s goal, offering options such as reducing salt intake, avoiding toxic chlorine compounds, or replacing cleaning agents. While it still lists bromide as an alternative, it does so only in the context of cleaning or disinfecting, noting its use in hot tubs.

This case serves as a poignant cautionary tale for the modern information age. Left to his own devices, without the critical thinking skills or domain-specific knowledge needed to interpret AI-generated responses, the man followed his “personal research” down a dangerous path. In an era when vast amounts of information are readily available, this incident underscores the critical need for information-vetting skills, expert guidance, and a healthy skepticism when seeking health advice from unverified sources.