A 60-year-old American wanted to improve his diet using ChatGPT – and ended up in the emergency room with a poisoning that almost no one knows today.
What happened? After reading about the potential downsides of regular table salt (sodium chloride), the man decided to eliminate salt from his diet completely. Since he found little helpful information, he turned it into a sort of self-experiment and asked ChatGPT for advice. According to his account, the chatbot suggested replacing chloride with bromide – so he simply swapped sodium chloride for sodium bromide in his kitchen. The result: bromism (via acpjournals).
A disease from the 19th century
What is bromism? Bromism is a largely unknown disease today, as it mostly occurred in the 19th century. At that time, medications containing bromide salts were widespread – they were used to treat insomnia, headaches, to dampen libido, and for other medical purposes (via The American Psychiatric Publishing Textbook of Substance Abuse Treatment).
A common consequence of administering these medications was bromism, recognizable primarily through neurological and psychiatric symptoms. Such was the case with the 60-year-old man, who is the subject of a case study published in the Annals of Internal Medicine (via acpjournals).
With the social and medical developments of the 20th century, the frequency of these poisoning symptoms significantly decreased. Nowadays, bromism is a very rare diagnosis.
What consequences did the ingestion have? After about three months, the man developed neurological and psychiatric symptoms: confusion, paranoia, and hallucinations. He reportedly went to the emergency room convinced that his neighbor was poisoning him. His paranoid reactions and hallucinations are said to have required psychiatric treatment.
Only after stabilization through fluid administration and improvement of his mental state with antipsychotics was he able to inform the doctors that he had taken bromide. Additionally, he reported characteristic symptoms such as facial acne, red skin lesions, insomnia, fatigue, coordination problems, and excessive thirst – all signs that his doctors ultimately classified as typical for bromine poisoning (bromism) (via LiveScience).
With intensive fluid therapy, electrolyte balance, and cessation of bromide salts, the symptoms almost completely subsided within a few weeks, allowing the patient to be discharged shortly thereafter.
In general, it is known that bromism usually diminishes after cessation, and mental impairments are often reversible (derStandard).
What role did ChatGPT and OpenAI play?
OpenAI denies responsibility: According to the case study, the treating doctors did not have access to the patient's original chat history. Journalists from derStandard and 404 Media tried asking ChatGPT similar questions – and indeed received answers in which bromide appeared as a "replacement" for chloride, without any clear warning that it is unsuitable for consumption (via derStandard).
LiveScience also inquired directly with OpenAI: A spokesperson referred to the terms of use, stating that AI responses do not replace professional advice and should not serve as the sole basis for health decisions (via LiveScience).
The terms of use do state:
Our Services are not intended for use in the diagnosis or treatment of any health condition. You are responsible for complying with applicable laws for any use of our Services in a medical or healthcare context.
Service Terms at Openai.com
The case shows how quickly a well-intentioned self-experiment with AI "help" can go wrong: one unverified, seemingly smart tip – and suddenly you find yourself with a diagnosis that belongs more to the 19th century than the 21st. That asking ChatGPT for help can also turn out well is shown by another man, who approached his fitness goals with the AI in an entirely different way – and actually lost 12 kilograms: Man is too lazy for sports, asks AI ChatGPT and loses 12 kilograms