The University of Washington described a case in which a man developed poisoning and psychosis after following dietary advice from ChatGPT. He took sodium bromide for three months, believing it would reduce the harm from table salt. As a result, he developed bromide intoxication; bromide is a substance that was once used in medicine but was abandoned because of its toxicity.
The man went to the hospital with paranoia, hallucinations, and a refusal to drink water. Doctors diagnosed him with psychosis and admitted him to a psychiatric unit. After treatment with medication and IV fluids, his condition improved, and he was discharged three weeks later.
According to the doctors, ChatGPT most likely misinterpreted his request, advising him to replace chloride (ordinary table salt) with bromide without warning about the dangers.
News materials cannot be treated as a doctor's prescription. Before making a decision, consult a specialist.