ChatGPT can be a practical helper in many everyday situations. Now, a user reports that the chatbot helped him in an emergency.
Chatbots like ChatGPT are a controversial topic: they can be practical everyday helpers, but people also worry about losing their jobs to them. Now, a user recounts how he saved his father's life with the chatbot's help. Still, you shouldn't necessarily replicate what he did.
Father suffers heart attack, ChatGPT’s instructions allegedly save his life
How did the user save his father? A user recounted on Reddit that his father suffered a heart attack in front of the television. He first alerted the emergency services and called an ambulance, and only then asked ChatGPT for help.
He says that ChatGPT's instructions probably saved his father's life, because otherwise he would have reacted incorrectly and, out of panic, likely performed chest compressions without pause:
(…) I told the story [to the chatbot] and asked for help. [Chat]GPT gave me instructions on chest compressions and on how to handle the problem I had. Out of fear and panic, I probably would have done nonstop compressions the whole time, but I learned that I should wait and sometimes listen, etc.
The ambulance came and took my father away. He is alive. The doctor said I saved him with the right heart massage.
That sounds like a dramatic rescue with a happy ending, aided by AI. Nevertheless, some commenters, and the poster himself, point out that AI assistance in emergencies should remain the exception, because chatbots are known to make serious mistakes at times.
Do not rely on AI chatbots in dangerous situations
What is the problem with ChatGPT and other chatbots? All AI-powered chatbots are error-prone. Anyone who has worked with such programs knows the issues: answers are not always correct, and ChatGPT often presents false information as fact. In a situation like this, that can be all the more dangerous. Paramedics and professionally trained staff make far fewer mistakes.
What does the user say about it? The thread creator warns against replicating his experience and relying on a chatbot in dangerous situations:
I think I should clarify that I do not recommend that anyone rely on AI-generated instructions when facing a dangerous problem like I did.
Another user adds in the comments: the story is great, but emergency dispatchers are usually trained to give such instructions faster and more efficiently. Moreover, lifesaving help should not depend on an AI's premium subscription.
And in life-threatening situations, you should always call emergency services first before turning to AI or other tools.