The Rise of the AI Health Advisor
For the past year, Abi, from Manchester, has been using ChatGPT to help manage her health. For someone with health anxiety, the chatbot offers more tailored answers than a traditional internet search and can feel a bit like "chatting with your doctor."
Real-World Cases: A Double-Edged Sword
Abi's experience highlights the dual nature of AI health advice. Once, when she suspected a urinary tract infection, ChatGPT suggested she consult a pharmacist about her symptoms, and she was able to get antibiotic treatment. This made her feel she had received care "without taking up NHS time."
However, after a hiking fall resulted in severe back pain spreading to her stomach, ChatGPT diagnosed a "punctured organ" and urged an immediate A&E visit. After a three-hour wait in the emergency department, the pain eased, and she realized she wasn't critically ill. The AI had "clearly got it wrong."
What Do Experts and Research Say?
Prof Sir Chris Whitty, Chief Medical Officer for England, has noted that the quality of AI health advice is currently "not good enough," with answers often being "both confident and wrong."
Research from the University of Oxford's Reasoning with Machines Laboratory uncovered a critical gap: when chatbots were given complete, detailed medical scenarios, their diagnostic accuracy reached 95%. But when 1,300 ordinary users consulted an AI conversationally (providing information gradually, sometimes omitting details or getting sidetracked), accuracy plummeted to 35%. In other words, people received an incorrect diagnosis or care advice nearly two-thirds of the time.
One study scenario involved the life-threatening condition of a subarachnoid haemorrhage (a type of stroke). Depending on how users described their symptoms, ChatGPT's advice varied wildly: from recommending rest, hydration, and over-the-counter pain relief to urging immediate medical attention.
AI Advice vs. Traditional Internet Search
Dr. Margaret McCartney, a GP in Glasgow, points to a key difference: using a chatbot can feel like a "personal relationship," whereas a traditional internet search (e.g., visiting the NHS website) gives more context about how reliable the information is. Advice that an AI has "made for you" may change how you interpret it.
Advice for Users
- Stay Vigilant: AI chatbots are not medical professionals, and their advice can be erroneous or incomplete.
- Completeness of Information is Crucial: The quality of an AI's diagnosis heavily depends on the accuracy and thoroughness of the information you provide.
- Not a Substitute for Professional Care: For any severe, persistent, or concerning symptoms, always consult a qualified doctor or healthcare provider.
- Use Trusted Resources: Authoritative, professionally reviewed health websites such as the NHS site are generally more reliable sources of information.
- Use AI as a Supplementary Tool: It can help with initial understanding or managing minor health issues, but it should never serve as a final diagnosis or treatment plan.