Study Shows AI Chatbots Can Be Dangerous for Health Advice

(MENAFN) Relying on artificial intelligence (AI) chatbots for medical advice can be “dangerous,” according to a study reported on Tuesday.

The research, conducted by the Oxford Internet Institute and the Nuffield Department of Primary Care Health Sciences at the University of Oxford, found that using AI for medical decision-making poses risks due to its “tendency to provide inaccurate and inconsistent information.”

Rebecca Payne, a co-author of the study and a general practitioner, said, “Despite all the hype, AI just isn’t ready to take on the role of the physician.” She added, “Patients need to be aware that asking a large language model about their symptoms can be dangerous, giving wrong diagnoses and failing to recognize when urgent help is needed.”

In the study, nearly 1,300 participants were asked to identify possible health conditions and suggest next steps across different scenarios. Some used large language model software to obtain potential diagnoses, while others consulted a GP or relied on traditional methods.

Researchers found that AI often produced a “mix of good and bad information” that users had difficulty distinguishing. While chatbots “excel at standardized tests of medical knowledge,” the study concluded that their real-world application “would pose risks to real users seeking help with their own medical symptoms.”

Lead author Andrew Bean noted that interacting effectively with humans remains “a challenge” even for top AI systems and expressed hope that the findings would guide safer development of such tools.


Legal Disclaimer:
MENAFN provides the information “as is” without warranty of any kind. We do not accept any responsibility or liability for the accuracy, content, images, videos, licenses, completeness, legality, or reliability of the information contained in this article. If you have any complaints or copyright issues related to this article, kindly contact the provider above.

