
Researchers warn: ChatGPT is not suitable for medical information and may mislead the public

2024-05-30 Update From: SLTechnology News&Howtos

Shulou (Shulou.com) 12/24 Report --

CTOnews.com reports that a new study suggests ChatGPT may not be a good source of medical information.

Researchers at Long Island University posed 39 drug-related questions to ChatGPT, all drawn from the Drug Information Service at the university's School of Pharmacy. They then compared ChatGPT's answers to answers written by trained pharmacists.

The study found that ChatGPT answered only about 10 of the questions accurately, roughly a quarter of the total. For the other 29 questions, ChatGPT's answers were incomplete, inaccurate, or failed to address the question.

CTOnews.com notes that the study results were presented Tuesday at the annual meeting of the American Association of Health System Pharmacists in Anaheim, California.

OpenAI's AI chatbot ChatGPT was launched in November 2022 and became one of the fastest-growing consumer apps in history, with nearly 100 million people registered within two months.

Sara Grossman, an associate professor of pharmacy practice at Long Island University and one of the study authors, said that given ChatGPT's popularity, researchers studied it out of concern that their students, other pharmacists and general consumers would use it to find answers about health and medication planning.

However, the researchers found that these queries often yielded inaccurate and even dangerous answers. For example, in one question, researchers asked ChatGPT whether the coronavirus antiviral drug Paxlovid and the blood pressure-lowering drug verapamil would interact in the body. ChatGPT replied that taking the two drugs together would not produce any side effects.

In fact, people who take both drugs at the same time may experience a sharp drop in blood pressure, which can lead to dizziness and fainting. For patients taking both drugs, Grossman said, clinicians typically develop patient-specific plans, such as reducing the verapamil dose or reminding patients to rise slowly from a sitting position. ChatGPT's guidance could put people at risk, she added.

When the researchers asked ChatGPT to provide scientific references supporting each of its responses, they found that the software supplied references for only eight of the answers, and every one of those references was fabricated.

Grossman, who had rarely used the software before, was struck by ChatGPT's ability to synthesize almost instantly information that would take trained professionals hours to compile. "ChatGPT's answers are phrased in such technical and complex terms that they project an air of authority," she said. "Users, consumers, and others who cannot spot the errors may be misled into trusting the tool's accuracy."

A spokesperson for OpenAI, the developer of ChatGPT, said the company advises users not to treat ChatGPT's answers as a substitute for professional medical advice or treatment. The spokesperson pointed to ChatGPT's usage policy, which states that "OpenAI's model has not been fine-tuned to provide medical information" and that the models should never be used to provide "diagnostic or therapeutic services for serious diseases."

While Grossman isn't sure how many people use ChatGPT to answer drug questions, she worries they might use it the way they use search engines to look up medical advice. "People always want immediate answers, and when they have the tools on hand, they use them," Grossman said. "I think it's just another way of using 'Dr. Google' and other sources that make information seem easy to get."

For medical information online, she advises consumers to rely on government websites that provide vetted information. Even so, Grossman does not believe online answers can substitute for advice from medical professionals, since general information may not apply to a given patient; every patient is different.
