Parents trust ChatGPT over doctors, shocking new study claims

Paging Dr. Bot.

In a new study, researchers at the University of Kansas’ Life Span Institute found that parents trust artificial intelligence (AI), such as ChatGPT, more than healthcare professionals.

“Participants found minimal differences between the vignettes written by the experts and those generated by the rapidly designed ChatGPT,” says Calissa Leslie-Miller, a doctoral student in clinical child psychology at the university and lead author of the study. “When vignettes were statistically significantly different, ChatGPT was rated as more trustworthy, accurate, and reliable.”

Parents trust ChatGPT’s advice over medical professionals. Maria Sbytova – stock.adobe.com

The team conducted a study with 116 parents aged 18 to 65 who were given health-related texts for children.

Each participant analyzed the content and determined whether they believed it was produced by ChatGPT or healthcare professionals without knowing the original author.

Although the study did not examine why parents trusted ChatGPT more, it details factors that may contribute to their preference.

Jim Boswell, president and CEO at OnPoint Healthcare Partners, who has experience developing an AI-based platform, believes ChatGPT’s straightforward approach to presenting information makes it easier for people to digest.

“I can understand why [parents], not knowing the source, would prefer the AI formulation,” says Mordechai Raskas, MD, EdM, chief medical information officer and director of telemedicine at PM Pediatric Care. “Think of AI as the ultimate salesperson; it knows exactly what to say to win you over.”

Parents prefer to rely on AI because they can get quick answers to their questions without waiting for a doctor’s appointment.

Each participant analyzed the content and determined whether they believed it was produced by ChatGPT or healthcare professionals without knowing the original author. Kaspars Grinvalds – stock.adobe.com

However, while using ChatGPT can be a quick fix for many parents, it does come with some drawbacks.

“The information may be inaccurate or not adapted to specific circumstances. For example, suggesting medication to a child who is too young or providing the wrong treatment advice can lead to a wide range of dangerous outcomes,” says Leslie-Miller.

Experts suggest checking the sources behind AI-generated answers or consulting a medical professional before acting on them.

“Reputable health content typically gives credit to qualified medical writers or health professionals and links to research-backed resources,” adds Boswell.

Parents enjoy the quick response that ChatGPT gives them without having to wait. AnnaStills – stock.adobe.com

Artificial intelligence like ChatGPT collects information from various sources on the Internet and summarizes it into a response. But when it comes to AI health information, these responses lack a medical expert’s opinion that is personalized to the patient.

“Relying on these tools for medical advice can lead to missed symptoms, misinterpretations of serious conditions, or delays in seeking appropriate care,” says Boswell. “For children, in particular, minor health issues can quickly escalate, so it’s essential to have a qualified professional assess a situation.”

Leslie-Miller recommends that parents also use trusted online medical resources such as the American Academy of Pediatrics (AAP), the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO). Some hospitals also provide health information and advice from their health care providers.

“Reading online and researching can be very helpful,” says Dr. Raskas. “It just depends on the context and should be in conjunction with a trusted source or professional to help digest what you’ve read.”

Image source: nypost.com
