When searching for a cure for common ailments, most people turn to search engines like Google or even scan social media platforms like TikTok for remedies. However, those sources aren’t always vetted for reliability. Now, with artificial intelligence rapidly evolving, a new study from FIU Business explores the potential of ChatGPT in providing health-related information.
Led by Pouyan Esmaeil Zadeh, associate professor of information systems and business analytics, the study examined how users perceive AI-generated health guidance and whether they would act on it, applying the Uses and Gratifications Theory (UGT) to gauge satisfaction.
“We want to understand why people may use ChatGPT or conversational AI in order to solve some of their health care problems,” said Esmaeil Zadeh.
The study, forthcoming in Computers in Human Behavior: Artificial Humans, surveyed 231 U.S. participants to analyze their interactions with ChatGPT when seeking information on common ailments, in this case back pain.
“We asked participants to either imagine themselves in a hypothetical health-related situation or reflect on any existing health concerns they might have. We then requested them to use ChatGPT to ask relevant health-related questions,” Esmaeil Zadeh explained.
The findings reveal that both practical and emotional factors (termed utilitarian and hedonic in UGT) significantly influence user satisfaction and willingness to trust AI-generated medical advice.
Survey results indicated that 78% of participants found ChatGPT’s responses to be clear and actionable, with higher satisfaction reported among those who engaged in longer conversations exceeding four prompts.
The study also revealed that users prioritize functional and practical benefits over emotional gratification when using ChatGPT for healthcare information.
“People first care about the quality of information,” said Esmaeil Zadeh, noting that ChatGPT can also cite sources or provide articles linking to the information it generates.
The study further highlighted that users prefer AI-generated responses that are concise, jargon-free and easy to understand.
But some respondents noted a need for good bedside manner from the chatbot. The research found that a lack of empathic communication could reduce motivation to engage with AI-powered healthcare tools. This underscores the value of building engaging, human-like dialogue into conversational AI.
Esmaeil Zadeh said that ChatGPT is particularly useful for non-emergency situations where users seek general information, advice or clarification, and it can help them avoid an insurance co-pay.
“Instead of booking appointments or contacting healthcare providers for minor concerns, users can get relevant information through ChatGPT,” said Esmaeil Zadeh. “This makes it a cost-effective and convenient alternative to individual consultations.”
However, generative AI still may not be a reliable option for addressing serious health conditions. Esmaeil Zadeh hopes developers will use this study to inform the fine-tuning of AI tools for healthcare, making them as comprehensive and user-friendly as possible.
“That's the way that you build trust,” said Esmaeil Zadeh. “It is important for developers to recognize the utilitarian and hedonic components required for healthcare-related platforms. When users prefer a single AI-based platform over multiple websites or databases, they expect clear, up-to-date, and reliable information to support their decision-making.”