Chatbot bias: Do human patients judge AI medical avatars based on their appearance?

AURORA, Colo. — Are healthcare chatbots actually encountering racial bias from patients? Chatbots and other forms of artificial intelligence are making their way into almost every industry, especially healthcare. Now, researchers from the University of Colorado School of Medicine are looking into patients' experiences with these AI systems to see whether people change the way they act and respond to medical chatbots based on their perceptions of the bot's appearance.

“Sometimes overlooked is what a chatbot looks like – its avatar,” the researchers explain. “Current chatbot avatars vary from faceless health system logos to cartoon characters or human-like caricatures. Chatbots could one day be digitized versions of a patient’s physician, with that physician’s likeness and voice. Far from an innocuous design decision, chatbot avatars raise novel ethical questions about nudging and bias.”

“If chatbots are patients’ so-called ‘first touch’ with the health care system, we really need to understand how they experience them and what the effects could be on trust and compassion,” says Annie Moore, MD, MBA, a professor of internal medicine, in a university release.

The researchers first took note of the rise in chatbot use during the COVID-19 pandemic. Since then, they have come to see it as critical to examine the bioethics of these tools and to make sure the technology actually improves patient outcomes.

“Many health systems created chatbots as symptom-checkers,” says Associate Professor Matthew DeCamp. “You can go online and type in symptoms such as cough and fever and it would tell you what to do. As a result, we became interested in the ethics around the broader use of this technology.”

The team points out that while avatars are often treated as little more than marketing tools, their appearance can play a bigger role than one might initially assume.

“One of the things we noticed early on was this question of how people perceive the race or ethnicity of the chatbot and what effect that might have on their experience,” adds DeCamp. “It could be that you share more with the chatbot if you perceive the chatbot to be the same race as you.”


In an attempt to answer many of their ethical questions, the team surveyed more than 300 people and interviewed 30 others about their interactions with bots in healthcare. So far, the evidence suggests that people are more willing to share information with chatbots than with actual humans. That willingness, however, raises a harder question. "We can manipulate avatars to make the chatbot more effective, but should we? Does it cross a line around overly influencing a person's health decisions?" asks DeCamp.

Chatbots could also reinforce stereotypes. For example, bots that display feminine features may reinforce biases about women's roles in healthcare. On the other hand, if patients are able to choose the avatar they want to talk to, chatbots could be viewed as more trustworthy among certain patient populations, specifically those that have historically had less access to healthcare. This in turn could make some patients more likely to adhere to medical recommendations.

“This is not surprising,” the researchers write in their report. “Decades of research highlight how patient-physician concordance according to gender, race, or ethnicity in traditional, face-to-face care supports health care quality, patient trust, and satisfaction. Patient-chatbot concordance may be next.”

The research team is now encouraging the medical community to recognize the nuances of using chatbots in healthcare and to ensure that these tools are deployed in ways that advance health equity.

“Addressing biases in chatbots will do more than help their performance,” the study authors conclude. “If and when chatbots become a first touch for many patients’ health care, intentional design can promote greater trust in clinicians and health systems broadly.”

The findings are published in the journal Annals of Internal Medicine.
