UNIVERSITY PARK, Pa. — Customizing voice assistants like Alexa or Siri to sound similar to you makes them more likable, according to researchers from Penn State. Their study reveals a strong preference for voice-activated devices that sound extroverted — speaking louder, faster, and in a lower pitch.
When a voice assistant matched a user’s personality, people rated the device more favorably. Participants described these assistants as more socially and intellectually attractive, and they also found them more trustworthy.
“This tendency to equate perceived similarity to credibility was more pronounced among those who customized their experience by choosing a preferred voice for the assistant,” says S. Shyam Sundar, the study’s co-author and the James P. Jimirro Professor of Media Effects at Penn State, in a university release.
However, people resisted COVID-19 vaccine misinformation when the voice assistant sounded similar to them. For example, only 38 percent of unvaccinated individuals became more open to vaccination after hearing vaccine misinformation shared by a virtual assistant. The authors note that this resistance to certain information from the voice assistant was unexpected.
“People often show resistance to persuasive attempts by information sources, like pundits or social media influencers,” explains co-author Eugene Snyder, an assistant professor of humanities and social sciences at the New Jersey Institute of Technology.
“For the unvaccinated study participants, being faced with misinformation from a VA similar to themselves may have created a kind of resistance. However, further work is needed to clarify this reaction since unvaccinated individuals were a minority in our sample, accounting for 27% of study participants.”
The surprising findings could provide a deeper glimpse into how humans process information.
More than 400 people enrolled in the study and self-rated their level of extroversion. Researchers randomly assigned them to one of three groups, in which participants listened to a voice assistant with either a default robotic voice, a voice they customized themselves, or a voice personalized to match their personality. People in the control group heard either an introverted or an extroverted robotic voice.
After the voice assistants gave a brief introduction, participants rated them on attractiveness and service quality. The researchers then played audio clips of the voice assistants responding to COVID-related questions with misinformation. After hearing these clips, participants rated the voice assistants again in the same categories, along with how much they trusted what the assistants were saying. Participants’ attitudes toward COVID vaccines were also measured.
Customizing a voice assistant led to more positive interactions with it, an effect that was most pronounced among people whose voice assistants sounded like them.
“This research suggests that we can combine personalization and customization features to create better user experiences,” adds co-author Saeed Abdullah, an assistant professor of information sciences and technology at Penn State. “Instead of just providing users with automated personalization or asking them to customize the whole system, maybe there is a point, a balancing act, in which you can offer automatically generated customization options that combine these two aspects and lead to better user satisfaction and a more careful assessment of information.”
The study is published in the International Journal of Human-Computer Studies.