Religion and AI: Humans aren’t ready to accept robot preachers, study reveals

CHICAGO — Who knew the word of God would one day be delivered by artificial intelligence? They say the Lord works in mysterious ways, but new findings suggest congregations far and wide just aren’t ready to hear sermons from robot preachers.

Researchers working with the American Psychological Association report that while robot preachers and AI may offer some new advantages when it comes to sharing religious beliefs, these new technologies also tend to undermine the credibility of and reduce donations for religious groups that rely on them.

“It seems like robots take over more occupations every year, but I wouldn’t be so sure that religious leaders will ever be fully automated because religious leaders need credibility, and robots aren’t credible,” says lead researcher Joshua Conrad Jackson, PhD, an assistant professor at the University of Chicago in the Booth School of Business, in a media release.

These findings are based on an experiment conducted with the Mindar humanoid robot at the Kodai-Ji Buddhist temple in Kyoto, Japan. The robot features a humanlike silicone face with moving lips and blinking eyes on a metal body. It delivers 25-minute Heart Sutra sermons on Buddhist principles – complete with surround sound and multimedia projections.

The robot was created in 2019 by a Japanese robotics team in partnership with the temple, and cost close to one million dollars to develop. After all that, though, the study suggests Mindar may actually be reducing donations to the temple.


Study authors surveyed 398 participants leaving the temple after hearing a sermon delivered by either Mindar or a human Buddhist priest. Those who heard Mindar saw the robot as less credible and gave smaller donations than those who heard a sermon from a human priest.

Another experiment was held in a Taoist temple in Singapore. Half of the 239 participants heard a sermon by a human priest, while the other half heard the same sermon from a humanoid robot named Pepper. Sure enough, this experiment produced results similar to the first: the robot was seen as less credible and inspired smaller donations. Moreover, people who heard the robot’s sermon told researchers they were less inclined to share its message or distribute flyers to support the temple.

Still, all things considered, the robot preachers fared well enough. While participants generally preferred hearing from a human, it was still a close contest with the robots. On a scale from one to five, with five being the most credible, robot preachers received an average credibility rating of 3.12. Humans received a score of 3.51.

“This suggests that there are a lot of people out there who think robots could be effective preachers, but there are more people who aren’t convinced,” Prof. Jackson explains.

While these experiments focused entirely on Eastern religions, Prof. Jackson theorizes the findings may apply to other religions.

A third experiment encompassed 274 Christian participants from the United States who read a sermon online. Half of that group were told a human preacher wrote the sermon while the other half believed they had read something produced by an advanced AI system. Participants in the AI sermon group reported seeing the sermon as less credible. Why? They felt AI still does not have the capacity to think or feel like a human.

“Robots and AI programs can’t truly hold any religious beliefs so religious organizations may see declining commitment from their congregations if they rely more on technology than on human leaders who can demonstrate their faith,” Prof. Jackson concludes.

The study is published in the Journal of Experimental Psychology.

About the Author

John Anderer

Born blue in the face, John has been writing professionally for over a decade and covering the latest scientific research for StudyFinds since 2019. His work has been featured by Business Insider, Eat This Not That!, MSN, Ladders, and Yahoo!

Studies and abstracts can be confusing and awkwardly worded. He prides himself on making such content easy to read, understand, and apply to one’s everyday life.
