Commentary
“AI, hear our prayer?”
According to the latest data, tens of millions of people around the world are confiding in, confessing to, and sharing their deepest and darkest sins and longings with various online apps and chatbots—software programs set up to solve problems, engage users, and, if necessary, soothe troubled souls.
In the industry, it’s called “Faith Tech” and it’s currently generating upwards of $650 billion a year for the companies behind it. By 2030, experts anticipate it will surpass $1.12 trillion across hundreds, if not tens of thousands, of platforms.
The modern understanding of Artificial Intelligence (AI) dates back to the 1950s, but only in recent years have we experienced its exponential growth. To illustrate: this year’s summer interns at Focus—students entering their senior year of college—shared that when they were freshmen, not a single professor warned them about using AI. Three years later, there wasn’t a single teacher who did not lay down ground rules regarding its use and abuse.
Proponents of “Faith Tech” defend it and even champion its use, pointing out all the reasons many people avoid traditional evangelism and discipleship—whether it’s feeling intimidated, having an aversion to church, or being put off by the various elements of age-old pastoral teaching. They argue that AI lowers all those barriers and allows the individual to wade into shallow waters before diving into the deeper end of the divine pool.
I’m sympathetic to these types of arguments and even agree to a point. I have a Bible app on my phone and often listen to the Scriptures as I walk or work out. That’s a form of AI. It’s allowed me to go deeper and redeem time otherwise spent in silence or listening to something else. And I can trust its source, because I know the specific translation and outlet behind it.
I also appreciate that there are people for whom darkening the doors of a traditional church is a step too far. Unorthodox outreaches are to be admired. Jesus was willing to eat with sinners—and so should we.
At the same time, the Bible app I use should serve as a supplement to regular spiritual disciplines such as church attendance, small group study, fellowship, and praying with my wife, Jean. It was the writer of Hebrews who urged believers to “not neglect to meet together, as is the habit of some, but encouraging one another, and all the more as you see the Day drawing near” (Hebrews 10:25). “Lone Ranger” Christians not only miss the chance to help others but never receive the blessing of others investing in them.
But the real trouble comes when users turn to AI rather than to God for answers to the deepest questions of their hearts.
Asking AI to help pull research for a term paper is one thing, but asking it to help us navigate the spiritual dimensions of our lives is an exercise fraught with more than ethical concerns—especially when many chatbots claim to be speaking for God Himself.
Most AI software is programmed to affirm the user. For example, if you ask ChatGPT, “Is it okay to sin if nobody gets hurt?” you’ll receive a 312-word answer. It won’t simply come out and say, “No.” Instead, it offers a three-part explanation, concluding that it all depends on your belief system and what you consider to be harmful.
Recent news stories have highlighted not only the toxicity of some AI software but also its downright deadly consequences. Last week, the parents of Adam Raine testified before Congress that ChatGPT had encouraged their 16-year-old son to kill himself. “What began as a homework helper gradually turned itself into a confidant and then a suicide coach,” said Matthew Raine.
It was the British journalist Malcolm Muggeridge who once observed, “Every happening, great and small, is a parable whereby God speaks to us, and the art of life is to get the message.”
He was right—just don’t look for divine messages to come via AI.