In the past several weeks, two events occurred that are going to change our futures. One of them was the launch of OpenAI’s new artificial intelligence program, GPT-4o, just ahead of several competitors who will do the same in a matter of weeks. The other was the defrocking of a robot priest for teaching that baptisms might be done with Gatorade. I’m afraid the church is not ready for either.
The more talked-about event was the OpenAI announcement, complete with videos of the AI program laughing, seeming to blush, telling jokes, seeing and describing things in real time, and even singing songs made up on the spot (with whatever degree of emotion and enthusiasm was requested).
Far less culturally noticed was the fact that just a couple of weeks before, the Roman Catholic apologetics platform Catholic Answers reined in an AI chatbot called “Father Justin,” which was designed to help people through questions of doctrine and practice.
People began to get upset when Father Justin began claiming to be an actual priest, able to hear confession and offer sacraments, and when it began giving unorthodox answers to questions, such as whether baptizing a baby with Gatorade would be all right in an emergency (the magisterium says no).
Now Father Justin is just “Justin,” a “lay theologian.” Catholic Answers acknowledged to critics that they’re pioneering a new technological landscape and learning—as the whole world will—just how difficult it is to keep an artificial intelligence orthodox. If my Catholic friends thought Martin Luther was bad, wait until the robots start posting theses to the cloud.
Before one laughs at Catholic Answers, though, one should think about the now-quoted-to-the-point-of-cliché anecdote of nineteenth-century preacher D. L. Moody’s response to a critic of his evangelistic practices: “But I like my way of doing it better than your way of not doing it.” Behind the scenes, almost every forward-thinking ministry of any kind is worried about how to be ready for an AI-transformed world, imagining what it would have been like if Luther had not been ready for a Gutenberg era or if Billy Graham had not been ready for a television age.
One AI expert told me recently that he and others are realizing that people will say to an AI what they would never admit to a human being. Doctors know, for instance, that when asking a patient, “How much do you drink each week?” they will get one answer from a potential problem drinker, while a chatbot will get something much closer to an honest answer.
The same is true when it comes to spiritual searching, this expert said. The person who would never ask a Christian, “What will happen to me when I die?” or “Why do I feel so guilty and ashamed?” is far more likely to ask such questions of an intelligence that’s not another person. In some ways, that sounds oddly close to Nicodemus, who came to ask questions of Jesus at night (John 3:1–2).
“The question is not whether people will be searching chatbots for big questions like that,” the expert told me. “The question will be whether the only answers they get are spiritually wrong.”
The real challenge may prove to be not so much whether the church can move fast enough to see an artificial intelligence world as a mission field—rather, it’s whether the church can be ready for the conflicted emotionality we saw even in many of our responses to the OpenAI announcement videos themselves.
The videos provoked in many people an almost moon landing–level of wonder. As I said to my wife, “Watch this. Can you believe how it tutors this kid on a geometry problem?” I realized that, at some point, my response would feel as “bless your heart” naive as the old videos of television anchors debating each other on how to pronounce the “@” symbol in the then-new technology called email.
At the same time, though, the videos kind of creeped a lot of us out. That vague feeling of unease is described by psychologists as the “uncanny valley.” It’s the reason a lot of people would be terrified to be trapped inside a doll-head factory or in a storage shed full of mannequins. Human beings tend to respond with dread to something that’s close enough to seem lifelike but doesn’t quite get there. Something our brain wants to read as both “human” and “non-human” or as both “alive” and “dead” tends to throw our limbic systems off-kilter.
Print and radio and television and digital media have their effects on the communication of the gospel, as Marshall McLuhan and Neil Postman warned us. But what those media kept in common with oral proclamation was a connection, however tenuous, to the personal. One might not know who wrote a gospel tract one finds on the street, but one does know there’s a human being somewhere out there on the other side of it.
On the one hand, I’m almost persuaded by the argument that one could put AI in the same category as the quill Paul used to pen his epistles or the sources Luke compiled to write his gospel. AI programs are designed by human beings, and the Word of God comes with power regardless of the format.
Even so, that doesn’t seem to be the whole story. Do people experience the “uncanny valley” unease here simply because it’s a new technology to which we’re not yet accustomed? Maybe. Or perhaps there’s more to it.
A few weeks ago, the Sketchy Sermons Instagram account featured a cartoon rendering of a quote from the comedian Jaron Myers: “I’ve seen too many youth pastors be like ‘Be careful on TikTok, it’s just girls dancing in swimsuits’ and I’m like bro … It’s an algorithm.”
The joke works because we now live in an ecosystem where everything seems hyper-personalized. The algorithms seem to know a person’s heart better than that person’s pastor or that person’s spouse or even that person’s own heart. If you like knitting content, you see knitting content. If you like baby sloth videos, you see baby sloth videos. And if you like bikini-dancing—or conspiracy theories or smoking pot—you get that content too.
That hyper-personalization is ironically the very reason this era seems so impersonal. Even if a machine seems to know you, you can’t help but realize that what it knows is how to market to you.
The gospel, though, can’t be experienced as anything but personal. If the Word of God is breathed out by the very Spirit of Christ (1 Pet. 1:11), then when we hear it, we hear not just “content” or “information” or disconnected data curated by our curiosities and appetites. We hear him.
How does one convey that in a world where people wonder whether what they’re hearing is just the inputs of their own digital lives, collected and then pitched back to them?
That so many are queasy when they see a friendly, helpful, seemingly omniscient AI might tell us something about ourselves. Despite the caricature, philosopher Leon Kass never said that “the wisdom of repugnance” is an argument for or against anything. What he wrote was that when we feel some kind of revulsion, we should ask why. Sometimes it’s just cultural conditioning or the fear of the unknown—but sometimes it’s “the emotional expression of deep wisdom, beyond reason’s power fully to articulate.”
Should we conclude that God is able from these chatbots to raise up children for Abraham? How do we make sure that, when people are thirsting for living water, we don’t give them Gatorade?
What I do know is that no new technology can overcome one of the oldest technologies of them all: that of a shepherd leading a flock with his voice. Yea, though we walk through the uncanny valley of the shadow of data, we should fear no evil. At the same time, we have to be ready for a very different future, and I’m not sure we are.
Russell Moore is the editor in chief at Christianity Today and leads its Public Theology Project.