As a pastor, most of my emails deal with the usual ministry matters: schedules for Bible study, comments about worship services, or misplaced Tupperware at a recent potluck. But lately, I’ve had several church leaders asking me questions about artificial intelligence (AI).
Some have requested resources on how to leverage its capabilities and avoid its dangers. Others have asked me for advice on how they can help their congregations avoid AI scams, like automated voice clones of their pastors calling to solicit money.
As an author of Redeeming Technology: A Christian Approach to Healthy Digital Habits, a PhD candidate in digital ecclesiology, and a pastor, I think often about emerging technologies and the church.
My Lutheran church tradition is not known for being particularly futuristic or technological—we once looked askance at lightning rods as an impediment to divine providence. But it’s not only Lutherans who are suddenly curious about AI. It seems like everyone is interested in AI today, including many who are worried about its dangers.
Geoffrey Hinton, known as the “godfather” of AI, recently quit his post at Google, citing concerns about the lack of policy surrounding the technology. The Supreme Court recently weighed in on a case about the regulation of internet algorithms. And hundreds of scientists, tech experts, and industry leaders recently signed a statement warning that AI poses a grave risk to humanity, comparable to pandemics and nuclear war.
Evangelical leaders have also produced statements, and AI experts have weighed in on how it may impact the future of theology and biblical interpretation. As pastors, we can help our congregations think through the potential impact of generative AI, for good and for evil.
For years, various forms of simple AI have been used in ministry: Church workers use it for voice dictation when composing sermons, Bible studies, or emails. Staff use it for remote check deposits of tithes and offerings. And pastors use it whenever they turn to search engines such as Google or Bing for sermon preparation or research.
Previous iterations of machine-learning AI applications could search and learn from massive datasets. But behind many recent headlines is a relatively new form of the technology known as generative AI, which uses the same mechanisms to create original content, including text, images, video, and audio. Consumer apps such as ChatGPT, DALL-E, and Murf are all popular generative AI applications, along with Bard, Google’s new AI integration.
It is not difficult to imagine how pastors and church leaders can use these tools for the work of daily ministry.
Google products like Docs and Gmail can now assist users with a variety of basic writing tasks. For instance, church staff can type in a few prompts about an open staff position, and the generative AI will create a job description that is nearly ready to post. And since many churches cannot afford to pay for custom graphic design work, image creation applications like DALL-E can help create a logo or supporting images for a sermon series.
The same goes for video. While it once took weeks and a substantial budget to create video content, generative AI platforms like Synthesia enable churches to create this content quickly and with minimal cost. For instance, staff can easily produce high-quality small group Bible study videos with voice-over audio and customized graphics.
Most of these programs help facilitate the ministry of the local church. But others can be more problematic.
Church staff can use AI chatbots to generate content for email correspondence with visitors or congregational newsletters—like a welcome note after someone attends a service and provides an email address. On the other hand, some congregants might be offended if a church leader uses generative AI to create personal messages, particularly if the messages are about sensitive matters.
Pastors or staff members can use ChatGPT to create sermon illustrations or entire sermons based on Bible verses that emphasize specific themes. But doing so can raise questions about plagiarism, authenticity, the creative process of sermon development, and even the nature of preaching itself. As Russell Moore said, “A chatbot can research. A chatbot can write. Perhaps a chatbot can even orate. But a chatbot can’t preach.”
There are also several potential scams church leaders should be aware of and warn their congregations about.
For example, a scammer can access audio content from livestream worship services or sermons posted online and upload the sample audio into a generative AI application (such as Murf, Speechify, or Resemble AI), which learns the tone, cadence, and inflection of a pastor’s voice. This is known as voice cloning. The scammer can then create audio that sounds like the pastor or church leader asking for money and use it to make calls to unsuspecting parishioners.
A deeper concern about generative AI is that it will cause massive problems through “deepfakes,” or fabricated images and videos of public figures. Scammers out to harm church leaders could make them appear to say or do morally compromising things. The opposite problem is also possible: Church leaders could engage in morally compromising activities and then claim that the evidence is a fabrication.
Fears about massive workforce shifts prompted by generative AI are also emerging. Generative AI poses an existential crisis for fields such as journalism, graphic design, and even law, since AI apps can already compose legal briefs and draft judicial rulings. Such occupations, which once seemed impervious to automation, could someday be replaced by AI. This may prompt anxiety among your parishioners about the future of their careers.
Thus, pastors and church leaders should avoid extremes when engaging with their communities on this topic. Publicly eschewing all forms of AI in the church is problematic, especially since many could already be using some form of this technology unwittingly.
Conversely, church leaders should avoid furtive uses of AI: If you’re unwilling to tell your church that something was composed using ChatGPT, then don’t use it. The best course of action is to foster a proactive and transparent conversation within your congregation about the ethical usage of generative AI.
Pastors can also offer their parishioners some historical context to address some of the contemporary concerns about this emerging technology.
New technology has always been a source of fear, sometimes especially so among Christians. In the 15th and 16th centuries, for example, the printing press was a culturally disruptive technology that many within the church initially feared and rejected, and yet it had an incredible impact on global Christianity. Some argue that the Bible software apps of the digital age have had a similar influence on the way we read Scripture today.
I’m of the mind that the short-term panic surrounding generative AI (although not all AI) is overblown. But there will be many ethical implications to address in the long term.
According to media theorist Marshall McLuhan’s four laws of media, a balanced approach to any new technological medium involves considering four key questions: What human function will be enhanced by this technology? What could be made obsolete? What may be retrieved from the past? And if pushed to its limits, how might this new technology reverse previous progress? These four lenses can be helpful when thinking through the potential impact of generative AI as it relates to the church.
On the positive side, content creation done in service of the gospel—which Christians have been doing for generations—might be broadly enhanced by generative AI. Similarly, the church has a long history of relying on subject-matter experts and polymaths. And in some sense, AI retrieves this historical pattern by functioning as a savant in service to the church.
But composition and design work—something churches have also been doing for millennia—could be made obsolete by this new technology. And some fear that if pushed to its limits, AI has the potential to do all our thinking for us and thus reverse the progress of human knowledge, ultimately making the world and the church less human.
Either way, it won’t be long before generative AI technology is woven into the background of our church lives—which is why it’s important to learn how to engage with it wisely rather than try to avoid it altogether.
A. Trevor Sutton is a Lutheran pastor in Lansing, Michigan. His most recent book is Redeeming Technology: A Christian Approach to Healthy Digital Habits (coauthored with Brian Smith, MD; Concordia Publishing House, 2021).