Ideas

Should We Bring AI into the Church?

A church-tech skeptic talks values with technologists from faith-aligned AI company Gloo.

Visitors to St. Paul Church in Fürth, Germany, take part in a pre-recorded service created by ChatGPT in 2023.

Christianity Today May 28, 2025
Picture Alliance / Contributor / Getty

It’s strange to think that artificial intelligence as we now know it has been available to the public for less than three years. ChatGPT, the most famous of large language models (LLMs)—programs that can function as chatbots, create content, and perform administrative tasks based on conversational instructions from users—launched in late 2022. Since then, LLMs have rapidly progressed in their capabilities and are increasingly integrated into our digital lives.

But should they be integrated into our spiritual lives? Should pastors use LLMs to write sermons or reformat sermon content into small-group study guides? Should church websites have chatbots on hand to answer visitors’ questions around the clock? Is there a role for AI in discipleship and catechesis?

My own answer (others at CT may hold different views) to all those questions is no. While I do see some practical uses for AI—like transcribing the dialogue below, with a backstop of full human review—I’m deeply wary of proposals to bring LLMs and similar AI tools into the life of the church. I believe that chatbots have no place in spiritual formation or pastoral care and that even many simpler applications, like telling a potential visitor when the Sunday service starts, are already amply handled by regular old websites.

Those instincts made me curious about Gloo, a dominant player in the “Christian AI” space. The Colorado-based company brands itself as “the leading technology platform dedicated to connecting the faith ecosystem,” and last summer it announced $110 million in new venture capital funding.

Gloo promises “AI you can trust for life’s biggest questions,” a chatbot that “guides you to real answers and next steps for a life that matters.” An animation on its site suggests that users ask the chatbot “anything” about the Bible, politics, religion, relationships, and more.

Gloo offers “verified” answers from “reliable sources” that are “biblically sound” and “grounded in faith-based principles” so it can “guide you and those you care about with confidence.” But equally—on the same page, in the same list—the company offers personalization so individual users can get answers that reflect what they already believe.

“Customize your experience with filters that align with your values, ensuring every result reflects what matters most to you,” the site says next to an illustration apparently suggesting users can reject an information source for their personal AI interface.

This language left me wondering what it means to make what Gloo calls “values-aligned” AI. The company first connected me to its chief AI officer, Steele Billings. Our conversation centered on the technology, and Billings explained that Gloo uses Microsoft’s Responsible AI, then layers its own standards and training data from Christian sources on top.

That starts with Harvard’s Global Flourishing Study, which defines human flourishing in six dimensions: “happiness and life satisfaction, physical and mental health, meaning and purpose, character and virtue,” “close social relationships,” and “financial and material stability.” To that list, with help from evangelical polling firm the Barna Group, Billings told me, Gloo added a seventh dimension of faith and spirituality.

Of course, people of faith are famously prone to disagreement about the truth—thus the personalization. Gloo tools can be customized by individual, congregational and denominational users, Billings said: “One of the things that our technology does is it will go out when a church signs up” and “pull the statement of faith off the church website” so it can “answer questions that are aligned with that statement of faith.”
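What might that look like in practice? Purely as an illustration—not Gloo’s actual code, with every function name, element ID, and URL hypothetical—the pattern Billings describes, pulling a statement of faith off a church website and folding it into a chatbot’s instructions, could be sketched in a few lines of Python:

```python
# Illustrative sketch only (not Gloo's implementation): fetch a church's
# statement of faith from its website and fold it into the system prompt
# that steers a chatbot's answers. All names and URLs here are hypothetical.
import requests
from bs4 import BeautifulSoup

def fetch_statement_of_faith(url: str) -> str:
    """Pull statement-of-faith text from a church web page (assumed layout)."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Assume the site marks the statement up with a predictable element ID.
    section = soup.find(id="statement-of-faith")
    return section.get_text(" ", strip=True) if section else ""

def build_system_prompt(statement: str) -> str:
    """Compose instructions that keep answers consistent with the statement."""
    return (
        "You are an assistant for this congregation. Answer questions in a way "
        "that is consistent with the following statement of faith, and refer "
        "doctrinal disagreements to a pastor:\n\n" + statement
    )

# Hypothetical usage:
# prompt = build_system_prompt(
#     fetch_statement_of_faith("https://example-church.org/what-we-believe")
# )
```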

That customization even extends outside the church, Billings confirmed when I asked about other religions, like Judaism or Islam. “While today you would see mostly Protestant organizations on our platform,” he told me, “we do not prohibit in any way a Muslim organization from coming in and using our technology.”

Later this spring, I also spoke with Brad Hill, Gloo’s chief solutions officer. That conversation, which turned from the technical details to Gloo’s philosophy of AI development and function, is below. It has been edited and condensed.

Editor’s note: Christianity Today has partnered with Gloo on advertising projects in the past, including the development of branded content about AI and the church. These projects have never involved CT’s editorial team and had no influence on this article.

I’d like to focus here on some higher-level questions about the purpose, goals, and visions that Gloo has for the Kingdom-Aligned Large Language Model (KALLM) and for your work with churches in particular. Let’s start with something your colleague, Steele Billings, said when we spoke about the technical side of this work: “Technology is neutral,” and it’s about what we choose to do with it, what kind of worldview we bring to that task, whether it’s a biblical worldview. Is that Gloo’s mindset? If yes, what does that look like? How do you think about it?

That’s a great starting point. Yes, that is what we would contend: The technology is neutral.

Put another way, I’ve heard it described as amoral. It’s neither good nor bad. It can certainly be applied for things that are quite bad, but also it can be applied for things that are incredibly good and incredibly redemptive.

Our contention is that in every other station of culture, wherever we look in the world, technology is being used. And our strong belief is that the church—and those in ministry and, as we would say, the faith and flourishing ecosystem—needs to be equipped with the very same tools for good. It’s important. It’s not a “nice to have” anymore.

One reason is that technology is more than just tools, bits and bytes. It’s shaping culture. Look at what happened with social media over the last decade plus. Look at the internet and how it’s affected every station of life. AI is now a very similar phenomenon that’s upon us. So while the technology itself might be classified as neutral, we would contend strongly that people who are in the business of flourishing and people who are trying to advance good need to be equipped with the very best tech so that they can apply it to that end.

If I can probe that a little bit, I think there are many uses of AI that are pretty uncontroversial: people who use it as a search engine, for example, or administrative functions like scheduling meetings. Where it tends to be controversial, as you no doubt know, is around creating material—for instance, writing sermons or study guides based on sermons. And so that affirmation that the technology is neutral, is that truly across-the-board for Gloo? Or are there areas that you would say are off-limits?

It’s an interesting question, and I think honestly there are different answers to that question depending on whether you have AI tools that are built with a values alignment or not.

We don’t say technology is neutral if we’re dealing with tools that just have a general-market use, which is mostly what the public is familiar with right now. Even your example that was somewhat innocuous—using ChatGPT as a search engine—I would contend is not neutral, because most folks may not fully understand where ChatGPT is getting its results, and those results are likely not aligned to the values of someone who follows Christianity or adheres to biblical truth. If you ask questions on matters of ethics and deeply felt needs, you may get answers that are antithetical to Scripture.

However, when you have tools that are built on more of a biblical foundation and aligned to human flourishing values, I would say my answer broadens regarding what’s within bounds. Because then you feel a lot more confident turning to these tools to answer questions around some of those weightier issues. That’s exactly what we see at Gloo.

Right now, there’s a gap in the market for generative AI tools that are trained on the values we hold. So for instance, I could ask ChatGPT to help me plan dinner. I think ChatGPT is a wonderful tool for that or for scheduling, as you said. But when we’re asking these weightier questions around faith, ethics, and morality, we want tools built on a model that’s been trained on proper values.

And that’s what Gloo is building now. We’re ultimately not looking to fully replace tools like ChatGPT or Claude or Gemini but to complement them and leverage the good that’s in all of them so that we have an incredible tool that can tell me what to make for dinner and can help me as a parent or in a relationship or when I’m exploring faith.

It’s funny you mention a market gap, because that’s my next question. You think about the classic inventor—say, Eli Whitney and the cotton gin. We can see the market gap there: We don’t want to process this cotton by hand anymore. Here, yes, there’s a market gap in the sense you’re describing: These other LLMs have different values, and maybe their makers present these tools as ethically neutral when really they’re not. So for Gloo, the market gap is for an LLM with biblically informed values.

But I want to ask about a market gap in a different sense, which is: What is the lack in the church that this tool is filling? What is it that 2,000 years of Christianity didn’t have that this AI is going to provide?

You mean, what gap is AI itself filling?

Yeah, what is the gap in the church?

Well, the premise of your question is that we have this timeless gospel message that is complete and sufficient, and we start there. However, what’s also true is that in the world around us, culture constantly moves.

Even in biblical times, we saw examples of technology. Jesus used the Roman roads. Later on, we had the printing press. When the printing press was first invented, there were plenty of religious leaders decrying that as unnecessary or even evil. So all throughout history, we have examples where initially people of faith are skeptical of a new technology. What happens every time, though, is that culture responds to that new technology and there are new standards or practices that emerge. As Christians, we’re then faced with the question of how we apply this timeless gospel to what’s changed.

Think of the average congregation: maybe 150 people, a pastor, maybe a youth pastor or secretary. They worship together every Sunday. When you look at that church, what part of that do you see and say, “We need to put AI in there?” Is it the service? Is it the pastoral care? It doesn’t sound like it’s just the scheduling.

Yeah, well, I mean, it doesn’t matter what size church you are. You could be giant, or you could be a hundred people. Like you said, the people in your church are experiencing the most dramatic shift we’ve ever seen in humanity. I mean, AI is affecting them as parents. It’s affecting them in relationships and as workers in jobs. So when our surveys have come back, what we hear is people in churches are interested in their pastors and their leaders helping them navigate this new post-AI world. That’s one thing.

And yes, there are tools that can help with the operational side of church too. AI can help us with content repurposing, language translation, and other administrative tasks.

But we also think, as we get deeper into this, that there’s a profound and lasting impact on every life and every church from AI, and the church has an opportunity now to think about how we understand what’s going on and how we equip our people to navigate whatever questions or whatever challenges are coming at them outside the church. The church can be a place where we can come to learn how to navigate a post-AI world.

I think that’s vital, and pastors should absolutely help us navigate the post-AI world. But what I would want to distinguish here is that there seems to me to be a big difference between one, that guidance, and two, using an AI chatbot to ask personal spiritual questions.

For instance, when I emailed with your colleague Chase Cappo, Gloo’s AI enterprise director, he mentioned—and I believe he was talking about KALLM, but correct me if I’m wrong—that there’d already been more than 10 million conversations with Gloo’s Christian AI app. He said that people were “getting more personal than with most pastors.” And I don’t know about you, but I’m a parent, and if I found out my kid was taking personal spiritual questions to a chatbot instead of me or our pastor or another trusted adult human, I would be highly alarmed.

I think there’s a very big difference between pastors helping Christians navigate a post-AI world and pastors providing Christians with a tool that has no soul so they can talk to it about the state of their spirits, the state of their relationships with God.

Well, first of all, as a parent, I cannot agree more with your sentiment.

And I would also just share we have a core principle at Gloo, and the way we phrase it is that relationships catalyze growth. Our belief is that God has wired each and every one of us to really grow best in the context of relationship, not in the context of interacting with an app or with content or what have you.

So when you hear us talk, for example, about values-aligned AI, one of the principles that really drives our work is thinking about how these tools support a relationship rather than replacing a relationship. And there is a danger of that with the tools we see today. You illustrated one example, where someone might turn to a chat tool in lieu of a person.

As we navigate this at Gloo, one of the principles we want to literally design into any tool that supports human flourishing is not just to provide a helpful, productive answer to someone’s question but to point them toward discussing it further, getting into a conversation, making a connection with someone. You’re not going to see most general-market tools behave that way for various reasons. We believe that every tool should support relationship.

Practically speaking, what would that look like? For instance, certain keywords would prompt the tool to say, “Hey, go talk to your pastor about this”?

Yeah. Well, in the example that Chase gave you, we have a set of tools called assistants that are placed in various spots. They might be on church websites, or they might be in apps—little chatbots and the like. They’re built with a feature called escalation. We’re learning patterns of when a conversation is appropriate to conduct via chatbot. Like, “What time are the services on Sunday?” “Do you have children’s programming?” Those are fairly administrative, factual questions.

But when the conversations get a little bit more sensitive or become inappropriate for the chatbot, the tool is going to say, “That’s a great question. Let me see if I can connect you to someone at the church who can explain more.” And that then escalates the conversation away from AI to humans. That’s an example of how we support a relationship. We’re not looking to keep the conversation in a chatbot forever. We want to eventually make a human connection.
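Editor’s illustration: in very simplified form, the escalation pattern Hill describes—routine questions answered in the chat, sensitive ones handed to a person—might be sketched like this. The keyword triggers and replies are hypothetical; a production system would presumably use a trained classifier rather than a word list.

```python
# Illustrative sketch only: answer routine questions in the chatbot and
# escalate sensitive ones to a human at the church. Triggers are hypothetical.
import re

SENSITIVE_MARKERS = {"grief", "divorce", "doubt", "depression", "suicide", "abuse"}

def is_sensitive(message: str) -> bool:
    """Crude stand-in for a real sensitivity classifier."""
    tokens = set(re.findall(r"[a-z']+", message.lower()))
    return bool(tokens & SENSITIVE_MARKERS)

def answer_administrative(message: str) -> str:
    """Placeholder for routine FAQ lookup (service times, programming, etc.)."""
    return "Our services are Sundays at 9 and 11 a.m."

def respond(message: str) -> str:
    if is_sensitive(message):
        # Escalate away from AI toward a human connection.
        return ("That's a great question. Let me see if I can connect you to "
                "someone at the church who can talk with you personally.")
    return answer_administrative(message)

print(respond("What time are the services on Sunday?"))  # routine: answered in chat
print(respond("I've been struggling with grief lately"))  # sensitive: escalated
```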

You mentioned regular ChatGPT could be damaging to our social, spiritual, or relational health because it does have values—and perhaps values we don’t even notice. But if there’s a new technology being used to some negative or at best neutral end, to my mind it doesn’t clearly follow that Christians should just put out the modified “Christian” version of it. There are, I think, some things where we just have to say, “No, that’s not something we’re going to try to redeem.”

Do you see a line like that coming up in the future of AI as it progresses? Or is this particular technology a case where we can do a modified Christian version? I’m not going to hold you to this 50 years down the line, of course, if AI takes an unanticipated swerve. But so far as we can see, is this something we can continue modifying to our purposes?

I’m tracking with your question. When we use AI as an umbrella term, it refers to such a vast set of technologies and tools and capabilities that we’re really just beginning to understand. So it’s kind of like asking, “Are there guardrails where books should not be used for certain things? Or where the internet should not be used?” I mean, it is quite broad.

In that sense, the answer has to be yes. Of course there will be certain applications and uses that, no matter how much you try to redeem them, will destroy human flourishing. They’re going to be antithetical to it. The list would probably be what you’d expect: nefarious or destructive uses around mental health, adult entertainment, and so on.

But I think that it would be hard to just outright discard AI or chatbots or any similar category, because each of them can be applied to save lives and find cures for cancer. They can also be applied to negatively affect mental health. The opportunity, I think, and the challenge for us is to distinguish between uses, like between what I cook for dinner and how I parent through a hard situation.

We don’t need a modified Christian version to teach me how to cook, but we do need it for matters of ethics, mental health, faith, and so on. That’s the nuance where we’re applying a values-aligned layer over top of the existing tools, not trying to create a replacement for what’s already out there.

Having been a new parent relatively recently, I’m very familiar with Googling, “How do I get my kid to do X, Y, or Z?” I’m sure you’re right that this is a common way people are using these tools. Is it inevitable, though? When people speak about technology—and not just about AI—they often speak in terms of inevitability: “It’s coming. You can’t stop it. You just have to make the best of it.”

And there is some truth to that. Certainly, at the individual level, I have no control. But I question it at the societal scale. I think you see a reconsideration of this kind of thinking with social media, as we’ve belatedly realized its downsides. We do have agency, actually. This stuff isn’t inevitable, and adapting and redeeming are not the only options.

I’m not a pastor, but if I were, I would not want my congregants to inevitably turn to computers for answers to weighty questions about parenting as a Christian. So are we resigned to a future in which people will take their problems to computers and all we can do is make sure they take them to computers with our values?

It’s so easy to sympathize with a pastor, a parent—anyone who wonders, “Hey, is this a foregone conclusion? Can we not stop the train?”

Any child growing up today is never going to live in a world that doesn’t have AI. And I think right now, today, in the first half of 2025, generally when you’re interacting with these conversational applications or these generative AI tools, you know what you’re doing—that is, I purposefully opened the ChatGPT app on my phone. I know that I’m using it.

But I think what we’re going to see as time goes on is that AI will become more and more embedded in more and more things that we do and use every day. It’s going to be in our cars. It’s going to be on our phones. It will be how we consume entertainment, education, what have you.

My belief—and, I think, the way Gloo would look at this—is that we see both an opportunity and a responsibility, not just for pastors but for all people of faith. It’s a renewed call for discernment, right?

I think that maybe an effort to stop young people from using AI tools could be workable for some period of time. But what’s probably more likely and admittedly more nuanced is that we need to teach them how to responsibly use those tools for good.

With my own children, we have conversations about what they’re learning, what answers they’re getting back from AI, and how those answers square with what they know to be true. Because we’ve talked about biblical truth. We’ve talked about principles outside what they’re encountering in those tools. There’s an opportunity for pastors, parents, and educators to equip and arm people to have a truth detector and to understand that when you get answers back from AI, you don’t have to accept those outright. You should question and be curious.

Hopefully that kind of discernment is something we’ve always had, but the stakes are higher now with these tools. They speak realistically, and they give us answers that sound right, but that information requires more inspection sometimes.

I share that hope as well—that we will be able to develop that kind of discernment. I do wonder, though, what it communicates if on the one hand we’re saying, “Hold these tools at an arm’s length. Test what you’re hearing against the truth,” but then on the other hand, when I come to my church’s website, it has an AI chatbot. That’s my church, so why would I doubt it? Isn’t that a confusing message? “Be circumspect about AI and chatbots,” but also “Here’s our chatbot. It’s good.”

That’s a great point. And I think you’re really getting to matters, ultimately, of trust. We should be able to trust our religious institutions or houses of faith, and they historically have been a place where trust is held in high esteem. That trust can also be quickly broken.

AI tools now give churches tremendous capability. A small church using AI can create and repurpose content in forms and languages that would’ve been out of reach only three or five years ago. A small church of 50 people could now have a Spanish-speaking ministry if they chose. That’s a wonderful thing.

But to your point, if that church is choosing to use tools that are exposed to the congregation, I would agree there’s a responsibility for the church to know that tool is in fact trustworthy and does adhere to the values that the church espouses. That does create a higher burden—a higher bar, if you will—for church leaders before they put a tool out there. It’s an invitation to test, to explore, to be curious, to try to break the tool and make sure it’s going to answer the way the pastor would answer. Otherwise, maybe it’s not ready to go out.
