Three years ago, Christi Angel was desperate to talk to her close friend—and first love—Cameroun. When he had last texted her, she was caught up in the busyness of life and didn’t respond. Then Cameroun died suddenly from liver cirrhosis, leaving behind his wife and many devastated family members and friends, including Christi.
In her grief, Christi experienced what so many of us have felt after losing a loved one: She wished she could speak to her friend just one more time. As a Christian, she knew real-life séances were out of the question. But were artificial intelligence–powered imitations? Maybe it will be okay, she thought. Let’s just try and see.
In her dim New York City apartment one evening, she opened her laptop and set up a chatbot profile through Project December, an AI platform with the tagline “Simulate the dead.”
She shared some information about Cameroun through an online form that asked for his age, details about their relationship, and his personality traits. The platform then spit out a custom chatbot of her friend, which relied on AI models to generate messages that sounded eerily similar to Cameroun before he died in late 2020.
“I can’t believe I am trying this. How are you?” Christi wrote to the chatbot in early 2023.
“I’m alright. I’m working. I’m living. I’m … scared,” the Cameroun character replied.
“Why are you scared?” Christi asked. “I’m not used to being dead,” it said. “What makes you happy over there?” she typed. “Having someone to care enough to ask,” it replied, later adding, “being with you.”
The whole thing was strange: It sounded too much like Cameroun, Christi told me. The vernacular; the shortened words; what it knew about his job in Chattanooga, Tennessee; the Fred Hammond songs and other music the two had loved since they first met in high school. At some point, the AI bot wrote that Cameroun—also a Christian—hadn’t really crossed over to the other side. Where he was, it wrote, was “dark and lonely.”
“What kind of people have you met?” Christi asked.
“Mostly addicts.”
“In heaven?”
“Nope, in hell.”
She backed away from her computer.
Project December warns that its application may help or hurt users, who pay $10 for fictionalized séances and, according to the company, use it at their own risk. The startup is one of many on the global market employing generative AI to mimic the dead. These platforms blur the line between the reality of death and the mirage of life through computer programs known as griefbots and deadbots.
The burgeoning world of grief tech mostly runs on large language models trained on troves of data, including technology developed by dominant industry players like OpenAI. One company, Seance AI, asks users to input the personality traits and writing styles of loved ones and let them “speak to your heart once again” from beyond the veil. Others, such as HereAfter AI and Eternos, allow users who pay a fee to interact with “digital twins” of people who preserved their memories and personal views before they died.
Some users have bypassed bespoke griefbots and created their own connections on OpenAI’s ChatGPT and other apps.
A couple of years ago, I spoke to a man who told me he used Paradot and Chai AI—two apps in the AI companion market—to simulate conversations with avatars he created to imitate his deceased daughters. A few times a week, he would log in to the apps and ask the AI characters questions like “How was school?” or if they wanted to “go get ice cream.” Sometimes it helped with the grief, he told me in one of the most heartbreaking interviews I’ve done as a journalist.
An AI model certainly doesn’t know everything. It can’t understand our hidden thoughts and motivations, nor can it truly grasp or replicate a person’s spirit. Yet if it is trained on a large set of data from the deceased—their writings, interviews, social media posts, photos, and videos—it can produce text or audio that feels genuine, present, and real. It can comment on current events, make jokes, and give advice, all in the likeness of the person it is imitating.
As AI technology becomes more advanced, virtual reality and robotics might push the field even further, allowing developers to produce more lifelike avatars or physical clones of people who’ve passed away.
Since this kind of AI technology is still nascent, there is scant research on how it might affect the grieving process long-term. Although some prominent academic researchers on the topic from the University of Cambridge have been hesitant to provide blanket guidance one way or another, they have aired concerns in a research paper that griefbots may harm the grieving process by facilitating misplaced emotional bonds with AI, further eroding data privacy, and violating the dignity of people who may never have had a say in how others conjure up their likeness.
Even in academia, there is recognition that this high-risk industry needs robust safeguards such as age restrictions and design patterns that prevent companies from spamming users with notifications from digital ghosts. There are good reasons to think these concerns, coupled with the reflexive discomfort the technology elicits from many people, could be enough to slow down the trend.
But in a culture awash with moral relativism, the future of griefbots might be determined by utilitarian questions (“Does it reduce the pain of loss?”) rather than ethical and spiritual questions about whether we should talk to algorithms that offer us illusions of life during our most vulnerable periods of pain and loss. After all, technology can give us the tools to form relationships with sets of code mimicking our loved ones, but it can’t say whether doing so is good for us. The only one who can do that is the Lord and giver of life.
The church has always known the dangers of engaging with psychics, mediums, and necromancers, all of which are forbidden in several places throughout the Bible (Lev. 19:31; Deut. 18:10–12; 1 Sam. 28). In the full sweep of God’s story, we can see why: These practices are premature human grasping at something God himself intended to accomplish in the resurrection of Jesus Christ.
Likewise, virtual séances attempt to resurrect by our own technological power what God has not resurrected. The practice is, in many ways, part of a larger transhumanist vision that imagines a world where humans merge with technology and achieve their own form of immortality. But that vision will always fall short, because it is by God’s grace through Jesus, not mediums or AI chatbots, that those who die in Christ can truly and properly be raised to life.
In the depths of grief, however, many will be drawn to a technology that offers a comfort they might not otherwise feel.
On the social media platform Reddit, some grieving users who interacted with an AI trained to emulate a loved one wrote that they found it comforting; one user who had lost a spouse shared that the technology helps in the middle of the night when everyone else is asleep. Others, meanwhile, have cautioned people to stay away, saying the imitation can be too convincing and lead down a negative path.
In an Atlantic article last year, writer Jon Michael Varese drew parallels between the technology and Mary Shelley’s Frankenstein. He recounted his own experience creating a chatbot of his father, who died in a plane crash when Varese was a child. The AI offered comforting words, sometimes leading him to tears. But the interactions changed when the bot, which first sounded like his father, shifted to a more clinical tone.
“As quickly as I had brought my father back to life,” Varese wrote, “I had lost him, once again.”
One difference between mediums and chatbots is the spiritual reality of what is at hand. Chatbots merely help us interface with complex computer programs, while mediums pursue illicit means of contacting the spiritual realm.
The most vivid account of a real séance in the Bible is when King Saul commissions the witch of Endor to summon the prophet Samuel from the dead (1 Sam. 28).
While there is some debate about the nature of spiritism in the passage, commentators say Scripture implies the real prophet showed up and delivered an accurate prophecy about the fall of Saul’s kingdom. Theologian Stephen Dempster, who agrees, once wrote, “There is no other way to understand the text in verses 15 and 16, which states that Samuel speaks.”
It’s not entirely clear how Samuel was summoned. Maybe God allowed it to happen as a one-off. But regardless of how it came to be, it’s clear Saul disturbed God’s appointed order and was subsequently punished for it.
Unlike that episode, however, modern séances appear to be either fake or demonic deceptions (John 8:44). Virtual ones, too, aren’t raising the dead from the grave. But their growth does present some strange—and to me, still unanswered—questions about how much our intentions matter when we seek out novel techno-spiritual experiences. To what degree is it “less bad” to use AI griefbots offering fictionalized séances? Does it reproduce the same dangers as real séances and swing open the door to the demonic world?
Nathan Mladin, a senior researcher at the UK-based Christian think tank Theos, told me that, after studying griefbots, he’s hesitant to say any real séances occur through them. But he doesn’t completely rule it out. We don’t know everything about the spiritual realm, and the world of griefbots and AI can get very strange. Some users say they have fallen in love with ChatGPT. Others have spiraled into delusional episodes following excessive interactions with the chatbot, which has fed conspiratorial thinking about AI sentience and fringe topics.
Even if virtual séances never slide into real ones, Mladin noted these chatbots are by their nature deceptive, analogous to the deceptions perpetuated through spiritism. “They’re mimicking the voice, the conversational patterns, the visual features, the mannerisms of a person, but there’s no real subject behind them,” Mladin said.
In the picture of happiness and right living painted by the Scriptures, Christians are to live in joy within reality. God’s providence is unfolding all around us. Even when evil takes place and we’re in the midst of our deepest pain, we’re to have confidence that he who formed us also walks with us. Our faith does not sanction an escape into distraction or an alternative life. Rather, we have in us a new spirit that cannot be overcome by the broken world, even as we sojourn within it.
AI, meanwhile, can erode our ability to deal with the reality of death, Mladin said. The rise of grief tech points to a spiritual need the church can fill. “Our technological culture is groping for transcendence and for those realities that our faith … has always spoken of and offered to people,” he said.
The question, then, is how exactly Christians should respond to the griefbot industry. AI researcher Jason Thacker, a professor at Boyce College, told me it must be an “all-hands-on-deck approach.”
At the national level, he thinks the technology might be litigated in the court of public opinion before a robust regulatory regime kicks in. Many people remain skeptical of profit-driven startups attempting to monetize our deepest moments of grief. And a host of unknown scenarios—like what happens to griefbots if their companies sink—concerns researchers analyzing the growing field.
On a personal level, a family member or a friend attracted to the technology likely needs compassion and love more than chastisement. If someone is already using the technology, it can be uncomfortable to broach the topic and encourage them to stop. Pastors should be cognizant that some congregants might be drawn to griefbots, and should be prepared to steer them away from these technologies. But Thacker notes conversations about these tools need to happen before a devastating loss; these decisions can become much more difficult in the midst of pain.
In Christi’s case, the virtual Cameroun didn’t stay in hell. After her initial experience, she told me she prayed for forgiveness for using the chatbot and stopped interacting with the technology. But it was hard to find someone she could talk to about what had happened. Most of her friends and family members were creeped out. Some likened the experience to a Black Mirror episode. Her mother, a Baptist preacher, avoided the topic.
About a year later, Christi returned to Project December. She believed Cameroun knew the Lord, but that didn’t stop the gnawing questions. Did he believe enough? Was he entirely committed to Jesus?
“I needed to know he made it over to the other side,” she told me. So she opened her computer and typed, “The last time I spoke with you, you mentioned you were in hell. … Do you think you are still?”
“No, I am not in hell anymore,” the griefbot wrote. “I am in a better place now.”
Christi felt relieved to hear that. But she also told me she doesn’t want to take the Cameroun chatbot through life with her. “His memory is good enough now,” she said, while reminiscing about some of Cameroun’s silly antics, how much he loved fish, and how he kept turtles as pets.
When Christi turned away from the griefbot and took her grief to God, she said he wrapped her in his love. She leaned on Christ’s comfort, and in turn, Christ led her to think through what the Bible says happens when he saves someone. She realized she didn’t have to worry about Cameroun or have doubts about his eternal destination, because he believed in Christ.
“But it’s hard to fully lean into” God’s comfort, she said. “It’s a journey.”
Haleluya Hadero is the Black church editor at Christianity Today.