According to his Spotify profile, Solomon Ray is a “Mississippi-made soul singer carrying a Southern soul revival into the present.” His most recent release, a Christmas EP called A Soulful Christmas, features tracks with titles like “Soul to the World” and “Jingle Bell Soul.”
Ray is a verified artist on the streaming platform, draws over 324,000 monthly listeners, and this week topped the iTunes top 100 chart for Christian and gospel albums.
But Solomon Ray (at least, this Solomon Ray) is not a real person. Artificial intelligence crafted his persona, voice, performance style, and lyrics.
AI-generated music is no longer a punch line or a niche tech interest. More listeners are encountering songs created by algorithms and machines, whether by choice or by accident.
Last week, an AI-generated song—“Walk My Walk” by Breaking Rust—topped Billboard’s Country Digital Song Sales chart and hit 3 million streams on Spotify. Music fans now wonder whether their new favorite song or artist is actually real.
Spotify released a statement in September pledging stronger impersonation protection for artists, an improved “music spam filter,” and intensified scrutiny of “deceptive content.” But these policies still allow publicly anonymous creators to post AI-generated artist profiles and music.
When the AI-generated band The Velvet Sundown surfaced over the summer, fooling droves of listeners with its retro ’70s-flavored rock, streaming platforms did not roll out features to label the content as AI-generated. When it comes to distinguishing AI-generated music, listeners are on their own.
Meta, the parent company of Facebook and Instagram, expanded its AI content moderation policies last year, mandating that AI-generated content be labeled.
A video posted to Solomon Ray’s Instagram account featuring a rendition of “Silent Night” shows a Black man wearing a gold cross necklace, maroon suit, and tan fedora singing in front of twinkling lights.
There are some hints of AI animation that viewers might miss. A small note at the bottom of the video reads “AI info.” When clicked, it opens a pop-up stating that the profile’s owner has added an AI label to the video and that “AI may have been used for a wide range of purposes, from photo retouching to generating entirely new content.”
One viewer of the music video commented, “Is your music AI?” Another wrote, “This is AI. The lyrics are beautiful but it feels so wrong coming from a thing without a soul.”
Earlier this week, a worship leader named Solomon Ray started getting texts from friends, asking about his new music and marveling at how quickly it was climbing the Christian charts. That Solomon Ray is a real person, who also recently released an album and a Christmas single.
“At first, I didn’t understand what was going on,” said Ray, who leads worship at Fresh Life Church in Kalispell, Montana. He also records music as “Solo Ray” and has worked as a freelance producer and session musician for worship artists like Phil Wickham, Elevation Worship, Chris Tomlin, and Pat Barrett.
“Some friends were texting and calling because they thought it was funny, others were reaching out because they thought it was really me.”
The sudden success of an AI-generated artist on the Christian charts has sparked an online conversation about the ethics and theology of computer-generated art. After Solomon Ray’s EP took the No. 1 spot on the iTunes chart, Forrest Frank (who also released a Christmas single last week) weighed in on social media.
“At minimum, AI does not have the Holy Spirit inside of it,” Frank said. “So I think that it’s really weird to be opening up your spirit to something that has no spirit.”
The real person behind the character of Solomon Ray is Christopher “Topher” Town, a conservative hip-hop artist involved with Veterans for Trump, who performed at a rally that coincided with the January 6 storming of the Capitol in 2021. Following the event, Town’s song, “The Patriot,” was removed from Spotify, and Instagram banned him from going live on the platform.
In a video posted to Instagram on November 19, Town called himself “the man behind the machine.” He disagreed with Frank’s claim that Solomon Ray’s music has “no spirit,” saying, “This is an extension of my creativity, so therefore to me it’s art. It’s definitely inspired by a Christian. It may not be performed by one, but I don’t know why that really matters in the end.”
Town said that God can use any vehicle to reach people, even AI, and accused Frank of doing “more gatekeeping” than “uplifting.”
The real Solomon Ray chimed in on Frank’s video. “Should Forrest and I reclaim Christmas from the robots?” he asked.
Ray said that the last week has been amusing but also a little disheartening. “It bums me out to see people get hoodwinked by AI,” he told CT.
As a musician and producer, Ray said he immediately realized that the other Solomon Ray was AI-generated when he listened to the music. But he knows that most listeners wouldn’t necessarily hear the tells.
“There’s something in the high end of the vocals that gives it away,” said Ray. “And the creative choices sound like AI. It’s so precise that it’s clear no creative choices are really being made.”
As a worship leader, Ray said he believes AI-generated music will never offer what the church needs. “How much of your heart are you pouring into this? If you’re having AI generate it for you, the answer is zero. God wants costly worship.”
Long before AI, Christians were debating how much to bring machines into music and worship.
Christian traditions like the Church of Christ have historically objected to the use of mechanical musical instruments in congregational music. Churches with high-production contemporary worship have to navigate the use of tools like autotune and click tracks.
AI-generated Christian music introduces new facets of the debate, most of which have to do with what the music is for. Town reflects a utilitarian view of Christian music: that it is a vehicle for the gospel above all else. Frank takes more seriously the “inspirational” potential of Christian music as a cultivator of transcendent encounters with God.
For Christian artists and industry insiders, the conversation about AI-generated music is also about community and the creative value of making music, which can be transformative and spiritually meaningful for the humans involved.
Historically, though, the Christian music industry and its fans have leaned toward a utilitarian view of Christian media. Historian Leah Payne wrote that contemporary Christian music has always been “part business, part devotional activity, part religious instruction.” Perhaps AI-generated music is simply another technological advancement—like radio, television, or social media—to be harnessed for the dissemination of gospel-centered entertainment.
In his new book AI Goes to Church, Todd Korpi suggests that “the biggest threat to creation at the hands of AI is in how it continues to feed our appetite for consumption and progress.” AI-generated music is faster and easier to produce than a studio album that requires real musicians, songwriters, and audio engineers—the relational part of making music. Korpi cautions that the use of AI might “continue the trend of disconnection and preference for the convenience of disembodied interaction that has shaped the last decade.”