Roblox is getting sued—so far by three state attorneys general and in at least 42 cases brought by concerned parents.
But the platform’s popularity continues to explode. In 2024, some 82.9 million users played every day. The most recent report from Roblox Corporation ratchets that number up to 151.5 million. Nearly 50 million of those users are signed up as under 13 years old. Some are as young as five.
Unlike Rocket League or Super Smash Bros., Roblox isn’t just one universe. It’s a platform of over 7 million “experiences,” all created by users. Players move through various worlds as customizable avatars, sometimes accomplishing tasks or competing in games and encountering other users along the way.
Christians have long been concerned about the risks of popular entertainment, from rock music to wizard books. In 1990, Focus on the Family started the publication that would become Plugged In, which today reviews entertainment from YouTube videos to feature films to flag bad language, violence, and sex.
Roblox is different. Not only does it raise content concerns—it also poses real-life safety risks, as alleged by those dozens of lawsuits. In recent years, bad actors have gained minors’ trust by impersonating other children, convincing their victims to adjust parental controls or move their conversation onto another app like Snapchat or Discord, where they may solicit explicit photos or engage in other criminal behavior.
Predators have exploited the narrative of one Roblox game in particular, Forsaken, to convince children and teenagers they will receive a second life in return for obeying commands like carving a symbol into their skin or undressing in front of a web camera. Over 2 million users have designated Forsaken as a “Favorite,” and the game is often listed in “Top Trending” on the Roblox charts.
Some digital-safety advocates say Roblox is a fundamentally dangerous place for young people.
“There is nothing darkening childhood like internet-connected devices,” said Chris McKenna, father of four and founder of the group Protect Young Eyes. “The church should be leading this conversation. Christian parents should be the most aggressive when it comes to choices of where their kids go online.”
Protect Young Eyes advocates for phone-free schools, puts out how-to guides for setting up parental controls on different devices, and reviews apps popular among children. It rates Roblox as high risk across the board, including on factors like predator risk and nudity risk. McKenna says the platform is unique in that it allows users to share their own creations. Many are innocuous, featuring dragon training, superheroes, track and field, or even Bingo. But others expose minors to explicit material.
Roblox “prohibit[s] content that depicts sexual activity or seeks real-world romantic relationships” and uses mature-content labels for violence or adult humor. (A nine-year-old can access experiences rated “minimal” and “mild” but not “moderate” or “restricted.”)
Every day, Roblox removes 130 million pieces of content, including chat messages and usernames, for violating its policies. “We work to help over 150 million users have a safe, positive, age-appropriate experience on Roblox,” a representative from Roblox told CT. “Our layered safety systems combine advanced AI, 24/7 human moderation, and collaboration with law enforcement and safety experts to help detect and prevent harm. While no system is ever perfect, we constantly innovate and invest to set the standard for online safety.”
Still, there have been instances of children encountering sexual content.
“If the definition of sin is missing the mark with our choices, our thoughts, and our behaviors, are we putting our children in the Devil’s cross hairs?” McKenna asked. He points to Roblox’s business model—adding users through frictionless onboarding in order to increase in-game purchases like clothing for avatars or special abilities in games—as the reason stricter safety measures haven’t been adopted.
For instance, as it stands, kids who create a Roblox account are asked to enter their age but can type any number without follow-up confirmation. Without a verification process, a child can start an account as an adult; an adult can start an account as a child. (Roblox plans to tighten this protocol in 2026.)
Attorney Melinda Maxson—who represents a Nebraska John Doe in a case against Roblox—has been concerned about the platform’s safety issues for decades. Maxson has represented clients in Roblox-related cases for 20 years, from those who used the platform in its earliest days (it launched in 2004) to those who allege they were recently groomed. All this time, she said, the platform has claimed to be safe for children. But that hasn’t been the case. “There’s a chat function within their experiences that allows perfect strangers to reach out and communicate to other users on the platform,” Maxson said.
Roblox does tailor chat functionality based on a user’s age. The restrictions are complex. Users under 13 cannot access private text and voice chat with other users. However, the “experience” chat feature allows users in the same experience to communicate with other similar-aged users in a public chat, which others can see. In several cases, an adult allegedly used the public chat to convince a child to move their conversation to Discord.
Users 13 or older can also type more words and phrases in their chats than those under 13 can. These younger users’ chats are also filtered to prevent personal information from being shared. (All chats, regardless of age group, are filtered for sexual language, harassment, discrimination, threats, and incitements to violence and monitored for attempts to move conversations to another platform. Sending images isn’t allowed.)
Roblox recently announced a soon-to-be-released safety feature that will require age verification—either with AI-enabled facial recognition or ID uploads—for any users who wish to exchange messages. Approved users will be sorted into six brackets and allowed to chat only with users in their age categories and those adjacent. (It seems that a 10-year-old will be able to talk to a 14-year-old but not a 16-year-old.) Intended to limit interactions between minors and adults, the protocol will take widespread effect in early 2026. “We are sharing what we believe will become the gold standard for communication safety,” said the company’s press release.
But “the proof is going to be in the pudding,” attorney Maxson said. “Until we see how it operates and how effective it is, I remain very skeptical because of course they’ve had 20 years to put these types of safety mechanisms in place.”
Bennett Sippel, a researcher at the Tech and Society Lab at NYU Stern and a writer for the Substack After Babel, said the verification feature makes progress on one very important issue by preventing predators from accessing children through chat messages.
But once the new age-verification update is in place, it will still be possible (though more difficult) for a teenager to access unfiltered chat with adults. And according to the press release, users of different ages will be able to communicate if they name each other as a trusted connection. Once both parties agree to the pairing using contact info or a QR code, they can access voice chat and chat without filters, “allowing for more natural and direct communication.”
Sippel noted these safety changes are coming after lawsuits as “damage control” rather than as a proactive approach from Roblox. He still has concerns about the platform’s ability to moderate explicit content—in 2022, more than 15,000 experiences were uploaded per day. Roblox may also be a gateway to gambling addiction, both through gameplay itself and through casino-style games. (Though gambling isn’t allowed on the platform, it does feature “loot boxes,” random generators that operate like slot machines.) And contact with strangers is risky even if it isn’t explicitly predatory.
“A healthy childhood involves a deep participation in reality—a deep independence and freedom in the real world—and we’ve inverted that,” Sippel said. “We’ve given them free rein over this digital universe and then we’ve overprotected them in the real world, where those real experiences are really what crafts a healthy human being.”
Despite the safety risks, some Christians see digital spaces like Roblox—if precautions are taken—as missional opportunities to express their values or even evangelize. One user created a game in which users role-play as King David or Abraham. Another teaches “The Jesus Story.” The Christian video game company Soma Games recently developed The Wingfeather Saga, a Roblox adaptation of Christian author Andrew Peterson’s popular book series and animated show.
Chris Skaggs founded Soma as well as the Imladris community, an ecosystem of Christians in the game-development industry. Skaggs sees Roblox as a neutral platform, comparable to a toy store—there are items you don’t want your kid to buy, but the store itself isn’t the problem. Games, he believes, can prepare players, some of whom might never set foot in a church, for the gospel. For him, the 150 million daily Roblox users aren’t a problem; they’re an opportunity for witness.
“It’s not just that good and evil exist but that good is God,” Skaggs said. “There is a deep reality [in The Wingfeather Saga]: … God to Satan to angels and devils, how humanity is fallen … that grace and sacrifice can bring back restoration and renewal. So it’s not just good and evil. It’s deeper than that. You explain where good and evil come from.”
Sometimes, that explanation comes from the same young people adults worry about on the platform. The Robloxian Christians group was founded by Daniel Herron when he was only 11 years old. At one time boasting 54,000 members, the community described itself as a “global youth-led church committed to sharing the love and word of God with all young people online,” offering experiences like God is Love, Home Church, The Nativity, and [TRC] Worship Theatre.
“It’s basically a virtual island hovering in a cosmic stratosphere,” wrote Herron in a description of God is Love, “with a few flowers blooming and a cross draped in a burgundy stole and illuminated from above by glorious rays. Nearby is a stone table with bread and wine. Visitors can hear a piano quietly playing as they pray and discuss their faith with others.”
Herron told CT over email that the church closed in 2023 “after twelve years of ministry on Roblox and as one of the world’s first youth-led online churches.” Herron is now a board member for the Covenant Network of Presbyterians, and he believes Roblox is “a generally untapped platform” for would-be evangelists.
Chris Skaggs, the Christian game developer, does think Roblox needs to enact more safety features. He suggests a companion app that would give parents the ability to oversee their children’s gameplay and to restrict access to certain games.
Currently, if parents create linked accounts, they are able to set parameters on what their children can view on Roblox. Through parental controls, parents can block specific games or users, set screen-time and spending limits, manage their children’s access, and see their children’s top 20 experiences from the last week. Skaggs’s suggestion would step up these monitoring systems, allowing parents to watch recordings of their children’s gameplay, including their chats with other users.
Chris McKenna from Protect Young Eyes thinks kids are better off without Roblox but said, “If you’re going to say yes, at least take some steps to mitigate that risk.” He said children should never play with the chat on, with headphones, or alone. Gameplay should be restricted to certain devices, at certain times of the day, and with certain friends in a closed network. McKenna suggested following what he calls the “seven-day rule”: Before allowing a child to use any platform, experience it for yourself for at least a week.
Bennett Sippel from After Babel suggests Minecraft and Fortnite, which have stronger guardrails, as better (though not perfect) alternatives to Roblox, and he even more strongly recommends Nintendo games like the Mario series and The Legend of Zelda. He says parents should appeal to collective action in order to move kids off harmful platforms. If several families in a community aren’t playing a certain game, then children will be less worried about missing out.
“No matter how much we try to work as parents to raise [our kids] in virtue and protect their ethics as they grow up, we’re competing with this virtual world that’s really doing quite the opposite,” Sippel said. “Not to get over-hysteric about it, but we pay with our souls. We really don’t want to be paying with our children’s souls.”
Isaac Wood is a journalist who produces narrative podcasts in East Tennessee. His work has appeared in The Dispatch, Civil Eats, Ministry Watch, and 100 Days in Appalachia, among other outlets. He was a member of the inaugural class of the CT Young Storytellers Fellowship.