Discord is the latest company looking to bolster its child safety measures (again). Starting in March, all users will get a “teen-appropriate experience” by default, and unlocking adult content and age-gated spaces will require a (usually one-time) verification process. However, following blowback, the company appears to have pivoted: it now says its age prediction AI will confirm the ages of “the majority of adult users” without requiring a manual age check.
The platform’s big safety update covers communication settings, restricted access to age-gated spaces and content filtering. For users who aren’t verified as adults, sensitive content will be blurred, and age-restricted channels, servers and app commands will be blocked. DMs and friend requests from unknown users will be routed to a separate inbox.
If you’re an adult who didn’t pass the automatic age check, you’ll have two verification methods to choose from at launch to remove these restrictions: take a video selfie for age estimation or submit a government ID to Discord’s vendor partners. (Let’s just hope the age estimation works better than Roblox’s.) The company stresses that video selfies submitted for age estimation never leave your device, and it claims ID documents sent to its vendor partners are deleted quickly, “in most cases, immediately after age confirmation.”
Although Discord says the process will be one-and-done for most people who need manual verification, some may have to verify more than once. The company’s original announcement said additional verification options would be introduced, including an age inference model that runs in the background. Following backlash, however, it now says that inference model will be the norm for most adults.
“For most adults, age verification won’t be required, as Discord’s age inference model uses account information such as account tenure, device and activity data and aggregated, high-level patterns across Discord communities,” the company wrote in a statement sent to Engadget on Tuesday. “Discord does not use private messages or any message content in this process.”
This isn’t the company’s first attempt at beefing up its child safety measures. In 2023, it banned teen dating channels and AI-generated CSAM; later that year, it added content filters and automated warnings. Those changes followed an NBC News report that 35 adults had been prosecuted on charges of “kidnapping, grooming or sexual assault” in cases involving communication over Discord.
Alongside today’s changes, Discord is recruiting for a new Teen Council. The group will include 10 to 12 teens aged 13 to 17. The company says this “will help ensure Discord understands — not assumes — what teens need, how they build meaningful connections, and what makes them feel safe and supported online.” This sounds like the corporate equivalent of the parenting advice: “Don’t just talk to your children; listen to them, too.”
The child safety changes will begin rolling out globally in early March. Both new and existing users may be required to verify their age to access adult content.
Update, February 10, 2026, 5:57 PM ET: This story has been updated with Discord’s clarification and a statement claiming that most adults won’t need to complete a manual age check.