Discord's big update lands in March 2026: every account defaults to teen mode, which blocks age-restricted servers and channels and limits features like speaking on Stages or viewing sensitive media unblurred. Friend requests from strangers trigger warnings, and DMs from unknown users land in a separate inbox. To unlock full access, you verify as 18+ either with a video selfie (processed by AI on your device) or by uploading an ID to a verification partner, which deletes it shortly after the check. The key detail: Discord's "age inference" AI clears most users automatically, using signals like account age, device type, activity hours, and game metadata, without reading personal messages.
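Discord hasn't published how its age-inference model actually works, but the idea of scoring an account from behavioral signals instead of reading messages can be sketched like this. Everything below is hypothetical: the signal names, weights, and threshold are invented for illustration, and a real system would use a trained model rather than hand-picked weights.

```python
from dataclasses import dataclass

@dataclass
class AccountSignals:
    """Hypothetical inputs; Discord's real feature set is not public."""
    account_age_years: float      # long-lived accounts weakly suggest an adult
    active_after_midnight: bool   # late-night activity pattern
    desktop_client: bool          # device/client type
    adult_rated_games: int        # mature-rated titles in game activity metadata

def adult_confidence(s: AccountSignals) -> float:
    """Toy weighted score in [0, 1]; weights are made up for this sketch."""
    score = 0.0
    score += min(s.account_age_years, 10) / 10 * 0.4
    score += 0.2 if s.active_after_midnight else 0.0
    score += 0.1 if s.desktop_client else 0.0
    score += min(s.adult_rated_games, 5) / 5 * 0.3
    return round(score, 2)

def needs_verification(s: AccountSignals, threshold: float = 0.6) -> bool:
    """Only low-confidence accounts would get the selfie/ID prompt."""
    return adult_confidence(s) < threshold

veteran = AccountSignals(8.0, True, True, 4)
new_user = AccountSignals(0.5, False, False, 0)
print(needs_verification(veteran))   # False: passes silently
print(needs_verification(new_user))  # True: prompted to verify
```

The point of the design is visible even in the toy version: most established accounts clear the threshold and never see a prompt, which matches Discord's claim that the vast majority of adults pass without doing anything.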
Discord says the vast majority of adults pass automatically, so the experience stays seamless; you're only prompted to verify if the AI is unsure. Verification status is private, and other users can't see it. Still, the first announcement drew heavy backlash, with users threatening to uninstall over fears of mandatory face scans or ID sharing, especially after last year's vendor breach leaked some users' data. Discord issued clarifications to calm nerves, but critics still call the AI profiling sneaky surveillance. Does it make sense? It's a response to a global wave of child-safety laws, similar to moves by other platforms, and it aims to protect teens without gutting the adult experience.
The rollout is phased, building on earlier tests in the UK and Australia, and Discord is also adding a Teen Council for input. Balanced take: handy for parents, annoying for privacy fans, but probably not the end of Discord for most.
