Starting with pilots in December and expanding worldwide in January 2026, Roblox will require users to pass a facial age-estimation or ID-based check before unlocking private chat and higher-risk communication features. Users are sorted into six age bands—from under 9 to 21+—and chat permissions are limited to tightly defined cohorts, according to a report by AP News.
Roblox, which reports more than 150 million daily active users, is moving under sustained pressure from regulators and child-safety advocates, including lawsuits in the U.S. and negotiations with Australia’s eSafety Commissioner, as reported by The Times of India. Persona, the verification vendor, handles video-selfie analysis and ID checks and says biometric data is deleted after processing—mirroring privacy positioning from age-assurance players like Yoti, which powers similar checks for Instagram.
The move comes as lawmakers from New York to Malaysia push platforms like TikTok, Instagram and others to implement more robust age verification under new “kids online safety” and addictive-feeds rules, according to a report by AP News. The online age-verification market itself is projected to grow from roughly $2.5 billion in 2024 to $7.1 billion by 2033, a 15.7% CAGR, according to Verified Market Reports.
Roblox’s decision puts it slightly ahead of some peers on explicit, mandatory checks—but it also exposes a fault line regulators are already probing: accuracy gaps and bias in facial age estimation, and whether “age estimation” is enough where laws may demand hard age verification, as The Guardian points out.