Face scans, ID uploads: What Discord’s new age-verification rules mean for users

Social platform Discord on Monday, February 9, announced that it will make age-verification mandatory for all users globally starting next month.
Users will be required to verify that they are over 18 years of age by completing a face scan or uploading an ID, said the chat app, which is aimed at gamers and streamers. Founded in 2015, Discord offers voice, video, and text chat capabilities. It has more than 200 million monthly active users, according to a December statement on its website.
Discord’s new age-verification requirements mirror a broader shift across social media and AI chatbot platforms, many of which are tightening policies amid mounting lawsuits and growing regulatory scrutiny over online child safety. The clampdown also signals a move toward an age-gated future for the internet, even as many jurisdictions, including some state governments in India, contemplate a total ban on social media access for teen users. Here’s all you need to know about Discord’s new age-verification mandate.
How will age-checks work on Discord?
Discord has said that it uses account information, device and activity data, and ‘high-level patterns’ across the platform’s communities to estimate a user’s age.
However, in the case that Discord’s age inference model is not able to accurately estimate a user’s age, they will have to verify that they are an adult by either completing a facial age estimation process or submitting an ID to Discord’s vendor partners. The platform has said that it plans to add more age-verification options in the future.
The facial age estimation process requires users to submit a video selfie. Discord has said that the clips will never leave the user’s device. Additionally, the company said that IDs submitted to its vendor partners will be deleted immediately after the process is completed. Users may also be asked to complete multiple methods of age verification if additional information is needed to assign them to an age group.
The accounts of users who are not verified as adults, or who are determined to be under 18, will be moved into a ‘teen-appropriate’ experience with certain limitations, such as being blocked from age-restricted servers.
How will a teen account differ from a standard Discord account?
Discord has said that from early March, all users will be put into a ‘teen-appropriate experience’ by default. Users will need to confirm they are above 18 in order to exit the teen mode.
Only adult users on Discord will be able to change certain settings and access age-restricted content, channels, servers, and app commands. Users may be asked to confirm that they are adults if they want to unblur sensitive content or turn the blurring setting off, and only adults will be able to speak onstage in servers.
Additionally, messages sent to an under-18 user from an unknown account may be routed to a separate inbox. These accounts will also be shown a warning prompt if they receive a friend request from users they may not know.
Who will the age-verification requirement apply to?
“For most adults, age verification won’t be required, as Discord’s age inference model uses account information such as account tenure, device and activity data, and aggregated, high-level patterns across Discord communities,” the company was quoted as saying by TechCrunch. “Discord does not use private messages or any message content in this process,” it added.
Following media reports on Monday, Discord issued a further statement clarifying that not everyone will be required to complete a face scan or upload an ID to use the platform.
“The vast majority of people can continue using Discord exactly as they do today, without ever being asked to confirm their age. You need to be an adult to access age-restricted experiences such as age-restricted servers and channels or to modify certain safety settings,” the company said.
Why is Discord rolling out age-verification for users?
Age checks are emerging as social platforms’ standard response to pressure from around the globe. While age-verification measures have so far been rolled out in select countries, major platforms such as Discord are now implementing them globally. The company introduced age checks for users in the UK and Australia last year.
“Rolling out teen-by-default settings globally builds on Discord’s existing safety architecture, giving teens strong protections while allowing verified adults flexibility,” said Savannah Badalich, head of product policy at Discord, in a press release. “We design our products with teen safety principles at the core and will continue working with safety experts, policymakers, and Discord users to support meaningful, long term wellbeing for teens on the platform,” she added.
What are the concerns with age-verification?
In response to the platform’s age-verification requirement, many users expressed frustration and said they were going to leave the platform. Some of them claimed they would cancel their Nitro subscriptions.
Several Discord users also raised privacy concerns. In October last year, Discord confirmed that sensitive data belonging to around 70,000 users, including government ID photos, may have been exposed to hackers after a third-party vendor that Discord uses for age-related appeals reportedly suffered a data breach. While Discord said that it has since stopped working with the affected vendor, privacy activists said that the incident re-emphasises their concerns about user data protection.
“EFF is against age verification mandates, including Discord’s new policy requiring face scans or IDs for full access to the platform,” the global digital rights organisation wrote in a post on X. In a separate blog post, EFF further argued that age-verification requirements could lead to censorship and enable surveillance of users while “ruining” their online anonymity.
“Blocking entire communities or resources because of their subject matter does not make the internet safer; rather, it silences the people who rely on those online spaces for life-saving support, education, or safety,” it added.
Additionally, EFF highlighted that age-verification tech is far from foolproof. “In general, most platforms offer age estimation options like face scans as a first line of age assurance. These vary in intrusiveness, but their main problem is inaccuracy, particularly for marginalised users,” it said.
What have other major platforms done?
Amid growing international efforts to strengthen child safety, major online platforms such as Instagram, YouTube, OpenAI, and Anthropic have announced similar moves. These platforms also use AI-powered tools to estimate the age of users on their services.
Globally, Instagram set the tone in 2022 by requiring underage users who changed their age to over 18 to submit a video selfie, and in 2024, the Meta-owned platform ramped up efforts to identify and place teens into more private accounts.
Earlier this year, Roblox introduced mandatory facial verification of all users for access to chats on its platform. The gaming-cum-social networking app recently revealed that over 45 per cent of its 144 million global daily active users have completed an age-check either through facial age estimation or ID verification.
In July 2025, YouTube launched its age-estimation technology in the US to identify teen users and provide them with a more age-appropriate experience. Last month, OpenAI announced that it had rolled out an age prediction model to identify accounts belonging to users under 18 years of age. The company said the AI model is able to estimate a user’s age through a combination of account-level and behavioural signals.