Two months after Australia enacted its Social Media Minimum Age (SMMA) law, Snapchat reports it has disabled or locked more than 415,000 Australian accounts whose holders either declared an age under 16 or were flagged as likely underage by the company's age detection technology. Snapchat says it continues to lock additional accounts daily as part of its ongoing compliance efforts.
Snapchat acknowledges the law's limitations, particularly around accurate age verification. According to the company, “the Australian government’s own trial, published in 2025, found that available age estimation technology was only accurate to within 2-3 years on average.” This margin of error creates a risk that some users under 16 may bypass the protections while some older teens lose access by mistake.
Another concern raised by Snapchat is that the law does not cover all apps and services young people use. “Young people won’t stop communicating when they lose access to regulated services. Over 75% of time spent on Snapchat in Australia is messaging with close friends and family. We’re concerned that when young people are cut off from these communication tools, some may turn to alternative messaging services that are not being regulated — services that may be less well-known and offer fewer safety protections than Snapchat provides. While we don’t yet have data to quantify this shift, it’s a risk that deserves serious consideration as policymakers evaluate whether the law is achieving its intended outcomes,” the company stated.
Snapchat has proposed app store-level age verification as a possible solution. The company believes this approach would provide “more consistent age signals for each device,” reducing errors in enforcement and extending protection across all digital services instead of select platforms only. “By creating a more universal foundation for age assurance, app store-level verification would help ensure that young people encounter appropriate protections no matter where they go online,” Snapchat said.
The company reiterated its position on the SMMA: “We want to be clear: we still don’t believe an outright ban for those under 16 is the right approach. We understand the Australian government’s objectives and share the goal of protecting young people online. But in the case of Snapchat — which is primarily a messaging app used by young people to stay connected with close friends and family — we do not believe that cutting teens off from these relationships makes them safer, happier, or otherwise better off. We fundamentally disagree that Snapchat is an in-scope age-restricted social media platform.”
Despite disagreements with parts of the policy, Snapchat states it remains engaged with regulators and supports changes aimed at improving implementation while minimizing negative effects: “If Australia is going to pursue this approach, it should be done in a way that offers greater protection to young people with fewer downsides. Creating a centralized verification system at the app-store level would allow for more consistent protection and higher barriers to circumventing the law.”
Snapchat also highlighted its ongoing commitment to safety features for younger users globally and in Australia specifically. The platform requires bi-directional connections before users can communicate privately and maintains dedicated Trust & Safety teams operating around the clock, including staff based in Sydney.
Parental controls have also been expanded through the Family Center tools. Parents can now monitor how much time their teen spends on different features within Snapchat and see how new friend connections are established, information designed to help families guide safe online interactions.