Snapchat disables 415,000 teen accounts in Australia

Snapchat says it has locked or disabled more than 415,000 accounts in Australia belonging to users under the age of 16, two months after the country began enforcing its Social Media Minimum Age (SMMA) law. The figures place Snapchat alongside Meta, which previously reported blocking more than 540,000 accounts under the same rules, signaling large-scale compliance activity by major platforms.

The Australian legislation, which restricts access to certain social media services for users under 16, has drawn global attention as governments watch how effectively it can be enforced in practice. Platform data suggests significant numbers of teen accounts have been removed, but questions remain about whether the law is meaningfully reducing young people’s social media use.

According to Snapchat, the affected accounts either self-declared an age under 16 or were flagged by the company's age-detection systems. The company says it continues to restrict additional accounts daily.

However, Snapchat and industry observers point to ongoing circumvention. Teens can switch to alternate accounts, access services without logging in, or migrate to other apps that fall outside the scope of the law. This disconnect between reported enforcement and real-world usage has raised doubts about the law’s effectiveness.

Snapchat also highlighted technical limitations in age verification. Australia’s own 2025 government trial found that age estimation technologies are typically accurate only within a two- to three-year margin. In practice, this means some under-16 users may still gain access, while some users over 16 may be incorrectly locked out.

One of the central challenges flagged by Snapchat is the absence of a standardized, ecosystem-wide approach to age assurance. The SMMA places obligations on a specific set of largely high-profile social platforms, while leaving hundreds of other apps either unregulated or ambiguously covered.

Snapchat warned that cutting off access to regulated services does not eliminate teen demand for digital communication. More than 75% of time spent on Snapchat in Australia, the company says, is dedicated to messaging with close friends and family. Restricting access to these platforms may push young users toward alternative messaging services that offer fewer safety controls and less oversight.

This uneven regulatory landscape complicates enforcement and makes it difficult for authorities to assess whether the law is achieving its stated safety goals.

In response, Snapchat is advocating for age verification at the app store level as a complementary safeguard. Under this model, age signals would be applied consistently across devices and apps, rather than relying on each platform to implement its own detection systems.

The company argues that app store–level verification could reduce false positives, make circumvention harder, and extend protections beyond a narrow group of regulated social platforms. It also positions the approach as a potential global standard, contrasting it with country-specific age bans.

Snapchat reiterated that it does not support an outright under-16 ban and disputes its classification as an age-restricted social media platform, emphasizing its core use as a private messaging service. Still, the company says it will continue to comply with Australian law while engaging with policymakers on implementation changes.

Australia’s approach is being closely watched by regulators in other markets considering age-based restrictions. Early results suggest that while large platforms can remove hundreds of thousands of accounts, enforcement gaps, inconsistent standards, and user workarounds may limit the law’s impact.

Written by Sophie Blake
