Australia’s New Under-16 Social Media Restrictions: Legal Framework, Practical Impact, and Enforceability Challenges
- Katie O'Brien
Australia is preparing to roll out one of the world’s strictest social media age-restriction regimes. From 10 December 2025, major platforms such as Facebook, Instagram, X (formerly Twitter), YouTube, and Snapchat will be legally required to prevent users under 16 from maintaining accounts. Meta, which owns Facebook and Instagram, has already begun a nationwide notification campaign to comply with the incoming laws, prompting widespread discussion about the scope, legality and practicality of the reforms.
Below, we break down the background to these reforms, the legal framework behind them, how platforms are responding, and the enforceability challenges that lie ahead.
Background to the Reforms
These new laws stem from long-running public and political concern about the harms associated with children’s social media use. The debate centres on exposure to inappropriate content, online grooming, bullying, algorithmic risks, and the psychological effects of social comparison. In late 2024, Parliament passed the Online Safety Amendment (Social Media Minimum Age) Act 2024, a bold move to limit underage social media access, with the minimum-age obligation taking effect from 10 December 2025.
Legal Framework: What the Law Requires
Under the Online Safety Amendment (Social Media Minimum Age) Act, “age-restricted social media platforms” must take “reasonable steps” to ensure that users under 16 cannot hold accounts. The legislation adopts a flexible, technology-neutral compliance standard: it does not prescribe specific technologies, such as ID verification or biometrics, but instead places the onus squarely on platforms to justify how their chosen methods constitute “reasonable steps” for age assurance. This leaves major services including Facebook, Instagram, TikTok, Snapchat, Reddit, and X exposed to significant interpretive and regulatory risk, with civil penalties for systemic non-compliance of up to AUD $49.5 million.
Practical Changes Being Rolled Out by Platforms
To comply with the new requirements, Meta has begun large-scale operational changes in Australia. From 19 November 2025, it began notifying users believed to be under 16 that their accounts would be deactivated, with full deactivation scheduled for 10 December 2025, the day the law takes effect. Affected users can download their photos, messages, and posts, and regain access once they turn 16. Meta’s approach layers several technologies: AI-powered facial age estimation via third-party provider Yoti, AI-driven age inference based on users’ online activity, and a government ID upload process that allows users to appeal suspected misclassifications.
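For illustration only, the sketch below shows how a layered age-assurance pipeline of this general kind might combine signals in principle: a verified ID (supplied on appeal) outranks probabilistic estimates, and borderline estimates are routed to review rather than decided outright. All names, thresholds and signal types here are assumptions made for explanation; this does not reflect Meta’s or Yoti’s actual systems.

```python
# Hypothetical sketch of a layered age-assurance decision flow.
# Names, thresholds and signals are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Decision(Enum):
    ALLOW = auto()        # treated as 16 or over
    DEACTIVATE = auto()   # treated as under 16; account scheduled for removal
    REVIEW = auto()       # inconclusive; offer an ID-based appeal path


@dataclass
class AgeSignals:
    declared_age: int                      # self-reported, derived from birthdate
    facial_estimate: Optional[float]       # e.g. a third-party facial age estimate
    behavioural_estimate: Optional[float]  # e.g. inference from on-platform activity
    verified_id_age: Optional[int] = None  # set only if the user appeals with ID


def assess(signals: AgeSignals, threshold: int = 16) -> Decision:
    """Combine layered signals, preferring the strongest available evidence."""
    # A verified government ID (supplied on appeal) overrides all estimates.
    if signals.verified_id_age is not None:
        return (Decision.ALLOW if signals.verified_id_age >= threshold
                else Decision.DEACTIVATE)

    # Collect whichever probabilistic estimates are actually available.
    estimates = [e for e in (signals.facial_estimate, signals.behavioural_estimate)
                 if e is not None]
    if not estimates:
        # Fall back to the self-declared age: the weakest, most easily
        # falsified signal, used only when nothing better exists.
        return (Decision.ALLOW if signals.declared_age >= threshold
                else Decision.DEACTIVATE)

    avg = sum(estimates) / len(estimates)
    margin = 2.0  # assumed uncertainty band around the threshold
    if avg >= threshold + margin:
        return Decision.ALLOW
    if avg <= threshold - margin:
        return Decision.DEACTIVATE
    return Decision.REVIEW  # near the boundary: route to the ID appeal process


if __name__ == "__main__":
    borderline = AgeSignals(declared_age=18, facial_estimate=15.5,
                            behavioural_estimate=16.8)
    print(assess(borderline))  # Decision.REVIEW -> user may appeal with ID
```

The key design point this sketch illustrates is that no single signal is decisive: estimates near the cut-off trigger human or document-based review, which mirrors the appeal path described above.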
X (formerly Twitter), by contrast, has taken a markedly different stance. The company has formally requested an exemption from the policy and has not publicly committed to any specific measures for verifying users’ ages or removing under-16 accounts. As of early December 2025, X remains the only major platform without a transparent compliance strategy, creating uncertainty about how it will meet its obligations under the new regime.
Enforceability Issues and Legal Challenges Ahead
While the legislative aim is clear, the practical enforceability of an age-16 minimum is legally fraught. Several challenges are emerging, and platforms face real difficulty balancing user experience against legal compliance.
Accuracy and Reliability of Age Verification:
Age-verification technology is far from perfect. Facial age-estimation tools can be biased or inaccurate, particularly for users close to the 16-year threshold, and ID-based checks raise privacy concerns that may discourage families from using them. In addition, tech-savvy teenagers can often bypass these measures by using false birthdates or VPNs.
Defining “Reasonable Steps”:
Because the legislation is technology-neutral, it is ultimately for the eSafety Commissioner to decide whether a platform’s measures meet the required standard of “reasonable steps.” This creates real uncertainty: measures that are too strict may be seen as disproportionate or intrusive, while measures that are too lenient may expose platforms to regulatory scrutiny or penalties.
Privacy and Data Protection Risks:
The mechanisms being deployed by Meta and other platforms involve collecting biometric data, identity documents, and other sensitive information about minors. If this information is mishandled, platforms risk privacy-law breaches, other legal consequences, and significant reputational damage.
Avoidance and Workarounds:
Some teenagers will inevitably seek to circumvent age-verification measures by creating accounts with false birthdates, accessing services through parental accounts, or migrating to unregulated or fringe platforms. Such evasion raises important questions for regulators, as it may materially undermine the purpose and effectiveness of the reforms.
Conclusion
Australia’s under-16 social media reforms represent one of the most ambitious attempts globally to regulate children’s access to digital platforms. Meta’s early deactivations show that the practical impacts are already real, while X’s reluctance to commit to compliance highlights the enforcement challenges ahead and the likelihood of ongoing disputes between regulators and platforms. Businesses, policymakers and families will all be watching closely to see whether the reforms deliver meaningful protection, or whether they simply push young users into less regulated digital spaces.
At BlackBay Lawyers, we are experts in helping clients navigate complex compliance frameworks in the social media space, including developing and reviewing social media policies, ensuring regulatory obligations are met, and providing strategic guidance on emerging digital risks.
If you need assistance navigating these requirements, please contact our team for expert guidance and support.
The content in this Article is intended only to provide a summary and general overview on matters of interest. It is not intended to be comprehensive nor does it constitute legal advice. It should not be relied upon as such. You should seek legal or other professional advice before acting or relying on any of the content.
ABOUT THE AUTHOR
Katie O'Brien is an enthusiastic and driven member of our team. With a genuine focus on understanding each client’s unique needs, she uses her legal expertise to craft tailored strategies that deliver exceptional results.
Katie focuses on assisting clients in areas of defamation, social media, creative industries, employment law, and commercial litigation. Holding a Bachelor of Laws and a Bachelor of Media (Public Relations and Social Media) from Macquarie University, Katie applies her legal knowledge and strong understanding of the digital world to expertly navigate the unique challenges arising in these sectors.