Platforms on notice to comply with Social Media Minimum Age
eSafety
Media Release
4 November 2025
eSafety has informed Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Kick and Reddit of its view that they are age-restricted platforms required to comply with Social Media Minimum Age restrictions from 10 December.
With just over a month to go, eSafety has also informed the broader online industry that all platforms are obliged to continually assess whether they meet the definition of an ‘age-restricted social media platform’, in particular when they introduce new features or their primary usage changes.
If they do meet the definition, they must take reasonable steps to ensure users under 16 do not hold an account.
On available information, eSafety considers that all nine services named currently meet the criteria for an ‘age-restricted social media platform’, in particular the key requirement that their “sole or a significant purpose is to enable online social interaction”.
eSafety has informed the following platforms that it considers they are not subject to age restrictions, on the basis that they do not currently meet the criteria for an ‘age-restricted social media platform’, including where they fall within an exclusion in the legislative rules: Discord, GitHub, LEGO Play, Roblox, Steam and Steam Chat, Google Classroom, Messenger, WhatsApp and YouTube Kids.
From 10 December, eSafety expects all nine services currently assessed as meeting the criteria for age restriction to take reasonable steps to prevent Australian children under the age of 16 from having accounts. eSafety remains in ongoing discussions with these services around their compliance obligations and our planned approach towards enforcement.
Any age-restricted platform that fails to do so may face enforcement action, including civil penalties of up to $49.5 million.
Due to the fast-changing nature of technology, eSafety has been clear there will not be a static list of companies that are age-restricted.
Instead, eSafety will provide updated advice to the public on current assessments and its approach to compliance and enforcement on its website. When new platforms emerge, or existing ones evolve to the extent that their purposes change, eSafety may reassess those services, and will assess (or reassess) a service when considering whether to exercise its powers.
“Delaying children’s access to social media accounts gives them valuable time to learn and grow, free of the powerful, unseen forces of harmful and deceptive design features such as opaque algorithms and endless scroll. This important normative change will be invaluable to parents and young people alike – creating friction or a check in the online ecosystem that previously did not exist,” eSafety Commissioner Julie Inman Grant said.
“But I’ve also said consistently that age restricting social media is one important tool in our holistic approach to online safety. Ultimately, all online platforms should be building less harmful, age-appropriate experiences through safety by design. Where they are not, we will apply the Online Safety Act’s mandatory codes and standards and supplement with robust prevention and education resources for Australians.
“Of course, our complaints schemes, which provide rapid remediation of harms such as cyberbullying and image-based abuse, including deepfake abuse, will continue to provide timely support for children in the grip of online crises,” Ms Inman Grant said.
“We will continue to take a whole of ecosystem approach, but we want to reinforce that just because a service is excluded, it does not mean it is absolutely safe. As parents, we will need to continue being engaged in our children’s online lives.”
To help Australians better understand and prepare for the coming changes, eSafety recently released a comprehensive package of resources and webinars to answer questions from the public and provide additional detail.
“We recognise this transition will be significant, especially for some young people,” Ms Inman Grant said.
“We now have a range of tailored resources for young people to help them adjust to the changes and find alternative ways to connect with their communities.”
Informed by extensive consultation and feedback from key partners including mental health and support organisations, such as headspace, Kids Helpline, Beyond Blue, Raising Children and ReachOut Australia, all resources are available for free at eSafety.gov.au.
The package includes:
- A dedicated online hub with tailored FAQs explaining what is happening, and how to prepare.
- Practical guidance for parents and carers, including conversation starters and get-ready guides.
- Information for educators, explaining what the new restrictions mean for schools, and how to prepare students.
- Youth-friendly content outlining what the new restrictions mean for young people, downloadable action plans and where to go for help and support.
“I strongly encourage parents, educators and young people to visit eSafety.gov.au, download our resources and register for a live webinar where we will explain the social media age restrictions and answer questions in sessions tailored for parents, carers, educators and youth serving professionals,” Ms Inman Grant said.
“Our conversation starters, classroom resources and step-by-step guides are all designed to support parents, carers and educators to reset family digital rules and ensure this delay can be used to develop critical thinking and emotional resilience before having a social media account. More resources targeting at-risk young people and communities are currently in development and will follow soon.”