Keeping Young People Safe Online: For Teachers

Tuesday, 30 Sep 2025

At Life Ed, we support the social media delay introduced by the Australian Government, which comes into effect on 10 December 2025.

Download Info Sheet (PDF)

As with all aspects of our work, we approach this new legislation with a strengths-based lens—focusing on empowerment, education, and positive engagement. While this topic may be front of mind for many of the young people we meet in our day-to-day sessions, it’s important to remember that under current legislation, none of them are legally old enough to access social media platforms. As such, the immediate impact on them will be minimal.

Our primary role is to guide children through the online world by explicitly teaching them strategies to keep themselves safe in digital environments. This includes helping them understand online boundaries, recognize unsafe situations, and make informed choices. Equally important is building their help-seeking skills in a supportive, non-judgemental space—encouraging them to identify trusted adults, ask for help when needed, and know where to turn if something doesn’t feel right.

To help you support young people, we encourage you to use the tips in the FAQ information sheet to navigate any questions that arise during lessons.

Please avoid using the word ‘ban’ when discussing the new legislation. Instead, use the term ‘delay’, which is less confrontational and promotes more constructive, open conversations with students.

We also encourage teachers and educators to sign up for eSafety's 30-minute webinar, which provides an overview of the upcoming social media age restrictions in Australia.
FAQs
The eSafety Commissioner is collaborating with platforms that have large numbers of Australian child users and features that pose risks. This early engagement aims to help platforms prepare for enforcement of the upcoming age restrictions.
eSafety will use its regulatory powers under the Online Safety Act to:

• Monitor platform compliance
• Enforce penalties
• Ensure platforms are accountable for child safety

Major penalties will be enforced for non-compliance:

– Platforms that fail to take reasonable steps to prevent underage users from creating accounts can face civil penalties.
– Corporations may be fined up to 150,000 penalty units, which currently equates to $49.5 million AUD.
No. Social media platforms subject to age restrictions will be required to take reasonable steps to identify and deactivate accounts held by users under the age of 16.

The term reasonable steps refers to actions that are fair, proportionate, and appropriate to the context. To support this, eSafety has developed regulatory guidelines outlining safe and supportive approaches for account deactivation. These guidelines are grounded in a robust evidence base, including insights from the Australian Government’s Age Assurance Technology Trial and feedback gathered through stakeholder consultations.

Privacy considerations will be guided by the Office of the Australian Information Commissioner, ensuring that enforcement measures align with established privacy standards.
Social media platforms that restrict access based on age (typically 16+) must now:

• Identify and deactivate underage accounts – platforms must proactively search for accounts held by users under 16 and shut them down.
• Block new underage sign-ups – they must implement robust age verification systems to prevent under-16s from creating accounts.
• Prevent circumvention tactics – this includes stopping users from using fake birthdates, VPNs, or other workarounds to bypass age checks.
• Ensure fairness and error correction – if someone 16 or older is wrongly flagged and restricted, platforms must have a clear process to restore access.
• Provide reporting and review mechanisms – users should be able to report suspected underage accounts or appeal if they believe they've been restricted unfairly.
They’ll still be able to view publicly available material, like YouTube videos or Facebook business pages, without needing an account. Because they can’t log into social media accounts, this will reduce their exposure to addictive design features like algorithmic feeds, notifications, and targeted content.

The government intends to ensure young people can still reach online services that offer mental health support or crisis information, even without logging in.
Social media platforms must verify a user’s age at sign-up and beyond. They can choose from a range of technologies to do this—there’s no one-size-fits-all solution.

The eSafety Commissioner has published regulatory guidance to help platforms select effective, compliant age assurance methods. This guidance was informed by:

• The Australian Government’s Age Assurance Technology Trial
• Ongoing consultations with stakeholders and platforms
• International best practices
• Privacy advice from the Office of the Australian Information Commissioner

Australians won’t be forced to use government-issued ID or Digital ID to prove their age online. This means platforms must offer reasonable alternatives for age verification.