Keeping Young People Safe Online: For Teachers
As with all aspects of our work, we approach this new legislation with a strengths-based lens—focusing on empowerment, education, and positive engagement. While this topic may be front of mind for many of the young people we meet in our day-to-day sessions, it’s important to remember that under current legislation, none of them are legally old enough to access social media platforms. As such, the immediate impact on them will be minimal.
Our primary role is to guide children through the online world by explicitly teaching them strategies to keep themselves safe in digital environments. This includes helping them understand online boundaries, recognize unsafe situations, and make informed choices. Equally important is building their help-seeking skills in a supportive, non-judgemental space—encouraging them to identify trusted adults, ask for help when needed, and know where to turn if something doesn’t feel right.
To empower you to support young people, we encourage the use of the tips provided in the FAQ information sheet to help navigate any questions that may arise during lessons.
Please avoid using the word ‘ban’ when discussing the new legislation. Instead, use the term ‘delay’, which is less confrontational and promotes more constructive, open conversations with students.
The eSafety Commissioner will:
• Monitor platform compliance
• Enforce penalties
• Ensure platforms are accountable for child safety
Major penalties will apply for non-compliance:
– Platforms that fail to take reasonable steps to prevent underage users from creating accounts can face civil penalties.
– Corporations may be fined up to 150,000 penalty units, which currently equates to $49.5 million AUD.
The term ‘reasonable steps’ refers to actions that are fair, proportionate, and appropriate to the context. To support this, eSafety has developed regulatory guidelines outlining safe and supportive approaches for account deactivation. These guidelines are grounded in a robust evidence base, including insights from the Australian Government’s Age Assurance Technology Trial and feedback gathered through stakeholder consultations.
Privacy considerations will be guided by the Office of the Australian Information Commissioner, ensuring that enforcement measures align with established privacy standards.
• Identify and deactivate underage accounts: Platforms must proactively search for accounts held by users under 16 and shut them down.
• Block new underage sign-ups: They must implement robust age verification systems to prevent under-16s from creating accounts.
• Prevent circumvention tactics: This includes stopping users from using fake birthdates, VPNs, or other workarounds to bypass age checks.
• Ensure fairness and error correction: If someone 16 or older is wrongly flagged and restricted, platforms must have a clear process to restore access.
• Provide reporting and review mechanisms: Users should be able to report suspected underage accounts or appeal if they believe they’ve been restricted unfairly.
The government intends to ensure young people can still reach online services that offer mental health support or crisis information, even without logging in.
The eSafety Commissioner has published regulatory guidance to help platforms select effective, compliant age assurance methods. This guidance was informed by:
• The Australian Government’s Age Assurance Technology Trial
• Ongoing consultations with stakeholders and platforms
• International best practices
• Privacy advice from the Office of the Australian Information Commissioner
Australians won’t be forced to use government-issued ID or Digital ID to prove their age online. This means platforms must offer reasonable alternatives for age verification.