Australia’s Under-16s Social Media Ban at the Senate Inquiry
Australia’s cross-party Senate inquiry into the Online Safety Amendment (Social Media Minimum Age) Act—informally known as the under-16s social media ban—produced a tense hearing as executives from Meta, TikTok and Snapchat faced questions about how young Australians will be affected when the ban takes effect. The hearing, led by the Environment and Communications References Committee, sought clarity on parental controls, data protection, and the practical steps the platforms will take.
Key moments and tensions
The hearing featured sharp exchanges, including a moment when Nationals senator Ross Cadell pressed executives on whether the platforms had developed a “dob your kid in” tool for parents who cannot shut down their children’s accounts. Greens senator David Shoebridge publicly questioned the seriousness of the proposition, prompting the committee chair, Senator Sarah Hanson-Young, to intervene and refocus the discussion on the witnesses. The back-and-forth underscored the political sensitivity surrounding online safety, age verification, and platform responsibilities.
TikTok’s conduct under scrutiny
Senator Cadell accused TikTok of bullying and intimidation after his office was allegedly contacted by the company following an earlier hearing. TikTok public policy lead Ella Woods-Joyce said she was unaware of the specifics but stressed the professionalism of her team and her confidence in its conduct. Cadell also pressed the platform on its lobbying to widen the under-16 restrictions to cover other platforms such as YouTube, describing TikTok as a “bullying behemoth” in the lobbying arena. The exchange highlighted ongoing concerns about how platforms seek to influence policy while carrying their own user-safety obligations.
How the platforms will implement the ban
During the proceedings, details emerged about how teenage accounts will be managed as the ban rolls out. Meta, TikTok and Snapchat indicated they would begin freezing accounts of users aged 15 and under from December 10, in line with Australia’s social media minimum age framework. The aim is to prevent underage access while preserving users’ memories and data where possible, with several planned options ranging from temporary suspension to deactivation.
TikTok and Snapchat: preserving memories
TikTok outlined options such as deactivating or suspending accounts, or deleting them while offering users an archive of their content, and emphasized clear communication about what to expect and when. Snapchat described a proactive data preservation approach via a tool called Download My Data, which would secure photos and messages before accounts are disabled, alongside a verification process for users who are 16 or older. The company said this verification would be privacy-protective but would still require some form of age assurance.
Meta’s approach: pause, deactivate, and delete
Meta representatives explained that the company would offer young users a choice between deactivating their accounts and temporarily pausing them. Mia Garlick, Meta’s ANZ director of policy, noted ongoing work to finalize the wording and user flows, while ensuring that users and parents receive clear notice ahead of any action. The plan also includes third-party age verification through the provider Yoti, which may involve video selfie checks or government ID, depending on what best balances privacy with safety.
Implications for families and the online safety framework
With the government adjusting the policy in response to industry lobbying, the inquiry is concerned not just with enforcement but with practical implementation. The committee’s goal is to clarify how teens can safeguard their digital memories, how and when accounts will be paused or deleted, and what protections exist to ensure transparency and user consent. As the December rollout approaches, parents, teens, and educators will be watching closely to see whether the tools offered by Meta, TikTok and Snapchat can deliver the safety promised by the legislation while maintaining reasonable privacy protections.
What’s next
Ministerial and parliamentary channels are expected to continue shaping the details of the online safety code, including how age-verification technologies will operate across platforms and what constitutes compliant behavior under the new rules. The Senate inquiry will likely hold further sessions to probe any gaps in enforcement or user experience, balancing the imperative of child safety with the realities of how teens use social media today.
