Overview of the Senate Inquiry
The Australian Senate's Environment and Communications References Committee continued its examination of the Internet Search Engine Services Online Safety Code, commonly referred to as the social media ban. The hearing focused on how Meta, TikTok, and Snapchat plan to implement protections for users under 16, and how the platforms will manage existing accounts once the ban takes full effect.
Senators pressed executives about the tools available to parents to flag or control their children’s online presence, and whether any “dob in” style options exist when parental oversight is lacking. The exchange underscored the tension between protecting young users and preserving family privacy and user agency.
Key Moments in the Debate
Nationals senator Ross Cadell led questions about whether platforms had developed mechanisms for parental reporting, only to be met with a sharp interjection from Greens senator David Shoebridge. The back-and-forth highlighted frictions within the chamber over how aggressively social media firms should police under-16 use while balancing civil liberties and practical enforcement.
The committee’s chair, Senator Sarah Hanson-Young, interrupted the clash to remind members that questions were directed at witnesses, not fellow senators, emphasizing procedural norms in the inquiry.
TikTok’s Conduct and Lobbying Concerns
Senator Cadell accused TikTok of bullying and intimidation, recounting an alleged incident in which a staff member reportedly suggested favorable alignment with the leader’s office and the opposition. Ella Woods-Joyce, TikTok’s content and safety policy lead, said she was not aware of the specifics but stressed the company’s commitment to professional conduct and adherence to the law.
The inquiry also revisited TikTok’s lobbying over the decision to bring YouTube within the blanket ban. Minister Anika Wells reversed a pre-election pledge to exclude YouTube after discussions with senior TikTok executives, underscoring the complex interplay between regulation and corporate strategy.
How the Ban Will Affect Teen Accounts
Witnesses from Meta, TikTok, and Snapchat outlined the technical steps they anticipate once the ban takes effect, with a phased rollout set to begin in early December. The platforms explained that users aged 15 and under would see their accounts frozen, with options to preserve memories and protect data while compliance actions proceed.
Specific measures include:
– TikTok allowing deactivation, suspension, or deletion, with an archive option for those choosing deletion.
– Snapchat offering a Download My Data tool to secure photos and messages before accounts are locked, followed by age-verification processes.
– Meta providing the option of deactivation or temporary pauses, with clear notices to users and guardians about available choices.
Age Verification and Privacy Considerations
Across the testimony, age verification emerged as a critical element. Meta described using the third-party provider Yoti for age checks, offering methods such as video selfies, age estimation, or government ID submission, and emphasized the accompanying privacy protections. The discussion acknowledged the delicate balance between enabling legitimate parental oversight and safeguarding users’ privacy and data security.
What This Means for Families and Teen Users
For families, the inquiry highlighted the need for transparent communication about when, how, and by whom actions can be taken on younger users’ accounts. Platforms stressed that notices will guide users through the appropriate steps, whether pausing, archiving, or permanently deleting data, so families can respond to the new requirements while preserving important memories.
Looking Ahead
The committee’s questions set the tone for ongoing scrutiny of how online platforms handle under-16 users in Australia. As the December changes approach, policy attention is likely to tighten around age verification, data portability, and user controls. Stakeholders, from lawmakers to platform operators, will continue to debate how best to protect young people without stifling legitimate expression online.
