Meta Nukes 543,000 Australian Accounts to Avoid Systemic Non-Compliance Fines
Meta has purged approximately 543,000 Australian accounts to dodge potential fines of up to $49.5 million AUD, even as the tech giant prepares for a protracted legal battle over the nation’s under-16 social media ban. The mass deactivations, executed between December 4 and December 11, 2025, represent the first aggressive enforcement phase of the Online Safety Amendment (Social Media Minimum Age) Act 2025.
The data covers three major platforms: 330,000 Instagram accounts, 173,000 Facebook accounts, and 40,000 profiles on Threads. These accounts were flagged and closed because internal systems identified the users as likely being under the age of 16. Australia’s mandate requires platforms to implement rigid age-gating or face penalties for systemic non-compliance, a move that has forced an immediate overhaul of Meta’s verification infrastructure.
Algorithmic Policing and Facial Analysis
To satisfy the "reasonable steps" requirement of the Act, Meta is moving beyond simple date-of-birth self-reporting. The company’s compliance strategy now relies on a high-friction verification stack:
- Behavioral Age Inference: Meta’s AI scans for specific keywords in comments (such as "Happy 13th Birthday"), monitors friend networks to identify clusters of underage users, and analyzes behavioral signals including typing rhythm, scrolling velocity, and the use of "brain-rot" slang (a simplified sketch of this signal follows the list).
- Biometric Verification: Users suspected of being underage are forced to provide video selfies, which are processed via facial-analysis software to estimate age.
- Proactive Detection Spikes: Meta reported that these automated identification protocols have increased the rate of account closures by 300% since the legislation was finalized.
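Meta has not published the internals of its age-inference pipeline, but the keyword-based signal described above can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the regular expression, the slang list, the signal weights, and the flagging threshold are invented, not Meta's implementation.

```python
import re
from dataclasses import dataclass

# Illustrative pattern for age-revealing comment text; the real system's
# signals, languages, and weights are not public.
BIRTHDAY_PATTERN = re.compile(r"happy\s+(\d{1,2})(st|nd|rd|th)\s+b(irth)?day", re.IGNORECASE)
# Assumed examples of "brain-rot" slang used as a weak age signal.
UNDERAGE_SLANG = {"skibidi", "rizz", "gyatt"}

@dataclass
class Signal:
    name: str
    weight: float

def age_signals(comments: list[str]) -> list[Signal]:
    """Extract simple underage-likelihood signals from comments directed at a user."""
    signals = []
    for text in comments:
        match = BIRTHDAY_PATTERN.search(text)
        if match and int(match.group(1)) < 16:
            # An explicit "Happy 13th Birthday" style comment is a strong signal.
            signals.append(Signal("birthday_mention_under_16", 0.9))
        if any(word in text.lower().split() for word in UNDERAGE_SLANG):
            # Slang usage alone is a weak signal.
            signals.append(Signal("underage_slang", 0.1))
    return signals

def likely_under_16(comments: list[str], threshold: float = 0.8) -> bool:
    """Flag an account for verification if combined signal weight crosses the (assumed) threshold."""
    score = sum(s.weight for s in age_signals(comments))
    return score >= threshold

if __name__ == "__main__":
    sample = ["Happy 13th birthday!!", "that edit was so skibidi"]
    print(likely_under_16(sample))  # True -> account would be queued for age verification
```

A production system would presumably combine many such weak signals (friend-graph clustering, typing rhythm, scrolling velocity) in a trained probabilistic model rather than a hand-tuned threshold, and a flag would trigger the biometric verification step rather than immediate closure.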
While Meta has reached this initial compliance threshold, the company maintains that these methods create a significant privacy trade-off, as they require deeper data harvesting of both adults and minors.
Legal Pushback and Market Fallout
The mass closures haven't silenced Meta’s opposition. The company claims the ban risks driving younger users toward "unregulated digital silos" where safety tools and moderation are non-existent. Other major players are following a similar trajectory of defensive compliance paired with legal resistance.
Reddit has initiated a lawsuit against the Australian government, arguing its platform should be exempt from the social media classification. Meanwhile, other platforms are conducting their own purges to avoid the $49.5 million AUD ($33 million USD) per-violation fines; Snapchat has deactivated 150,000 Australian accounts, and TikTok has closed over 200,000.
The Compliance Audit
The Australian eSafety Commissioner began the first formal compliance audits last week. The regulator is currently investigating whether Meta’s "reasonable steps" are consistent across its different apps or if the December purge was a temporary measure to appease officials.
The transition has triggered immediate friction for the user base. Meta is currently reviewing over 10,000 reactivation appeals from users who claim they were incorrectly flagged. Only 20% of these appeals have resulted in account restoration, as most users fail to provide the government-issued ID or biometric proof required to override the AI’s initial assessment.
As the Australian experiment continues, the UK and EU are monitoring the effectiveness of these biometric and behavioral hurdles. For now, the Australian market serves as the global test bed for whether a government can successfully decouple an entire generation from the mainstream social internet.
