Australia prohibits social media for children

As of Wednesday, users under 16 will be unable to create or maintain social media accounts

Australia is about to become the world’s first country to prohibit social media for children under 16, barring them from platforms like TikTok, YouTube, Instagram, and Facebook.

Approved by Parliament last year, the ban is set to come into force on Wednesday. Companies that fail to comply may face penalties of up to $33 million.

“From 10 December 2025, age-restricted social media platforms will need to take reasonable measures to prevent Australians under the age of 16 from creating or retaining an account,” the government stated, referring to the measure as a means to safeguard children “at a crucial stage of their development.”

Platforms will be obliged to use a combination of signals, such as account activity, viewing habits, and user photos, to identify underage users. They must also prevent minors from evading age limits by using fake IDs, AI-generated images, deepfakes, or VPNs.

Tech companies have criticized the ban, characterizing it as “vague,” “problematic,” and “rushed.” TikTok and Meta said the law would be hard to enforce but promised to comply. Meta has already started removing under-16 accounts ahead of the December 10 deadline. Snapchat and other platforms warned that the measure could drive young people towards “darker parts of the internet.” Reddit has also strongly criticized the law, calling it “legally incorrect” and “arbitrary.”

Other countries are also looking into similar legislation with the stated aim of protecting children.

The European Parliament adopted a non-binding resolution in November calling for a minimum age of 16 on social media to ensure “age-appropriate online interaction.” Denmark has proposed banning users under 15 and, together with France, Spain, Italy, and Greece, is jointly testing an age-verification app. Malaysia has announced plans to ban social-media use for under-16s starting in 2026.

Last week, Russia banned Roblox, an online gaming platform mainly targeted at children, due to what it described as the distribution of extremist content and LGBTQ propaganda.

Concerns about child safety online have led to increasing legal pressure. Meta is facing lawsuits in the US alleging that it allowed harmful content and conduct to persist on its platforms despite repeated warnings, including adult strangers contacting minors and material promoting suicide, eating disorders, and child sexual abuse.