
Meta, TikTok and X have a one-year deadline to prepare for Australia’s social media ban on under-16s

In a bold move to protect children online, Australia is poised to introduce unprecedented legislation that will ban children under 16 from using social media. Platforms such as Meta (which owns Facebook and Instagram), TikTok and X will be given a year to implement these restrictions, a measure aimed at reducing exposure to online harm for minors.

Australia’s flagship legislation for online safety

From November 18, the Australian government plans to introduce this landmark legislation, banning access to social media for users under the age of 16, even with parental consent. According to The Daily Mail, Prime Minister Anthony Albanese has strongly backed the policy, saying it empowers parents to protect their children from the potentially harmful effects of social media.

“We prohibit under-18s from purchasing alcohol,” explained Albanese. “This weekend, there will be instances of someone under 18 getting access to alcohol. It doesn’t mean we say, ‘Well, it’s too hard, let it break.’” Albanese pointed out that although the law cannot prevent every child from using social media, it aims to set a precedent for responsible online engagement.

Holding the tech giants accountable

Under this legislation, the responsibility of enforcing the age restriction will fall squarely on the tech giants, rather than parents or children. Meta, TikTok and X will have one year from the law’s entry into force to develop mechanisms to verify the age of users. Currently, the minimum age required for these platforms is 13, but the Australian government is pushing to raise this threshold.

Meta expressed concern about the technical challenges of enforcing such a ban. Antigone Davis, Meta’s global head of safety, highlighted the limitations of current age verification technology, noting that many age verification tools rely on collecting personally identifiable information, potentially through facial recognition or ID verification, which raises privacy concerns. According to Fortune, Davis said, “The idea that industry can simply implement these requirements is probably a misunderstanding of our current technological capabilities.”

The Dark Side of Social Media: A Growing Crisis

The call for stricter regulation follows a series of tragic incidents linked to social media. According to Kidspot, Ella Catley-Crawford, 12, from Brisbane, recently took her own life after enduring relentless cyberbullying. Ella was catfished and harassed by classmates who spread her private photos online; her experience is just one of many stories highlighting the dark side of social media.

In May, another incident involving a social media challenge claimed the life of 13-year-old Esra Haynes, who died after taking part in the “chroming” trend, which involves inhaling chemicals. Esra’s parents have since campaigned for stricter regulation of social media, arguing that children often lack the maturity to manage online content responsibly. Her father stated: “Kids at 13 don’t fully understand the consequences. Social media exposes them to risks they are not equipped to navigate.”

Australia Joins Global Push for Youth Online Safety

Australia’s legislation is part of a wider global movement to regulate children’s use of social media. In China, the Regulations on the Protection of Minors in Cyberspace impose strict controls on harmful content and cyberbullying, while requiring technology companies to rigorously verify the age of users. This legislation includes provisions to limit minors’ online activity and prevent exposure to addictive content, similar in intent to the planned restrictions in Australia.

Countries such as Japan have already introduced regulations designed to protect young users. In September, Instagram introduced restrictions in Japan that limit messaging capabilities and app usage time for users between the ages of 13 and 17. According to Fortune, these users receive reminders to log out after 60 minutes of use, and parental consent is required for account changes, with parents receiving information about their children’s online activities.

In the UK, Labour MP Josh MacAlister recently proposed legislation to raise the age of “internet adulthood” from 13 to 16, requiring parental consent for under-16s to access social media. MacAlister also advocated for similar restrictions in schools to limit smartphone use, arguing that reducing screen time can mitigate negative effects on mental health.

A challenging path forward for technology firms

Despite concerns about the enforcement of these measures, Meta has committed to complying with Australia’s age restrictions, although it questions whether current technology can meet the requirements effectively. The company pointed out that facial recognition or identity-based age verification could be invasive and difficult to implement. As Davis noted, “Age verification technology … often requires personally identifiable information, which raises questions about user privacy and data manipulation.”

The legislation is expected to bring social media companies into line with Australia’s child-safety goals. However, as seen in other parts of the world, companies may face substantial logistical and ethical hurdles in enforcing these rules.