Australia bans children under 16 from social media; Meta and TikTok are not satisfied
Australia has introduced groundbreaking laws banning children under 16 from accessing social media platforms. The law, recently passed by the country’s Parliament, aims to address growing concerns about the impact of social media on young people’s mental health, including issues such as cyberbullying, addiction and exposure to harmful content.
What does the new law require from social media platforms?
The bill, which passed the Senate on Thursday, requires social media companies to strengthen age verification processes to prevent children under 16 from using platforms like Facebook, Instagram, TikTok and Snapchat, according to a report by CNN. The law will take effect in early 2025, giving both tech companies and parents time to adjust.
Under the new regulations, social media companies will have one year to implement a robust age verification system. Non-compliance can result in fines of up to A$50 million for repeated violations. These penalties are designed to ensure that platforms take the necessary steps to block access for users under 16 years of age.
Prime Minister Anthony Albanese has expressed strong support for the new law, calling it an important step in protecting children’s mental and emotional health in the digital age. The legislation follows extensive research and recommendations from health experts who have warned about the negative impact of social media on young users. Studies have shown a link between excessive social media use and increased rates of depression, anxiety, and sleep disorders in adolescents.
What are the concerns of technology companies?
Despite the support, the law’s swift passage has been met with criticism. The bill moved through Parliament with limited time for public consultation: a Senate committee inquiry allowed just 24 hours for submissions, and more than 100 submissions were received, many expressing concern about the rushed process. Tech companies, including Meta, TikTok and Snap Inc., have acknowledged the importance of protecting young users but raised concerns about the pace of the law and potential technical challenges.
To comply, platforms are exploring advanced age verification technologies, including facial recognition and digital ID systems. However, such methods have raised concerns about privacy and data security, putting pressure on technology companies to balance user safety and privacy while adapting to the legal requirements.
The historic move is expected to set a precedent for other countries grappling with similar issues surrounding social media and child safety.