Meta faces EU investigation into child safety risks
By Foo Yun Chee
BRUSSELS – Meta Platforms’ Facebook and Instagram social media platforms will be investigated for possible breaches of EU online content rules relating to child safety, EU regulators said on Thursday, a move that could result in hefty fines.
Tech companies are required to do more to tackle illegal and harmful content on their platforms under the European Union’s landmark Digital Services Act, which came into force last year.
The European Commission said it had decided to open an in-depth investigation into Facebook and Instagram due to concerns they had not adequately addressed risks to children. Meta submitted its risk assessment report in September.
“The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, could stimulate behavioral addictions in children, as well as create a so-called ‘rabbit hole effect,’” the EU executive said in a statement.
“In addition, the Commission is concerned about the age assurance and verification methods offered by Meta.” Regulators’ concerns relate to children accessing inappropriate content.
Meta said it already has a number of online tools available to protect children.
“We want young people to have safe, age-appropriate online experiences and have spent a decade developing more than 50 tools and policies designed to protect them,” said a Meta spokesperson.
“This is a challenge facing the entire industry and we look forward to sharing details of our work with the European Commission.”
Meta has been on the EU’s radar for election disinformation, a key concern ahead of next month’s crucial European Parliament elections. Violating the DSA can result in fines of up to 6% of a company’s annual global turnover.