Meta slapped with child safety probe under sweeping EU tech law

Mark Zuckerberg, CEO of Meta testifies before the Senate Judiciary Committee at the Dirksen Senate Office Building on January 31, 2024 in Washington, DC.

Alex Wong | Getty Images

Facebook parent company Meta on Thursday was hit with a major investigation from the European Union into alleged breaches of the bloc’s strict online content law over child safety risks.

The European Commission, the EU’s executive body, said in a statement that it is investigating whether the social media giant’s Facebook and Instagram platforms “may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.”

The Commission added that it is concerned about age verifications on Meta’s platforms, as well as privacy risks linked to the company’s recommendation algorithms.

“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson told CNBC by email.

“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

The Commission said that its decision to initiate an investigation comes on the back of a preliminary analysis of a risk assessment report provided by Meta in September 2023.

Thierry Breton, the EU’s commissioner for internal market, said in a statement that the regulator is “not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms.”

The EU said it will carry out an in-depth investigation into Meta’s child protection measures “as a matter of priority.” The bloc can continue to gather evidence via requests for information, interviews, or inspections.

The initiation of a DSA probe allows the EU to take further enforcement steps, including interim measures and non-compliance decisions, the Commission said. The Commission added it can also consider commitments made by Meta to remedy its concerns.

Meta and fellow U.S. tech giants have increasingly found themselves in the spotlight of EU scrutiny since the introduction of the bloc’s landmark Digital Services Act, a law from the European Commission seeking to tackle harmful content.

Under the EU’s DSA, companies can be fined up to 6% of their global annual revenues for violations. The bloc has yet to issue fines to any tech giants under its new law.

In December 2023, the EU opened infringement proceedings into X, the company previously known as Twitter, over suspected failures to combat disinformation and content manipulation.

The Commission is also investigating Meta over alleged infringements of the DSA related to its handling of election disinformation.

In April, the bloc launched a probe into the firm and said it’s concerned Meta hasn’t done enough to combat disinformation ahead of upcoming European Parliament elections.

The EU is not the only authority taking action against Meta over child safety concerns.

In the U.S., the attorney general of New Mexico is suing the firm over allegations that Facebook and Instagram enabled child sexual abuse, solicitation, and trafficking.

A Meta spokesperson at the time said that the company deploys “sophisticated technology” and takes other preventive steps to root out predators.
