Meta, the parent company of Facebook and Instagram, is under the microscope of the European Union, which is probing possible breaches of its stringent online content laws, specifically around the safety of young users.
The European Commission, the executive arm of the EU, announced this significant investigation on Thursday. Their concerns? That Meta’s platforms might be fueling behavioral addictions in children and leading them down dangerous “rabbit holes.” Adding to the worry are questions about how well Meta verifies the ages of its users and the privacy risks tied to its recommendation algorithms.
Meta isn’t taking these accusations lightly. In a statement, a spokesperson emphasized the company’s commitment to youth safety. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” the spokesperson said, pointing out that this is an industry-wide challenge.
This investigation follows a preliminary risk assessment report submitted by Meta in September 2023. Thierry Breton, the EU’s commissioner for the internal market, voiced his concerns, saying the regulator is “not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms.”
The Commission says its in-depth investigation into Meta’s child protection measures will be carried out as a priority. That means gathering more evidence through information requests, interviews, or inspections. If necessary, the bloc can take further enforcement action, including interim measures and noncompliance decisions, and it may also accept commitments from Meta to address the concerns.
This scrutiny isn’t new for Meta and other U.S. tech giants, who have been in the EU’s spotlight since the Digital Services Act (DSA) came into force. The DSA is a landmark law aimed at tackling harmful content online, and companies found violating it can face fines of up to 6% of their global annual revenues. Although no fines have been issued yet under the new law, the EU’s actions signal that it is serious about enforcement.
Meta is not the only platform facing proceedings: in December 2023, the EU opened a case against X over its failure to combat disinformation. Meta itself is already the subject of a separate probe, launched in April, over concerns about its efforts to fight election-related disinformation ahead of the European Parliament elections.
Child safety issues are also drawing attention across the Atlantic. In the U.S., the attorney general of New Mexico is suing Meta over allegations that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking. Meta has defended itself, claiming it uses “sophisticated technology” to root out predators and takes preventive measures seriously.
The EU’s investigation into Meta underscores the growing pressure on tech giants to ensure the safety of young users and comply with new, stringent regulations. As this story unfolds, it’s clear that the digital landscape is facing a significant shake-up, with far-reaching implications for the industry.
(Source: BBC | TechCrunch)