Meta faces EU probe over child safety risks on Facebook and Instagram

Meta Platforms’ social media apps Facebook and Instagram are under investigation by EU regulators for potential breaches of online content rules related to child safety. The probe, announced on Thursday, comes as part of the European Union’s efforts to hold tech companies accountable for tackling illegal and harmful content under the Digital Services Act (DSA), implemented last year.

The investigation into Meta’s platforms unlocks additional investigatory powers for EU enforcers, allowing them to conduct office inspections and apply interim measures. If breaches are confirmed, Meta could face penalties of up to 6% of its global annual turnover.

Facebook and Instagram, designated as very large online platforms (VLOPs) under the DSA, are subject to an extra set of rules that require Meta to assess and mitigate systemic risks, particularly concerning minors’ mental health. The European Commission’s decision to launch an in-depth investigation was prompted by concerns that Meta had not adequately addressed risks to children, despite submitting a risk assessment report in September.

A key focus of the investigation is the effectiveness of Meta’s age gates and other measures intended to prevent underage users from accessing the platforms.

The Commission also expressed concern that Facebook and Instagram’s systems, including their recommendation algorithms, may stimulate behavioral addictions in children and create “rabbit-hole effects” that lead them to inappropriate content. In addition, the regulator questioned the adequacy of Meta’s age-assurance and verification methods.

Written by Maya Robertson
