EU Nears Fine for Meta Over Alleged DSA Child Safety Violations on Facebook, Instagram

It appears that Meta may need to brace for another penalty, this time over child safety concerns.

The European Commission is edging toward potential penalties against Meta following preliminary findings that suggest possible violations of the Digital Services Act (DSA) regarding child safety concerns.

The investigation focuses on Facebook and Instagram, with regulators questioning whether the company is doing enough to prevent underage users from accessing its platforms.

EU Grows Alarmed Over Age Verification Gaps

The investigation centers on Meta's enforcement of its minimum age requirement of 13 years old. EU regulators say children may still be able to bypass age checks during account creation.

According to The Financial Times, officials also pointed out that reporting underage users is overly complex, making enforcement less effective than required under the DSA.

The Commission further criticized Meta's internal risk assessment practices, describing them as inconsistent and insufficient when evaluating potential harms to minors.

Millions of Underage Users Still Access Platforms

According to EU findings, an estimated 10 to 12 percent of children under 13 in the European Union may still be using Facebook or Instagram.

Regulators emphasized that scientific studies consistently show younger users are more vulnerable to mental health and safety risks linked to social media exposure.

As a result, the EU is pushing Meta to strengthen detection tools, improve verification systems, and more aggressively remove accounts belonging to underage users.

Potential Penalties Under the Digital Services Act

If Meta fails to address the concerns, it could face fines of up to six percent of its global annual revenue under the Digital Services Act framework, according to Engadget. However, the company still has the opportunity to respond to the preliminary findings and implement corrective measures before a final ruling is made.

Meta Defends Its Age Safety Systems

Meta has pushed back against the allegations, stating that Facebook and Instagram are strictly intended for users aged 13 and above.

According to the company, it already uses automated systems to detect and remove underage accounts and continues to invest in improved enforcement technologies.

Meta also indicated that additional safety features are expected to be introduced in future updates to further strengthen child protection measures.

The investigation, first launched in 2024, reflects mounting regulatory pressure across Europe to address child safety and social media addiction concerns. It is likely to be a battle Meta will be fighting for years to come.

In other news, Meta AI Kids Chats are now visible to parents and guardians, helping adults monitor what their children are searching for online.

Originally published on Tech Times


© 2026 TECHTIMES.com All rights reserved. Do not reproduce without permission.
