The European Union has accused Meta Platforms of failing to uphold its legal obligations to protect underage users, announcing preliminary findings that the tech giant is in breach of the bloc's Digital Services Act (DSA). The investigation, which has been ongoing for nearly two years, centers on Meta's social media platforms, Facebook and Instagram, and specifically addresses the company's alleged shortcomings in preventing children under the age of 13 from accessing its services. The European Commission stated that Meta has not implemented effective measures to enforce its own minimum-age rule of 13 for these platforms, as reported by Reuters and The Guardian. This development could lead to substantial fines for Meta, potentially reaching up to 6% of its global annual revenue.
Background & Context
The Digital Services Act (DSA) is a landmark piece of legislation by the European Union designed to create a safer digital space for users by holding online platforms accountable for the content they host and the systems they employ. The act mandates that platforms diligently identify and mitigate risks, particularly those concerning vulnerable users such as minors. This investigation into Meta is part of a broader trend of increased regulatory oversight by the EU on major technology companies, aiming to ensure they comply with stringent standards for user safety, data protection, and content moderation. Several European countries are also considering or implementing stricter social media regulations for younger users, indicating a continent-wide push for enhanced child protection online.
Key Details
According to the European Commission's preliminary findings, children can easily create accounts on Facebook and Instagram by providing a false date of birth, with Meta lacking effective controls to verify their age. The commission also noted that Meta's risk assessment processes inadequately evaluate the potential for minors under 13 to be exposed to age-inappropriate content and experiences on its platforms. Furthermore, the tool Meta provides for reporting underage users was deemed difficult to use and ineffective, allowing underage individuals to continue using the services. Meta has stated that it disagrees with the preliminary findings, asserting that it has measures in place to detect and remove underage users and will be rolling out additional measures soon. The company described age verification as an "industry-wide challenge" requiring an "industry-wide solution."
What This Means
This preliminary ruling marks a critical juncture in the EU's enforcement of the DSA and in Meta's accountability for child safety online. For Meta, the potential financial penalties are significant, given its substantial global revenue. Beyond the fines, the ruling could compel Meta to fundamentally re-evaluate and overhaul its age verification and content moderation systems across its platforms. For parents and child safety advocates, it validates long-standing concerns about young children's exposure to potentially harmful online environments. The ruling also sets a precedent for how the DSA will be applied to other major tech platforms operating within the EU, potentially prompting similar investigations and stricter compliance demands across the industry. The human impact is direct: the measures aim to shield children from content and interactions that could harm their development and well-being.
What to Watch Next
Meta will now have the opportunity to examine the Commission's investigation file and present a defense, after which the European Commission will issue a final decision. If the preliminary assessment is confirmed, Meta faces the possibility of substantial fines and increased pressure to implement more robust and effective child protection measures. Industry observers will be closely watching Meta's response and any new measures it announces, as well as how other social media companies adapt their policies in light of this regulatory scrutiny. The EU's continued focus on enforcing the DSA suggests that 2026 will be a pivotal year for digital policy and platform accountability in Europe.