EU finds Meta violates digital rules by not doing enough to keep children off Instagram and Facebook

The European Commission has flagged Meta for breaching the Digital Services Act (DSA), accusing the tech giant of doing too little to keep children under 13 off Instagram and Facebook. The preliminary findings, announced on April 29, center on the failure of Meta’s age verification systems to enforce the platforms’ age restrictions effectively, leaving younger users exposed to potential online risks. According to the Commission, Meta’s practices violate the DSA, a landmark regulation that holds digital platforms accountable for content moderation and user protection.

Age Enforcement Challenges and Scientific Evidence

Meta’s own terms of service set 13 as the minimum age for both platforms, yet the Commission argues the company’s methods for enforcing this rule are flawed. Children can bypass the age check simply by entering a false birth date during registration, and Meta has no robust system in place to confirm that the information provided is genuine. This loophole allows underage users to access the platforms with minimal effort. The Commission further noted that internal data suggests roughly 10–12% of children under 13 are active on Instagram and Facebook, a figure it says contradicts Meta’s own assessments.

Regulators emphasized that Meta’s failure to address the issue feeds broader concerns about the impact of social media on young users. The Commission pointed to scientific studies showing that younger children are particularly susceptible to the effects of platforms like Instagram and Facebook, including exposure to harmful content, cyberbullying, and addictive design. The Commission accuses Meta of disregarding this evidence and underestimating the risks its platforms pose to children, even as the company continues to operate without significant changes to its approach.

Meta’s Defense and Future Plans

Meta has contested the Commission’s findings, stating in a written response that it disagrees with the preliminary conclusions. “Instagram and Facebook are designed for users aged 13 and above, and we have measures in place to detect and remove accounts from individuals under that age,” a Meta spokesperson said in a statement. The company reiterated its commitment to improving age verification, highlighting ongoing investments in technologies to identify and block underage users. The spokesperson acknowledged, however, that determining a user’s age remains an industry-wide challenge that requires collaborative solutions.

Meta also hinted at future initiatives, promising to share additional details next week about “new measures rolling out soon.” While the spokesperson did not specify the nature of these updates, they stressed that the company is actively working with the European Commission to address the challenges highlighted in the investigation. This response comes as the EU continues to scrutinize the role of digital platforms in safeguarding children’s online experiences.

Broad Implications for EU Social Media Regulation

The Commission’s findings have sparked discussion about stricter regulation across the EU. Several member states are weighing a blanket ban on social media for children under 15, a move that would require companies like Meta to adopt far more rigorous age verification. The effectiveness of any such measure, however, hinges on the ability to determine a user’s age accurately, something Meta’s current systems struggle to do.

European Commission President Ursula von der Leyen has been vocal about the need for stronger protections for children online. In April, she told social media platforms that “there are no more excuses” for not implementing measures to safeguard young users. At the time, she announced that the EU’s own age-verification app is technically ready for deployment, though no specific timeline was given for its rollout. This app is intended to provide a more reliable method for confirming users’ ages, potentially reducing the number of underage accounts on platforms like Instagram and Facebook.

Regulators are now urging Meta to revise its risk assessment methodology and significantly enhance its efforts to prevent, detect, and remove underage users. The Commission has granted Meta the right to review the investigation files and submit a written rebuttal. If the findings are confirmed, the EU could issue a formal non-compliance decision, accompanied by a fine of up to 6% of Meta’s global annual revenue. This penalty could amount to billions of euros, depending on the company’s financial performance.

The Road Ahead for Meta and the EU

The ongoing dispute underscores the tension between tech companies and regulators in balancing innovation with user safety. While Meta maintains that it has taken steps to protect children, the Commission’s report suggests these measures are inadequate. The situation also reflects a growing consensus among EU policymakers that age verification is a critical component of digital platform accountability.

As the EU moves forward with enforcement actions, the case against Meta serves as a cautionary example for other tech giants. The Commission’s ability to impose fines under the DSA signals a shift toward more proactive oversight of digital services, particularly those with a significant presence among younger users. This development may also accelerate the adoption of standardized age-verification tools across the region, ensuring a more uniform approach to protecting children’s online rights.

Despite the criticism, Meta remains optimistic about its ability to adapt. The spokesperson noted that the company is continuously refining its strategies to align with EU regulations, framing age verification as a shared industry challenge rather than the failure of any single company. That framing may influence future negotiations between the Commission and Meta, as well as shape the broader regulatory landscape for digital platforms in the EU.

The case has also reignited debates about the role of social media in children’s lives. Advocacy groups and parents are calling for stricter limits on screen time and content exposure, while tech companies defend their platforms as essential tools for communication and creativity. As the EU prepares to finalize its approach, the outcome of this investigation could set a precedent for how digital services are held accountable for their impact on young users.

Mark Smith

Mark Smith is an endpoint security specialist with deep knowledge of malware analysis, ransomware defense, and antivirus technologies. He has analyzed various attack vectors affecting Windows, Linux, and cloud endpoints. On CyberSecArmor, Mark publishes technical breakdowns of malware trends, endpoint detection and response (EDR), and proactive defense mechanisms.
