EU Finds Meta and TikTok in Violation of DSA, Risking Huge Fines

The EU Commission has preliminarily ruled that Meta’s Facebook and Instagram, along with TikTok, violated the Digital Services Act through ineffective illegal-content reporting systems and insufficient data access for researchers. The findings could lead to fines of up to 6% of global annual revenue, signaling stricter tech oversight in Europe.
Written by Emma Rogers

BRUSSELS—In a significant escalation of regulatory scrutiny, the European Commission has preliminarily determined that Meta Platforms Inc. and ByteDance Ltd.’s TikTok violated key provisions of the Digital Services Act (DSA), the bloc’s landmark legislation aimed at curbing online harms. The findings, announced on Friday, highlight alleged shortcomings in how these tech giants handle illegal content reporting and provide data access to researchers, potentially exposing them to fines up to 6% of their global annual revenue.

The Commission’s probe, initiated earlier this year, zeros in on Meta’s Facebook and Instagram platforms, as well as TikTok, accusing them of maintaining “ineffective” systems for users to flag illicit material. Investigators argue that these systems impose unnecessary hurdles, such as complex interfaces or inadequate follow-up mechanisms, which deter users from reporting violations like hate speech or disinformation.

This regulatory hammer comes amid growing pressure on the EU to enforce its tech rulebook rigorously, especially as platforms grapple with a surge in harmful content amplified by algorithms. Industry experts note that the DSA, which has applied to the largest platforms since 2023, requires very large online platforms (VLOPs) such as Facebook, Instagram, and TikTok to implement transparent moderation tools, yet compliance has been uneven, with critics pointing to profit-driven designs that prioritize engagement over safety. The Commission’s preliminary view underscores a broader push for accountability, potentially setting precedents for how global tech firms operate in Europe.

Meta, which reported roughly $134 billion in revenue in 2023, could face billions in penalties if the charges stick. The company has defended its practices, stating in a response that it invests heavily in content moderation, including AI-driven tools and human reviewers. However, the EU’s assessment, detailed in a statement, criticizes Meta for not granting researchers sufficient access to public data, a requirement under the DSA intended to enable independent studies of systemic risks like misinformation.
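For a rough sense of scale, 6% of that $134 billion figure works out to roughly $8 billion, though any eventual penalty would depend on the Commission’s final decision and could be substantially lower.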

TikTok, owned by China’s ByteDance, faces similar allegations. The platform, popular among younger users, is faulted for opaque advertising systems and inadequate researcher data access, which the Commission says hinder efforts to study content reaching minors. According to reporting from POLITICO, this move reflects mounting EU frustration with tech firms’ self-regulation, especially as elections loom and disinformation threats intensify.

The investigation turns on the DSA’s tiered framework: the law classifies platforms by user scale, imposing stricter obligations on those with more than 45 million monthly active users in the EU. For Meta and TikTok, this means mandatory risk assessments and annual audits, areas where the Commission identifies gaps. Insiders familiar with the proceedings suggest that the focus on data transparency is pivotal, as it empowers academics and watchdogs to scrutinize algorithmic biases without relying on company goodwill, a shift from the pre-DSA era of voluntary disclosures.

The companies now have until November to respond to these preliminary findings, after which the Commission could formalize charges or adjust its stance. This isn’t Meta’s first clash with Brussels; earlier inquiries targeted its “pay or consent” ad model, as covered by Reuters. TikTok, meanwhile, has been under fire for child safety lapses, with the EU previously probing its recommender systems.

Broader implications ripple through the tech sector. Rivals like X (formerly Twitter) and Google are watching closely, as similar probes could follow. Analysts at firms like Forrester predict that sustained enforcement might force platforms to overhaul user interfaces and data-sharing protocols, potentially increasing operational costs by 10-15% in Europe.

At the heart of these charges lies a tension between innovation and regulation: while Meta and TikTok argue that overly prescriptive rules stifle growth, EU officials counter that unchecked platforms exacerbate societal harms, from election interference to mental health issues among youth. This case exemplifies the bloc’s ambition to lead global tech governance, influencing standards beyond its borders and prompting U.S. lawmakers to weigh analogous measures such as the proposed Kids Online Safety Act.

If upheld, the fines could total in the billions, but more critically, they signal a new era of proactive oversight. As one Brussels-based lobbyist put it anonymously, “This is the EU flexing its muscles—tech giants can’t just pay lip service anymore.” The outcome may reshape how social media operates worldwide, balancing free expression with robust safeguards against illegal content.
