According to science.org, the European Commission announced on October 24 that its investigation preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the Digital Services Act (DSA). The finding follows similar preliminary action against X in July 2024 and comes after researchers like Philipp Lorenz-Spreen of Dresden University of Technology encountered systematic barriers when attempting to study political communication across European platforms. Meta’s data access was limited to accounts with over 25,000 followers, while TikTok provided data with significant gaps that made it unreliable for research. The Commission can impose billion-dollar fines if companies fail to comply after defense opportunities, marking a critical test of the DSA’s enforcement mechanisms. This regulatory action signals a major escalation in Europe’s effort to force transparency from social media giants.
The Data Transparency Battlefield
The core conflict here represents a fundamental tension between platform sovereignty and public accountability. Social media companies have historically treated their data as proprietary assets essential to their competitive advantage and algorithmic secrecy. The DSA’s research provisions directly challenge this paradigm by asserting that when platforms become critical infrastructure for public discourse, they must submit to independent scrutiny. What makes this particularly contentious is that researchers aren’t just seeking content data—they’re investigating systemic risks like election interference, public health misinformation, and algorithmic amplification of harmful content. The platforms’ resistance suggests they understand this isn’t merely about data sharing but about exposing the fundamental mechanics of their attention economies.
The Global Implications
This European action arrives amid a starkly different regulatory landscape in the United States, where Republican lawmakers have characterized the DSA as “digital censorship” and actively lobbied against it. The contrast could hardly be more dramatic: Europe is building an enforcement framework for platform transparency while U.S. political forces are attacking the very concept of misinformation research. This creates a precarious situation for global platforms like Meta and TikTok, which must navigate fundamentally incompatible regulatory expectations across their major markets. The outcome of these European enforcement actions will likely establish precedent that influences digital governance approaches in other regions considering similar legislation.
The Research Access Dilemma
Platforms are employing sophisticated obstruction tactics that reveal their strategic approach to compliance. The “circle of hell” described by researchers—endless questioning loops, arbitrary rejections, and technically compliant but practically useless data access—represents a calculated resistance strategy. These companies understand that outright defiance would trigger immediate penalties, so they have developed methods that maintain the appearance of cooperation while effectively preventing meaningful research. The limitations are particularly shrewd: by restricting access to accounts with more than 25,000 followers, Meta effectively excludes study of local politicians and emerging movements, where some of the most concerning disinformation patterns occur. Similarly, providing incomplete datasets forces researchers to question whether observed patterns reflect platform reality or artifacts of the data itself.
The AI Complication
The timing of this crackdown coincides with another critical development: platforms’ growing protectiveness around data due to AI competition. As noted in the investigation, companies increasingly fear their data could be used to train competing large language models, creating additional justification for restricting access. This represents a perfect storm for transparency advocates—platforms now have both competitive and regulatory reasons to limit data sharing. The European Commission’s willingness to confront this combined resistance demonstrates remarkable regulatory courage, but it also suggests we’re entering a new phase of platform-governance conflict where data access battles will become increasingly central to digital market regulation.
What Comes Next
The October 29 expansion of DSA provisions to include non-public data represents the next frontier in this conflict. When researchers can request information about what content individual users actually see—not just public posts—they’ll gain unprecedented insight into algorithmic amplification and personalization. This moves beyond studying what people post to understanding how platforms shape user experiences and worldviews. The Commission’s enforcement actions against Meta and TikTok, combined with the ongoing case against X, suggest Europe is preparing to make examples of major platforms to establish the DSA’s credibility. The billion-dollar fine potential gives these proceedings serious teeth, but the real test will be whether the Commission follows through with penalties that actually change platform behavior rather than becoming just another cost of business.
