27 October 2025 | By Marten Männis
Policy & Regulation
Commission Issues Preliminary Findings of DSA Breaches Against Meta and TikTok
The European Commission has announced preliminary findings that Meta and TikTok are in breach of their obligations under the Digital Services Act (DSA). The findings concern inadequate researcher access to public data at both companies, as well as deficient mechanisms on Meta’s Facebook and Instagram platforms for users to notify illegal content and to challenge content moderation decisions.
These findings are preliminary and do not prejudge the final outcome of the investigations.

Researcher Data Access (Meta and TikTok)
A central finding concerns the DSA obligation to grant researchers adequate access to public platform data. The Commission’s preliminary view is that the procedures implemented by Facebook, Instagram, and TikTok are overly burdensome.
This allegedly results in researchers receiving partial or unreliable data, undermining their ability to conduct independent research into systemic risks, such as the exposure of minors to illegal or harmful content. The Commission noted that researcher access is an essential transparency obligation under the DSA, enabling public scrutiny of the platforms’ societal impact.
This right is enshrined in Article 40(4) of the DSA, which enables vetted researchers to access data for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out in Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures under Article 35.
Notice and Action Mechanisms (Meta)
The Commission’s investigation, conducted in cooperation with Coimisiún na Meán (the Irish Digital Services Coordinator), also found Meta’s platforms deficient in two key areas concerning user protections.
First, neither Facebook nor Instagram appears to provide the user-friendly and easily accessible ‘Notice and Action’ mechanism for flagging illegal content that is required under Article 16 of the DSA. The Commission stated that Meta’s current mechanisms seem to impose unnecessary steps and employ “dark patterns,” or deceptive interface designs, which may dissuade users from submitting notices.
Such practices could render Meta’s mechanisms to flag and remove illegal content ineffective. Under the DSA, ‘Notice and Action’ mechanisms are critical for allowing platforms to be notified of illegal content. A failure to act expeditiously upon such notice can impact the liability exemption platforms benefit from under the DSA.
Content Moderation Appeals (Meta)
Second, the Commission preliminarily found that the appeal mechanisms on both Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals.
The DSA grants EU users the right to effectively challenge content moderation decisions, such as content removal or account suspension. The Commission’s view is that Meta’s current system limits the effectiveness of this appeal right by making it difficult for users to explain why they disagree with a content decision.
Next Steps and Potential Sanctions
Facebook, Instagram, and TikTok now have the opportunity to examine the documents in the Commission’s investigation files and respond in writing to the preliminary findings. The platforms may also take measures to remedy the alleged breaches.
If the Commission’s views are ultimately confirmed, it may issue a non-compliance decision. This can trigger a fine of up to 6% of the total worldwide annual turnover of the provider. The Commission also has the power to impose periodic penalty payments to compel compliance.
In a related development, the Commission noted that the delegated act on data access will come into force on 29 October 2025, which will grant researchers access to non-public data from very large online platforms and search engines to further enhance accountability.
Commenting on the findings, Executive Vice-President Henna Virkkunen stated, “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice… We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society.”
Company Responses
Both Meta and TikTok have issued measured initial responses. Meta disagrees with the preliminary findings entirely, pointing to its overhaul of content reporting and its expanded data access tools over the last few years, and maintains that its current setup is fully compliant with the EU regulatory framework. TikTok, for its part, has highlighted a potential conflict between the data safeguard requirements imposed under the DSA and those set under the GDPR, while also emphasising its substantive investments in meeting the DSA’s researcher access requirements.
Geopolitical Stress and Confusion
Beyond the already challenging and complex scenarios that Very Large Online Platforms introduce, the current American administration has been vehemently opposed to any injunctions or fines levied under the DSA against American multinationals. There have even been reports of Trump weighing sanctions on European officials involved in DSA enforcement. As the transactional nature of the administration reaches a new height with the proposed ballroom project, in which several American technology firms are proudly represented, a DSA fine could well escalate into something more severe and formally challenge the Internal Market and the EU’s ability to regulate its own commerce.