15 May 2025 | By Marten Männis
DMA/DSA
TikTok Under Heavy Scrutiny in the EU
The European Commission has escalated its regulatory scrutiny of TikTok, with multiple investigations now converging under the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR). What began as concern over data privacy has expanded into a broad inquiry into algorithmic manipulation, advertising transparency, electoral interference, and systemic platform risks. The series of formal proceedings against the Chinese-owned app marks one of the first real tests of modern platform technology colliding with geopolitics, as the EU grows more resolute in imposing sovereignty and accountability on digital platforms operating within its borders.

Advertising Transparency: A DSA Benchmark Test
On 15 May 2025, the Commission issued preliminary findings declaring TikTok in breach of its DSA obligation to provide a functional, transparent advertisement repository. The repository is meant to allow public scrutiny of advertising campaigns, enabling researchers and civil society to detect scams, influence operations, and election interference. TikTok, however, failed to disclose critical information: ad content, audience targeting parameters, and sponsors. Even more concerning, the repository’s search functionality is inadequate, hampering meaningful oversight.
This failure is not merely procedural. As Commission Executive Vice-President for Tech Sovereignty, Security and Democracy Henna Virkkunen put it, “Transparency in online advertising — who pays and how audiences are targeted — is essential to safeguarding the public interest.” TikTok’s non-compliance obstructs the EU’s ability to detect manipulation attempts, including hybrid threats and false advertising campaigns, which have intensified markedly across Europe over the last decade and have proved effective in several Member States. If confirmed, the breach could result in a fine of up to 6% of ByteDance’s global annual turnover, alongside periodic penalties and enhanced monitoring.
Repeated Failures in Protecting Minors and Algorithmic Design
This is not TikTok’s first brush with EU regulators. Back in February 2024, the Commission launched its first formal DSA investigation into TikTok, identifying a wider array of concerns. Chief among them were the app’s addictive algorithmic design, the so-called “rabbit hole” effect, and the inadequacy of its age verification systems. The Commission flagged insufficient protections for minors and raised doubts about the platform’s default privacy settings for underage users.
These concerns reinforce wider criticisms that TikTok’s interface encourages compulsive usage and fails to implement meaningful controls on exposure to harmful content. In short, the platform’s entire architecture may be structurally misaligned with the regulatory mandates of the DSA. As the Commission continues to probe these areas, TikTok’s position grows more precarious, especially given the scope and importance of these child safety provisions in the EU’s regulatory agenda.
€530 Million Fine Over Data Transfers to China
While the DSA investigations continue, TikTok has already been hit with a massive €530 million fine by Ireland’s Data Protection Commission (DPC), marking one of the highest GDPR penalties to date. The DPC found that TikTok unlawfully transferred European user data to China and failed to meet transparency requirements regarding these transfers. Despite TikTok’s protests — including claims that no data was accessed by Chinese authorities — the DPC’s report noted that data had been stored on Chinese servers and that TikTok initially misled regulators about this fact.
This ruling has broader implications. As TikTok stated, “This decision has implications not just for TikTok, but for any company in Europe operating globally.” It should be stressed, however, that the company provided misleading information to investigators about its data-handling processes and quite possibly has still not fully communicated its relevant procedures to the Irish authorities. Given this conduct, TikTok’s efforts to bolster its privacy credentials, notably via its €12 billion “Project Clover” initiative, have not been sufficient to avoid regulatory sanction.
Threats to Electoral Integrity: The Romanian Precedent
In parallel, the Commission has launched another formal DSA proceeding targeting TikTok’s role in election interference. Following signs of foreign manipulation during Romania’s 2024 presidential election, the Commission is now investigating TikTok’s recommender systems and its political ad policies. The inquiry seeks to determine whether TikTok took adequate steps to mitigate risks tied to inauthentic behaviour and regional manipulation of electoral content.
This investigation is underpinned by declassified intelligence reports and internal TikTok documents obtained by the Commission. With national elections ongoing across the EU in 2024–2025, the issue of electoral risk is viewed as existential. Commission President Ursula von der Leyen’s remarks were pointed: “We must protect our democracies from any kind of foreign interference… it should be crystal clear that in the EU, all online platforms, including TikTok, must be held accountable.”
The Commission’s election integrity investigation could result in further legal consequences if breaches are confirmed. The proceedings are also testing a new DSA tool: a retention order issued in December 2024, requiring TikTok to preserve data related to potential risks to democratic processes. This marks a significant expansion of the Commission’s enforcement toolkit, signalling a shift toward preemptive regulatory oversight rather than post-facto sanctions.
Pattern of Non-Compliance by Non-European Technology Companies
These developments portray a platform that has consistently fallen short on transparency, user protection, and regulatory cooperation. The European Commission and other authorities may well read this as a coherent pattern: failure to ensure algorithmic accountability, reluctance to disclose ad-targeting practices, and disregard for the geopolitical implications of data transfers.
For EU regulators, TikTok has become the de facto proving ground for enforcing digital sovereignty. The DSA, still in its early enforcement phase, is designed precisely to curb the asymmetry between powerful online platforms and the public interest. The multiple open proceedings, together with whistleblower channels and new supervisory powers, could give the European Union a rejuvenated regulatory posture. TikTok’s future in the European market, both in terms of operational structure and legal exposure, may hinge on how it responds to these ongoing probes. The real question is not whether the EU will succeed in regulating platforms like TikTok, but whether TikTok is willing or able to operate under rules that demand structural transparency, algorithmic accountability, and democratic integrity.
A Critical Juncture for the EU
These developments also come at a critical point, as the United States grows increasingly unpredictable. The U.S. President recently signed an executive order directing pharmaceutical companies to lower their prices in the United States, while blaming the EU for American drug and medicine costs. This has added to pressure from company executives on the EU to make swift and radical regulatory changes to prevent businesses from relocating to the U.S. Given the current dominance of American tech companies in both B2C and B2B digital services, from ad delivery and communications to office software, cloud storage, and cloud services, the EU’s digital regulatory package could become a tool for countering further measures from the current American administration, as it seems unlikely that Europe will be treated as a proper ally any time soon.