Digital Policy
The Digital Services Act: A New Era for Online Platforms and Search Engines
On 25 April 2023, the European Commission adopted the first designation decisions under the Digital Services Act (DSA), designating 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs) that reach at least 45 million monthly active users in the EU. This marks the beginning of a new era for these service providers: they must now comply with a comprehensive set of obligations aimed at empowering and protecting users online, including minors, through which the Commission aims to ensure greater transparency and accountability in the digital realm.
Designated Platforms and Search Engines
The designated VLOPs and VLOSEs are:
VLOPs:
- Alibaba AliExpress
- Amazon Store
- Apple AppStore
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Twitter
- Wikipedia
- YouTube
- Zalando
VLOSEs:
- Bing
- Google Search
New Obligations Under the DSA
The designated platforms and search engines must comply with the full set of obligations under the DSA within four months. These obligations include:
- More user empowerment: Platforms must give users clear information on why particular content is recommended to them, the right to opt out of recommendation systems based on profiling, easy means to report illegal content, and protection from advertisements targeted on the basis of sensitive personal data.
- Strong protection of minors: Platforms must redesign their systems to ensure high levels of privacy, security, and safety for minors, prohibit targeted advertising to children based on profiling, and conduct risk assessments that cover potential negative effects on mental health.
- More diligent content moderation, less disinformation: Platforms must take measures to address the dissemination of illegal content online and the negative effects on freedom of expression and information, enforce clear terms and conditions, implement mechanisms for users to flag illegal content, and establish mitigation measures to combat disinformation and inauthentic use of their services.
- More transparency and accountability: Platforms must ensure their risk assessments and compliance with DSA obligations are externally and independently audited, provide access to publicly available data to researchers, publish repositories of all ads served on their interface, and publish transparency reports on content moderation decisions and risk management.
Risk Assessment and Independent Audit
Designated platforms and search engines must also identify, analyze, and mitigate systemic risks ranging from the amplification of illegal content and disinformation to the impact on freedom of expression and media freedom. They must also address specific risks related to gender-based violence online and the protection of minors and their mental health. These risk mitigation plans will be subject to an independent audit and oversight by the European Commission.
A New Supervisory Architecture
The DSA will be enforced through a pan-European supervisory architecture, with the Commission acting as the competent authority for supervising designated platforms and search engines. This will be achieved through close cooperation with the Digital Services Coordinators in the supervisory framework established by the DSA. These national authorities, which will also supervise smaller platforms and search engines, must be established by EU Member States by 17 February 2024.
European Centre for Algorithmic Transparency (ECAT) and Access to Data for Researchers
The Commission is bolstering its expertise with the recent launch of the European Centre for Algorithmic Transparency (ECAT), which will provide support in assessing whether algorithmic systems align with risk management obligations. It is also creating a digital enforcement ecosystem, gathering expertise from various relevant sectors.
A call for evidence was launched on the provisions in the DSA related to data access for researchers, aimed at better monitoring platform providers’ actions to tackle illegal content, such as illegal hate speech, as well as other societal risks such as the spread of disinformation and risks that may affect users’ mental health. Vetted researchers will have the opportunity to access the data of any VLOP or VLOSE to conduct research on systemic risks in the EU. This access will allow them to analyze platforms’ decisions on what users see and engage with online, granting access to previously undisclosed data.
In response to the feedback received, the Commission will present a delegated act establishing an easy, practical, and clear process for data access that includes adequate safeguards against abuse. The consultation runs until 25 May 2023.
Background and Broader Implications
On 15 December 2020, the Commission proposed the DSA along with the Digital Markets Act (DMA) to ensure a safer, fairer digital space for all. The DSA, which entered into force on 16 November 2022, applies to all digital services connecting consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harm and counter risks online, introduces strong protections for users’ rights online, and places digital platforms under a unique new transparency and accountability framework.
Designed as a single, uniform set of rules for the EU, these rules will provide users with new protections and businesses with legal certainty across the whole single market. The DSA is a first-of-its-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries.
The Digital Services Act signifies a major shift in the regulatory landscape for online platforms and search engines. With its emphasis on user empowerment, protection of minors, content moderation, transparency, and accountability, the DSA sets a global standard for the regulation of online intermediaries. As designated platforms and search engines adapt to these new obligations, the digital space is expected to become safer and more equitable for all users. The DSA’s pan-European supervisory architecture, bolstered by the European Centre for Algorithmic Transparency (ECAT) and the digital enforcement ecosystem, aims to ensure that these platforms and search engines are held to the highest standard in compliance and risk management, setting the stage for a more responsible and transparent digital future.
With the designation of VLOPs and VLOSEs, we can expect the European Commission to take additional steps soon. Additional VLOPs and VLOSEs are likely to be designated in the future, extending the obligations and protections of the DSA to a wider range of digital services. Furthermore, as new technologies and digital trends emerge, the Commission will likely need to address new challenges and potential risks. For example, the increasing prevalence of artificial intelligence (AI) and machine learning in digital services may raise concerns about algorithmic bias, transparency, and accountability. The Commission may develop guidelines or regulations to ensure that AI-driven platforms adhere to the principles of fairness, transparency, and human rights.
In addition, given that the ultimate goal of these policy initiatives is to empower European citizens to navigate the digital landscape safely and responsibly, the Commission may also invest further in digital literacy and education initiatives, equipping EU citizens with the skills and knowledge needed to recognize and avoid risks such as disinformation, cyberbullying, and online scams. These initiatives could target both young people and adults, ensuring that all EU citizens can benefit from the digital single market while remaining safe online.