Transforming Digital Accountability in Europe
The European Union's Digital Services Act represents a watershed moment in digital regulation, establishing comprehensive rules for how online platforms operate within the bloc. It entered into force in November 2022 and became fully applicable in February 2024, introducing rules for the online services Europeans use in their everyday lives, including marketplaces, social media networks, app stores, and online travel and accommodation platforms.
Understanding the Digital Services Act
The DSA modernizes the Electronic Commerce Directive of 2000, creating a unified framework for digital services accountability across all 27 EU member states. Its main goal is to create a digital space that respects citizens' and consumers' fundamental rights while enabling smaller platforms and startups to scale up within Europe.
The regulation applies to all digital intermediary services connecting users to content, products, and services in the EU single market, regardless of where the companies are based. This extraterritorial reach means that American, Asian, and other non-EU tech companies must comply if they serve European users.
A Tiered Regulatory Approach
One of the DSA's defining features is its proportional, risk-based framework that scales obligations according to service size and societal impact.
All Intermediary Services must meet basic requirements including establishing procedures for handling takedown notices, informing users about content decisions, and addressing complaints. They must also publish annual transparency reports and appoint points of contact for authorities.
Online Platforms face enhanced duties beyond the baseline requirements, including more detailed reporting obligations and stronger content moderation procedures.
Very Large Online Platforms and Search Engines (VLOPs and VLOSEs) encounter the most stringent requirements. These are services with more than 45 million average monthly active users in the EU, corresponding to roughly 10% of the EU population. The European Commission has designated 25 services as VLOPs or VLOSEs as of late 2024, including major names like Facebook, Instagram, TikTok, X (formerly Twitter), Amazon, AliExpress, Shein, Temu, and several adult content platforms.
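To make the tiering logic concrete, here is a minimal sketch of how a service might be sorted into these obligation levels. Only the 45 million user threshold comes from the regulation itself; the class, names, and tier descriptions are simplified assumptions for illustration.

```python
from dataclasses import dataclass

# Illustrative constants: the 45 million figure is the DSA's designation
# threshold for very large platforms; everything else here is simplified.
VLOP_THRESHOLD = 45_000_000  # roughly 10% of the EU population


@dataclass
class Service:
    name: str
    is_online_platform: bool       # hosts user content and disseminates it to the public
    monthly_active_eu_users: int


def dsa_tier(service: Service) -> str:
    """Return a simplified DSA obligation tier for the given service."""
    if service.is_online_platform and service.monthly_active_eu_users >= VLOP_THRESHOLD:
        return "VLOP/VLOSE: risk assessments, independent audits, recommender opt-out"
    if service.is_online_platform:
        return "Online platform: baseline duties plus enhanced reporting and moderation"
    return "Intermediary service: notice handling, transparency reports, contact points"


print(dsa_tier(Service("ExampleVideoApp", is_online_platform=True, monthly_active_eu_users=60_000_000)))
```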
Key Obligations for Large Platforms
The largest platforms face unique responsibilities that reflect their outsized influence on public discourse and commerce. They must conduct comprehensive risk assessments identifying potential harms, including the dissemination of illegal content; negative effects on fundamental rights such as freedom of expression and on media freedom and pluralism; threats to public security and electoral processes; and risks relating to gender-based violence, public health, the protection of minors, and users' physical and mental wellbeing.
Once risks are identified, platforms must implement concrete mitigation measures subject to independent audits. They must also provide users with the option to experience their services without algorithmic recommendations, giving people more control over their digital environment.
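As an illustration of what such a recommender opt-out might look like in practice, the sketch below ranks a feed either by personalised interest scores or, when the user declines profiling-based recommendations, by recency alone. The function and field names are hypothetical and not drawn from any platform's actual implementation.

```python
from datetime import datetime
from typing import Dict, List


def rank_feed(posts: List[Dict], use_profiling: bool,
              interest_scores: Dict[str, float]) -> List[Dict]:
    """Rank a feed with or without profiling-based personalisation (illustrative)."""
    if not use_profiling:
        # Opt-out path: no ranking driven by the user's personal data,
        # just reverse-chronological order.
        return sorted(posts, key=lambda p: p["published"], reverse=True)
    # Default path: order by how strongly each topic matches the user's profile.
    return sorted(posts, key=lambda p: interest_scores.get(p["topic"], 0.0), reverse=True)


posts = [
    {"id": 1, "topic": "sports", "published": datetime(2024, 5, 1)},
    {"id": 2, "topic": "news", "published": datetime(2024, 5, 3)},
]
print([p["id"] for p in rank_feed(posts, use_profiling=False, interest_scores={"sports": 0.9})])
# -> [2, 1]: newest first when profiling is switched off
```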
Platforms accessible to minors are prohibited from showing them advertising targeted on the basis of profiling that uses their personal data, a significant protection for young users in an era of pervasive data collection.
Transparency as a Core Principle
Transparency runs throughout the DSA as a fundamental mechanism for accountability. The European Commission has launched the DSA Transparency Database, which tracks content moderation decisions across platforms. From 17 February 2024, all providers of online platforms, with the exception of micro and small enterprises, must submit data on their content moderation decisions.
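For a sense of what reporting a moderation decision could involve, here is a minimal sketch of assembling a "statement of reasons" record and submitting it over HTTP. The endpoint URL, field names, and authentication shown are placeholder assumptions for illustration; the actual Transparency Database defines its own API and schema.

```python
import json
import urllib.request

# Placeholder endpoint; the real Transparency Database publishes its own API.
API_URL = "https://example.invalid/dsa/statements-of-reasons"

statement = {
    "decision_visibility": "CONTENT_REMOVED",   # what happened to the content
    "decision_ground": "ILLEGAL_CONTENT",       # illegal content vs. terms-of-service breach
    "category": "HATE_SPEECH",                  # illustrative category label
    "automated_detection": False,               # flagged and reviewed by a human
    "territorial_scope": ["EU"],
    "content_date": "2024-03-10",
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(statement).encode("utf-8"),
    headers={"Content-Type": "application/json", "Authorization": "Bearer <token>"},
    method="POST",
)
# urllib.request.urlopen(request)  # not executed here: the endpoint above is a placeholder
```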
Large platforms must disclose information about their algorithmic systems, provide details about their advertising practices, and report on the linguistic expertise of their content moderation teams. This granular transparency aims to enable meaningful oversight by regulators, researchers, and civil society.
The law also grants researchers legally guaranteed access to platform data, creating pathways for independent scrutiny of how these systems function and affect society. This represents a significant departure from the previous era when platforms controlled data access entirely on their own terms.
Content Moderation and User Rights
The DSA establishes clear procedures for handling illegal content while protecting free expression. It creates what is known as a notice-and-action regime, requiring platforms to act on reports of content that violates EU or member state law. Crucially, intermediary service providers are under no general obligation to monitor content, and they remain generally exempt from liability even where they voluntarily take steps to detect, identify or remove illegal content.
Users gain new rights when their content is restricted. Platforms must provide detailed explanations when they block accounts or remove content, and users can challenge these decisions through appeals processes. National Digital Services Coordinators also award "trusted flagger" status to expert organisations whose notices of illegal content platforms must handle with priority.
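To tie the notice-and-action and redress steps together, the following sketch walks a single report through receipt, assessment, action, and the statement of reasons a user could later appeal against. All class and function names are hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Notice:
    content_id: str
    reporter: str
    alleged_violation: str
    from_trusted_flagger: bool = False   # trusted-flagger notices get priority handling


@dataclass
class ModerationOutcome:
    content_id: str
    action: str                  # e.g. "removed" or "no_action"
    statement_of_reasons: str    # the explanation owed to the affected user
    appeals: List[str] = field(default_factory=list)


def handle_notice(notice: Notice, is_illegal: Callable[[Notice], bool]) -> ModerationOutcome:
    """Illustrative notice-and-action flow: assess the report, act if warranted,
    and record the explanation the user can challenge later."""
    if is_illegal(notice):
        return ModerationOutcome(notice.content_id, "removed",
                                 f"Removed following a notice alleging: {notice.alleged_violation}")
    return ModerationOutcome(notice.content_id, "no_action",
                             "Reviewed; the reported content was not found to be illegal")


outcome = handle_notice(
    Notice("post-42", "user-7", "counterfeit goods listing", from_trusted_flagger=True),
    is_illegal=lambda n: "counterfeit" in n.alleged_violation,
)
outcome.appeals.append("user contested the decision via the internal complaint system")
print(outcome.action, "|", outcome.statement_of_reasons)
```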
Enforcement and Penalties
The DSA creates a novel enforcement architecture. Each EU member state has designated a Digital Services Coordinator (DSC) responsible for supervising smaller platforms based in their territory. Ireland's Coimisiún na Meán serves as the Irish DSC, for instance.
The European Commission retains direct enforcement powers over VLOPs and VLOSEs. As of November 2025, it has opened 14 investigations into the DSA compliance of VLOPs and VLOSEs; platforms under ongoing or concluded proceedings include AliExpress, Facebook, Instagram, Temu, TikTok, and X, as well as a number of pornographic platforms.
The financial stakes are substantial. For the largest platforms, violations can trigger fines of up to 6% of global annual turnover. The Commission has already demonstrated its willingness to use these powers, imposing a €120 million fine on X for breaching transparency obligations.
Real-World Impact and Controversies
The DSA has already produced tangible results. In August 2024, the Commission accepted and made binding commitments by TikTok to permanently withdraw the TikTok Lite Rewards programme from the EU, following concerns that its reward features could be addictive for minors and harm their mental health.
The regulation has also sparked political debate. In 2025, several officials in the Trump administration, most notably Vice President JD Vance, alleged that the DSA was being used for "censoring free speech and targeting political opponents." These claims were contested by former U.S. Ambassador to Russia Michael McFaul and German Defense Minister Boris Pistorius.
Some researchers have raised concerns about over-removal, suggesting platforms may delete more content than necessary to avoid regulatory penalties. Others worry about potential censorship risks if regulations are misapplied to legitimate journalism or political dissent.
Election Integrity Focus
With democratic processes under increasing digital pressure, the DSA includes specific provisions for protecting electoral integrity. In March 2024, the Commission published guidelines under the DSA for the mitigation of systemic risks online for elections, with particular attention to the European Parliament elections.
In February 2025, the Commission endorsed the integration of the voluntary Code of Practice on Disinformation into the Digital Services Act framework, making disinformation standards a benchmark for DSA compliance.
Looking Ahead
The DSA represents an ambitious attempt to bring democratic accountability to digital spaces while preserving innovation and fundamental rights. Its success will depend on effective implementation by member states, genuine cooperation from platforms, and active participation from civil society and researchers.
As the law matures, its impact may extend beyond Europe's borders. Companies may choose to implement DSA-compliant features globally rather than maintaining separate systems for different markets, potentially exporting European regulatory standards worldwide through what scholars call the "Brussels effect."
The regulation also provides a potential model for other democracies grappling with platform governance challenges. As of late 2024, the DSA's transparency provisions and risk-based approach are being studied by policymakers globally as they consider their own regulatory frameworks for the digital age.
For European users, the DSA promises greater control over their online experiences, more transparent content moderation, and stronger protections for vulnerable groups. Whether these promises materialize in everyday digital life will become clearer as enforcement actions multiply and platforms adjust their practices to this new regulatory reality.
