On 5 December 2025, the European Commission issued its first-ever fine under the Digital Services Act (DSA). X, formerly Twitter, was fined €120 million for three specific violations of the EU’s platform regulation rulebook.
This decision matters beyond X. It shows how the Commission interprets its enforcement powers and what it expects from large platforms operating in Europe.
A quick clarification: The DSA is not data protection law. It regulates online platforms and digital services, covering content moderation, advertising transparency, and user safety. This is separate from GDPR, which governs personal data. For very large platforms like X, the enforcer is the European Commission itself, not national data protection authorities.
What X Was Fined For
The Commission cited three distinct failures in its decision.
Deceptive Verification Design
X’s paid “blue checkmark” system allows anyone to obtain “verified” status without meaningful identity verification. The Commission found this creates confusion about whether accounts are genuinely who they claim to be.
Users see the blue checkmark and reasonably assume the account holder’s identity has been confirmed. In practice, the checkmark simply indicates a paid subscription. This exposes users to impersonation scams and fraud.
The Commission ruled this violates DSA Article 25, which prohibits deceptive design practices, commonly known as dark patterns. Dark patterns are interface designs that manipulate users into unintended actions or misunderstandings.
Advertising Repository Failures
The DSA requires very large platforms to maintain transparent advertising repositories (Article 39). X’s repository fell short of these requirements.
The Commission found critical information missing: ad content, topics, and the identity of paying entities. It also noted deliberate access barriers, including excessive processing delays that made the repository difficult to use in practice.
These failures undermine the ability of researchers and civil society groups to detect scam advertisements and track advertising trends across the platform.
Researcher Data Access Blocked
DSA Article 40 requires very large platforms to facilitate researcher access to data for studying “systemic risks.” These are threats platforms may pose to public health, electoral processes, civic discourse, or protection of minors.
X’s terms of service prohibited researchers from independently accessing public data. The Commission found these restrictions created unnecessary barriers to legitimate research, in direct breach of the platform’s legal obligations.
What This Signals for Platform Compliance
The Commission’s reasoning in this decision suggests how it may approach future enforcement. These are observations, not binding precedents for all platforms.
Monetising Trust Signals Carries Risk
The Commission treated X’s paid verification system as inherently misleading. Platforms that monetise trust indicators without actually verifying user identity may face similar scrutiny.
The reasoning suggests that verification features should do what users reasonably expect: confirm that an account belongs to the person or organisation it claims to represent.
Technical Compliance Is Not Enough
Merely having an advertising repository was insufficient. The Commission examined whether the repository was practically accessible and usable.
Excessive delays, missing information, and technical barriers were all cited as failures. Platforms considering their own compliance may want to assess not just whether they have met formal requirements, but whether the resulting systems work as intended.
Research Access Obligations Will Be Enforced
The Commission demonstrated it is willing to enforce researcher access provisions. Terms of service that block DSA-mandated research access were treated as non-compliant.
This applies specifically to very large online platforms (VLOPs), defined as platforms with at least 45 million average monthly active users in the EU.
What Happens Next
X now faces specific deadlines. The company has 60 working days to remedy its verification system design and 90 working days to submit an action plan addressing the advertising repository and researcher access issues.
X may appeal the decision to EU courts. Meanwhile, two more Commission investigations remain ongoing, examining how X handles illegal content and how its algorithm makes recommendations.
Continued non-compliance could result in further fines of up to 6% of global annual turnover.
Key Takeaways
The first DSA fine demonstrates that the Commission will actively enforce its platform regulation rules. Dark patterns, transparency failures, and access restrictions were all penalised in a single decision.
This is platform regulation, not data protection. Different rules apply, and a different enforcer is involved. Organisations should not conflate DSA compliance with GDPR compliance. For more on how EU regulations are evolving, see our article on whether GDPR is really on the chopping block in the EU.
If your organisation operates a large digital platform serving EU users, this decision suggests it may be worth reviewing your approach to verification features, advertising transparency, and researcher data access. The Commission has now shown it is prepared to act.
