
Sensitive data? Platform = jointly responsible!

Anyone who places an advertisement on an online marketplace must expect the platform to take a closer look, not voluntarily, but because the courts demand it. The European Court of Justice (ECJ) has clarified in a new ruling that online marketplaces such as eBay, classified-ad sites, and regional portals may not simply wave through advertisements containing sensitive data.

The ruling from December 2025 marks a significant turning point: platform operators must review content actively and preventively. It is no longer sufficient to simply react when something is reported. Instead, they must check before publication whether an advertisement contains information about sexual orientation, political views, or health—and if so, whether this information may be published at all.

The case: Sex ad with real data – without consent

The ruling was triggered by a case in Romania. An advertisement offering alleged sexual services was placed on the platform publi24.ro in a woman's name, using real photos of her and her phone number, without her knowledge or consent. Although the operator, Russmedia Digital, quickly removed the ad, the damage had already been done: it had been copied to other sites and could still be found online.

The woman concerned filed a lawsuit, and the case eventually ended up before the ECJ. The central question was: Does the platform bear joint responsibility, even though it did not create the advertisement itself? The answer: Yes.

GDPR beats E-Commerce Directive

The Court clarified that the platform operator is the "controller" within the meaning of the GDPR—even if it "only" provides the technical infrastructure. Operators must ensure that no unlawful processing of sensitive data takes place. This means that advertisements must be checked before publication.

Notably: Until now, platforms could often invoke the so-called E-Commerce Directive, which provides an exemption from liability under certain conditions, for example when content comes from users and is deleted immediately once it is reported. But the ECJ has now made it clear that this excuse no longer applies in the area of data protection. The GDPR takes precedence.

Platform operators must also take technical protective measures to ensure that such advertisements cannot simply be copied and redistributed. This makes it clear that responsibility does not end with the delete button.

What this means for platforms

The ruling is likely to make many online marketplace operators nervous. It demands nothing less than a paradigm shift: anyone who allows sensitive user data on their platform must proactively ensure its protection, even if the content originates from third parties.

Particularly relevant: Operators must check whether the person depicted or named is actually the advertiser—or whether consent for publication has been obtained. If consent is not given, the advertisement must be rejected.

This will fundamentally change the processes of many platforms—and possibly also significantly reduce the number of ads.

Classification: Platforms have a responsibility

The ruling hits the nail on the head: whoever provides the framework for publications must also take responsibility for their content. It is no longer acceptable for platforms to hide behind technical excuses while real people are publicly exposed by fake ads. But it is also clear that if every ad has to be checked for sensitive content, things will become expensive, slow, and annoying. For providers. For users. For everyone. But data protection is not a matter of choice. And sometimes clear rulings are needed to finally stop the internet from being treated as a legal vacuum.
