Case Report: Online Marketplaces Are Now Frontline Privacy Actors (Russmedia Digital and Inform Media Press)
The judgment establishes that online marketplaces act as data controllers, responsible for detecting and preventing the unlawful publication of personal data in user advertisements.
European digital platforms have long treated user advertisements as harmless content that simply passes through their systems, yet this assumption has now reached its limit. The judgment in Russmedia Digital demonstrates that platforms have operated with an inflated sense of detachment from the personal data they publish. Courts are no longer willing to accept claims of neutrality when the harm caused to individuals is real, foreseeable, and preventable.
When Online Platforms Become Data Controllers
On 2 December 2025, the Court of Justice of the European Union delivered a significant judgment in Case C-492/23, Russmedia Digital and Inform Media Press. The Court’s decision marks an important clarification of responsibility under European data protection law.
The case establishes that the operator of an online marketplace must be regarded as a controller under the General Data Protection Regulation when personal data appear in advertisements published on its platform.
The case arose in Romania, where a woman discovered an online advertisement offering sexual services under her name, using photographs and a telephone number without her consent.
The advertisement was placed on the website www.publi24.ro, operated by Russmedia Digital.
Although the company removed the advertisement within an hour of being notified, the material had already spread to other websites and continued to cause harm.
The woman brought proceedings for the violation of her privacy, honour, and personal data rights.
The local court awarded her damages, but an appellate court overturned that decision, reasoning that the platform was merely a hosting provider with no direct control over the content uploaded by users.
The Romanian Court of Appeal then referred the matter to Luxembourg for a preliminary ruling, asking the Court of Justice to clarify the obligations of an online marketplace under the GDPR and whether such an operator could rely on the liability exemptions of the EU e-Commerce Directive.
Defining Responsibility under the GDPR
The Court’s ruling provides a decisive interpretation of who controls personal data in online advertising.
It holds that an operator such as Russmedia Digital is not merely a passive intermediary but a controller within the meaning of Article 4(7) of the GDPR.
This status arises because the platform determines the technical and organisational means by which advertisements are published and made accessible to the public.
The advertisements exist on the internet solely through the platform’s intervention.
Accordingly, the operator bears responsibility for ensuring that personal data contained in advertisements are handled lawfully.
This includes verifying whether the advertiser has the right to publish such data or whether the data subject has given explicit consent.
In the present case, the Court ruled that the marketplace must identify, before publication, any advertisement that includes sensitive data such as photographs, personal contact details, or references to sexual life.
Under Article 9 of the GDPR, such data receive heightened protection. Processing them is generally prohibited unless one of the narrowly defined exceptions applies.
The Court explained that a platform must put in place adequate technical and organisational measures to detect advertisements containing sensitive data before they go online.
These measures must also verify the authenticity of the advertiser’s identity. If the advertiser is not the person featured, the platform must ensure that the data subject has given explicit consent to publication.
When neither condition is met, the operator must refuse to publish the advertisement. These obligations are not optional compliance suggestions; they form part of the core duties attached to the status of a data controller.
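To make the publication gate more tangible, the sketch below expresses the two conditions as a simple check. It is an illustration only, not a method prescribed by the Court; the AdSubmission fields and the may_publish function are hypothetical names chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class AdSubmission:
    """Hypothetical snapshot of an advertisement awaiting pre-publication review."""
    contains_sensitive_data: bool     # e.g. photographs, contact details, references to sexual life
    advertiser_is_data_subject: bool  # identity check: the advertiser is the person the ad concerns
    explicit_consent_on_file: bool    # documented explicit consent from the data subject

def may_publish(ad: AdSubmission) -> bool:
    """Pre-publication gate mirroring the conditions described above.

    An advertisement containing sensitive personal data may only go online if
    the advertiser is the data subject or explicit consent has been recorded;
    otherwise the operator must refuse publication.
    """
    if not ad.contains_sensitive_data:
        return True
    return ad.advertiser_is_data_subject or ad.explicit_consent_on_file
```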
The decision demonstrates how the concept of control under the GDPR extends beyond traditional database management.
A platform’s active role in structuring, publishing, and disseminating user content transforms it from a neutral intermediary into an entity that participates in the determination of processing purposes and means.
The Court’s reasoning reinforces the principle that control follows function rather than form.
The question is not what the operator claims to be but what it does in practice.
Rejecting the e-Commerce Directive’s Safe Harbour
The company argued that it was entitled to rely on the intermediary regime of Articles 12 to 15 of the e-Commerce Directive, whose safe-harbour provisions shield providers of information society services from liability for unlawful user content when they act as neutral hosts.
The Court rejected that argument.
It found that the operator’s duties under the GDPR cannot be displaced or diminished by the Directive’s safe-harbour provisions.
The Court reasoned that the e-Commerce Directive governs liability in the context of information society services but does not affect the obligations imposed by the GDPR concerning data processing.
These are two distinct legal frameworks serving different purposes. The Directive deals with intermediary liability, while the GDPR addresses the protection of personal data.
When an online marketplace processes personal data contained in advertisements, it acts not merely as a conduit but as a controller subject to independent obligations.
Therefore, a platform cannot shield itself by asserting that it merely stores third-party content. Once it publishes or facilitates the publication of content that includes personal data, it becomes directly responsible for compliance with the GDPR.
This position aligns with the principle that fundamental rights, such as privacy and data protection, take precedence over economic exemptions designed for intermediary services.
The Court further emphasised that an operator’s responsibilities extend beyond initial publication.
It must also take reasonable steps to prevent the further dissemination of unlawful content. When sensitive data are unlawfully published, the operator should implement measures to prevent those materials from being copied or republished elsewhere.
The duty to act does not end once an advertisement is taken down; it includes ongoing prevention through appropriate security and technical safeguards.
This interpretation expands the operational expectations placed on online marketplaces. They are no longer viewed as passive technical platforms but as active participants in the publication and persistence of content that contains personal data.
The ruling sets a higher bar for compliance by tying the concept of responsibility to the real impact of data processing on individuals’ rights.
Implications for Online Platforms and Compliance Practice
The judgment carries direct consequences for online marketplaces, advertising platforms, and digital intermediaries across the European Union.
By categorising platform operators as data controllers when they publish user-generated advertisements containing personal data, the Court has introduced a tougher standard of care.
Operators must design internal processes capable of detecting sensitive content before it is published. This obligation requires both technical and procedural innovation.
Platforms will need automated systems capable of identifying personal data such as names, phone numbers, or images, combined with manual review for complex cases.
In addition, there must be reliable mechanisms for verifying that the advertiser is the data subject or that explicit consent has been granted.
These steps may appear demanding, but they represent the necessary cost of handling personal data in a digital environment governed by fundamental rights.
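As a purely illustrative sketch of such an automated first layer, simple pattern checks could route advertisements containing likely contact details to manual review. The patterns below are assumptions made for the example; a production system would rely on locale-aware detectors and image analysis rather than regular expressions alone.

```python
import re

# Illustrative patterns only; real systems would use locale-aware detectors
# and image analysis rather than simple regular expressions.
PHONE_PATTERN = re.compile(r"\+?\d[\d\s\-()]{7,}\d")
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def flag_for_review(ad_text: str) -> list[str]:
    """Return the reasons why an ad should be routed to manual review."""
    reasons = []
    if PHONE_PATTERN.search(ad_text):
        reasons.append("possible phone number")
    if EMAIL_PATTERN.search(ad_text):
        reasons.append("possible email address")
    return reasons

# Example: this ad would be held back for human verification.
print(flag_for_review("Call me on +40 721 234 567 for details"))
```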
The judgment also highlights the importance of transparency and traceability.
Platforms should maintain records of verification procedures, consent documentation, and the steps taken to prevent unauthorised reuse of sensitive materials.
When disputes arise, these records will serve as evidence of compliance.
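By way of illustration, such a traceability record might capture the outcome of each pre-publication check together with the evidence relied upon; the fields below are assumptions, not requirements set out in the judgment.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class VerificationRecord:
    """Hypothetical audit entry for one pre-publication check."""
    ad_id: str
    checked_at: datetime
    advertiser_identity_verified: bool
    consent_document_reference: str | None  # e.g. a stored consent form, if any
    decision: str                           # "published", "refused" or "escalated"
```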
Furthermore, the Court’s reference to preventing further publication introduces a forward-looking duty. Once a platform becomes aware that an advertisement contains unlawful personal data, it must act not only reactively but also proactively.
Technical safeguards such as content fingerprinting, URL blocking, or watermarking may help limit replication. However, each platform will need to determine proportionate measures that align with the GDPR’s accountability principle.
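The following sketch illustrates the fingerprinting idea in its simplest form: a registry of hashes of material already removed as unlawful, checked against new uploads. A plain SHA-256 digest is a deliberate simplification that only catches byte-identical copies; real deployments would more likely use perceptual hashing that survives resizing and re-encoding.

```python
import hashlib

# Registry of fingerprints of material already removed as unlawful.
removed_fingerprints: set[str] = set()

def fingerprint(content: bytes) -> str:
    """SHA-256 digest of the raw content (only catches identical copies)."""
    return hashlib.sha256(content).hexdigest()

def register_removed(content: bytes) -> None:
    """Record content that was taken down so later re-uploads can be blocked."""
    removed_fingerprints.add(fingerprint(content))

def is_known_removed(content: bytes) -> bool:
    """Check an incoming upload against previously removed material."""
    return fingerprint(content) in removed_fingerprints
```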
This ruling effectively narrows the margin of neutrality previously enjoyed by hosting providers.
Operators must reassess their classification under data protection law and adapt compliance programmes accordingly.
In practice, many intermediaries will now fall within the controller category for at least part of their operations.
The Russmedia Digital case thus redefines the balance between innovation, user autonomy, and data protection.
It invites operators, policymakers, and citizens to understand that personal data protection is not an external constraint on digital activity but a foundational condition for its legitimacy.
The message is clear: accountability in the digital era begins with recognising that every act of publication is also an act of data processing, and with that recognition comes responsibility.
This judgment invites reflection on the responsibilities that digital platforms now carry. You are welcome to share thoughts or practical experiences that might enrich future discussions and help build stronger understanding of accountability in online environments.




