Connected Places
by Laurens Hof

Understanding how the new social web works

Hi {name},

 

This week's piece is about an EU court ruling from December that has not gotten much attention in the protocol community despite arguably being the most significant European legal development for federated and distributed systems in years. I had not heard about it either until recently. The piece is my attempt to explain what it does, and why it matters for ActivityPub and atproto.

 

Thanks for reading!

Federation Has a European Legal Problem
May 1, 2026 - Laurens Hof
Broken red chain

Recently, the Court of Justice of the European Union (CJEU) delivered a new ruling, Russmedia, with massive implications for how platform regulation and liability work in the EU. Measured legal analysts like Erik Tuchtfeld have called it a bombshell, and its implications hit upon the core functions of open networking protocols like ActivityPub and atproto.

One of the main outcomes of the ruling is that a Romanian online marketplace "is required to implement appropriate technical and organisational security measures in order to prevent advertisements published there and containing sensitive data, in terms of Article 9(1) of that regulation, from being copied and unlawfully published on other websites."

Sure, a Romanian marketplace did something bad, what do I care, you might think. But the problem here is two-fold. First, the ruling is written in a way that can be generalised beyond Romanian online marketplaces to pretty much all platforms that fall under GDPR and handle social data. Second, copying and publishing data on other websites is exactly what federation is. The core function of open social networking protocols like ActivityPub and atproto is to have social data be copied and published on other websites.

This is, uhhh, a slight bit of a problem, when European legal rulings make it very unclear whether federation itself complies with GDPR.

But to understand what is actually happening here, and why it matters, I will have to walk you through some of the nitty-gritty of European law, for which I apologise. Also note: I am very much not a lawyer. Most of my analysis is built upon the work of legal scholar Daphne Keller, and I highly recommend both of her articles on this ruling, here and here, as well as work by Erik Tuchtfeld, Neil Brown and others. I've done my best to be as precise as possible and to explain the context and how the EU legal system works, but there is a good chance that I've gotten stuff wrong, so keep that in mind; feedback and corrections are much appreciated.

Most readers will know Section 230 as the foundational US platform liability law, the statute that lets American platforms host user content without being treated as publishers of everything users say. The EU's functional equivalent is a notice-and-takedown system that grants platforms conditional immunity as long as they act on notifications about illegal content, which is handled via the Digital Services Act (DSA). It does get slightly more complicated though, since Europe effectively has two different legal regimes regarding how platforms should handle user data, with the GDPR being the other one.

Daphne Keller, Director of platform regulation work at Stanford Law School, in her analysis of Russmedia, poses the following:

"Suppose I post false information about the French politician Marine Le Pen on Bluesky. If she asks Bluesky to remove my post for defaming her, EU law provides a well-defined notice and takedown process under the DSA. But if she says that the same post processes her personal data in violation of the GDPR, she can credibly argue that entirely different rules apply: that Bluesky has no immunity for unwittingly hosting false content, that it has a duty to proactively monitor and filter uploads, and that it should not notify me or allow me to appeal the removal. Russmedia will make EU member state courts substantially more likely to accept such arguments."

This gets at the core of it, and why you're now reading about European court rulings in a blog about protocols: the ruling suggests much stronger responsibilities for social networking platforms, and proactive monitoring and filtering of uploads is a difficult responsibility for most of the services that operate on the open protocols. That the ruling also explicitly orders a marketplace to make sure its data is not copied to other platforms is of the utmost relevance for networks that operate on the principle of federation.

What Russmedia actually does

The case began when an unidentified user posted a fake advertisement on Publi24, a Romanian classified ads site, falsely depicting a woman as a sex worker and including her photographs and phone number. Russmedia, the operator, removed the ad within an hour of being notified. But by then the ad had been copied to other websites, where it stayed visible. How the ad came to be copied to other sites is unclear from the ruling; the other platforms are not named, and whether Publi24 had given permission for the syndication remains unresolved. Regardless, the woman sued, and the case eventually reached the Court of Justice of the EU. The Court ruled against Russmedia in December, holding that for content involving personal data, GDPR applies and the platform's intermediary immunity does not. Erik Tuchtfeld of the Max Planck Institute calls the decision a "bombshell" and names the resulting model "publish-and-perish": platforms become liable from the moment of publication rather than from the moment of notification.

To avoid that liability, the platform must screen content before publication. Tuchtfeld argues that the obligation to implement "technical and organisational measures" under the GDPR amounts in practice to the general monitoring obligation that both the eCommerce Directive and the DSA explicitly forbid, but labelled as a GDPR obligation. Where sensitive personal data is involved (Article 9 categories include things like ethnicity, political opinions, religious beliefs, and sexual orientation, much of what people actually post about on social platforms), the platform must verify that the poster is the data subject. Pseudonymity, the default mode of participation on both ActivityPub and atproto, becomes a compliance liability.

The ruling also requires platforms to "implement security measures such as to prevent advertisements published there and containing sensitive data from being copied and unlawfully published on other websites." This is the part where Russmedia directly impacts open social protocols, because copying and publishing data is exactly what federation and interoperability are. ActivityPub broadcasts every post to other servers as its core mechanism; having the data be copied and published on other platforms is the entire purpose of the protocol. Atproto distributes content through a relay that anyone can read from and republish on their own site as they see fit. Open protocols are designed to send data to other servers, and "implementing security measures" to prevent data from being "published on other websites" is how you end up with a non-functional protocol.
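To make concrete what "copied and published on other websites" means in ActivityPub terms, here is a minimal Python sketch of server-to-server delivery: a post is wrapped in a Create activity and addressed to followers, whose inboxes on remote servers each receive a full copy via HTTP POST. All actor, follower, and inbox URLs are made up for illustration; real implementations also sign these requests and handle shared inboxes.

```python
# Illustrative sketch of ActivityPub federation: a new post becomes a
# Create activity that the home server delivers to every remote
# follower's inbox. All URLs below are hypothetical.

def make_create_activity(actor: str, post_id: str, content: str, followers: list[str]) -> dict:
    """Wrap a post (a Note object) in a Create activity, the shape used
    by ActivityPub's server-to-server protocol."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": f"{post_id}/activity",
        "type": "Create",
        "actor": actor,
        "to": followers,
        "object": {
            "id": post_id,
            "type": "Note",
            "attributedTo": actor,
            "content": content,
        },
    }

def delivery_targets(activity: dict, inbox_of: dict[str, str]) -> list[str]:
    """Resolve each addressee to the inbox URL the activity would be
    POSTed to. This is the 'copying to other websites' step: the whole
    activity, personal data included, leaves the origin server."""
    return [inbox_of[addressee] for addressee in activity["to"] if addressee in inbox_of]

# Hypothetical example: one author, two followers on other servers.
activity = make_create_activity(
    actor="https://example.social/users/alice",
    post_id="https://example.social/notes/1",
    content="Hello, fediverse!",
    followers=[
        "https://other.example/users/bob",
        "https://third.example/users/carol",
    ],
)
inboxes = {
    "https://other.example/users/bob": "https://other.example/users/bob/inbox",
    "https://third.example/users/carol": "https://third.example/users/carol/inbox",
}
print(delivery_targets(activity, inboxes))
```

Nothing in this flow offers the origin server a hook to "prevent copying": delivering complete copies to other servers is the protocol working as designed.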

How the Court got there

The Court of Justice of the European Union (CJEU) is the highest court for matters of EU law, and national courts that hit a question they cannot resolve under their own law refer it to the CJEU through the preliminary reference procedure. The CJEU's ruling on the question then applies to every member state court. The Russmedia case came up this way, from the Romanian Court of Appeal. The Grand Chamber configuration the case was decided in, with 15 judges rather than the usual 3 or 5, signals that the Court treated it as a matter of significant legal weight.

EU platform law has two main regimes that run in parallel. The first is the intermediary liability framework: the eCommerce Directive (2000), now replaced for online platforms by the Digital Services Act (2024). This is the European functional analogue to Section 230, granting conditional immunity to platforms that act on notifications about illegal content. The second regime is data protection: the General Data Protection Regulation (2018), which regulates anyone who processes personal data, regardless of whether they are a platform. Most content on platforms involves personal data, which means most cases potentially fall under both regimes at once.

GDPR makes a distinction between controllers and processors that determines who carries the regulation's substantive obligations. Controllers decide the purposes and means of data processing; processors execute operations on a controller's behalf. Controllers carry almost all of the regulation's substantive obligations, while processors mostly need to follow the controller's instructions and keep things secure. Social platforms fit poorly in this distinction, because the user decides what specific content to post while the platform decides how the whole system works, including how content is monetized and distributed and what its terms of service permit. The CJEU has been steadily expanding its definition of "controller" over the last decade, and Russmedia is the latest step, meaning that more and more of a social platform's legal obligations are defined by GDPR instead of the DSA.

The Court held that Russmedia was a joint controller (with the unidentified user) for the personal data in the ad. The Advocate General had argued that Russmedia was a processor: the user determined what to post and why, and Russmedia executed the hosting on the user's behalf. The Court disagreed, pointing to four features of Publi24's operation as establishing controller status:

  • terms of service granting broad rights to use, modify, and distribute content;
  • organisation of ads into categories;
  • setting parameters for dissemination;
  • allowing anonymous posting.

The Court treated these as alternative features to determine controller status, meaning that any one of them could be sufficient. The features the Court flags are common to almost any platform, which is what makes the ruling's reach so broad.

Both regimes apply to most content on most platforms, and they were always going to collide somewhere. The legislators tried to handle this through "without prejudice" clauses in both regulations: GDPR Article 2(4) says GDPR operates without prejudice to intermediary liability rules, and the eCommerce Directive said its rules shall not apply to data protection. The clauses worked while no case forced a direct conflict, but they could not say which regime wins when one happens. Russmedia is the first major ruling that had to choose, and the Court chose GDPR. Once Russmedia is a controller, the eCommerce Directive's intermediary immunity does not protect it from GDPR violations.

What this means for open protocols

Apply the Russmedia controller test to a typical Mastodon instance:

  • The instance sets parameters for dissemination through its federation policy and relay choices.
  • It organises classification through hashtags, lists, and timeline structures.
  • It allows anonymous posting almost by default.
  • Its terms of service, often pulled from generic templates, claim broad content rights.

The only factor that offers some resistance for non-commercial instances is the "commercial purposes" element, and the Court reads that broadly. Many of these instances do not even need Russmedia to be considered controllers; their template privacy policies, often pulled from generic legal advice, declare them controllers explicitly. Neil Brown, a UK technology lawyer: "For even a small fedi instance (e.g. a Mastodon instance run by friends, for a small group of users), this seems impractical, and would require considerable design changes anyway."

Being a controller carries three requirements under Russmedia: pre-publication screening for sensitive personal data, identity verification for posters of sensitive content, and technical measures to prevent third-party copying. The first assumes automated content classification systems that Mastodon does not ship with and that volunteer admins cannot reasonably build. The second is, for many instances, an attack on the use cases that brought users to the fediverse in the first place. For the third, federation in ActivityPub is the broadcasting of posts to other servers, and preventing that copying is the requirement to stop federating. A Mastodon instance that complies with Russmedia is a Mastodon instance that does not send posts to other instances, which is to say a Mastodon instance that has stopped participating in the network.

Atproto distributes the controller problem differently because of its layered architecture, which means that Russmedia raises two questions. The first is which layers qualify as controllers under the Court's test. The second question is which layers' core functions are made impossible by the substantive obligations that fall on whoever is the controller.

On the Russmedia version of the controller test, the AppView seems likely to match. AppViews do things like building indices, organising content and setting parameters for dissemination with feeds, and operate, at least in the case of the Bluesky AppView, under terms of service for commercial purposes. The features the Court flagged in Russmedia map almost directly onto what an AppView does. If any atproto component is a Publi24 analogue, it is the AppView.

The PDS seems less likely to be classified as a controller. A PDS hosts a user's repository and serves it back through the protocol, but does not organise classification or set dissemination parameters, and a self-hosted PDS does not even sit in a service relationship with anyone other than its single user. The plausible reading is that the user is the sole controller of their own repository, with the PDS operator either a processor or pure infrastructure. Managed PDS services that host repositories for many users at commercial scale are more exposed, but the bare PDS function does not match the Russmedia template the way an AppView does.

The relay forwards a firehose to anyone who subscribes, without categorising, targeting recipients, or claiming content rights, and without any real relationship to individual users. Under the controller test, a relay looks more like a network-layer passthrough than like a platform. But the substantive obligation Russmedia imposes on whoever is the controller is to prevent sensitive content from being copied to other sites. The relay's defining function is to copy content from PDSes and make it available to anyone who subscribes. So even if the relay is not itself a controller, the upstream controller's compliance obligation is to stop content from reaching the relay at all. Whether the controller in question is the AppView, the PDS, or some joint configuration, the substantive obligation is to break the protocol's transport mechanism.
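The layered flow described above can be sketched as a toy pipeline. This is an illustration of the architecture, not real atproto code: a record written to a PDS repository is picked up by a relay, which rebroadcasts it verbatim to every subscriber, including AppViews that index it. All class and field names are hypothetical simplifications.

```python
# Toy model of atproto's layers: PDS (hosts user repositories), relay
# (rebroadcasts everything to subscribers), AppView (indexes for display).

class PDS:
    """Hosts repositories; in this sketch, just a list of records per DID."""
    def __init__(self):
        self.repos: dict[str, list[dict]] = {}

    def write(self, did: str, record: dict) -> dict:
        self.repos.setdefault(did, []).append(record)
        return {"did": did, "record": record}

class Relay:
    """Forwards every event it ingests to every subscriber, unmodified.
    There is no filtering step in this path."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def ingest(self, event: dict):
        for callback in self.subscribers:
            callback(event)

class AppView:
    """Builds an index from the firehose -- the layer whose functions
    (organising, setting dissemination parameters) most resemble the
    Russmedia controller profile."""
    def __init__(self):
        self.index: list[dict] = []

    def on_event(self, event: dict):
        self.index.append(event)

pds, relay, appview = PDS(), Relay(), AppView()
relay.subscribe(appview.on_event)

# A user posts; the relay copies the event onward to all subscribers.
event = pds.write("did:plc:example123", {"type": "post", "text": "hello"})
relay.ingest(event)
print(len(appview.index))
```

The point of the sketch is structural: there is no layer at which "prevent copying to other sites" can be implemented without removing the relay's ingest-and-forward step, which is the network's transport.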

The same pattern repeats regarding identity, where the Court requires the controller of a post containing sensitive personal data to verify that the poster is the data subject of that data. This is a stronger obligation than know-your-customer: it asks the controller not just to identify the user but to attest that the content the user is posting concerns the user themselves. Atproto's identity layer is not built to support either step. DIDs and handles are cryptographic identifiers tied to repository control, not to legal identity, and the protocol provides no mechanism for binding a DID to a verified human or for confirming that content within a repository is about the human who controls the repository.
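A DID document makes this concrete. The sketch below shows the rough shape of an atproto did:plc document, with hypothetical, truncated values: it binds an identifier to a signing key, a handle, and a PDS endpoint, and inspecting its fields shows nothing that attests to a legal identity.

```python
# Simplified shape of an atproto did:plc document (values hypothetical,
# key material truncated). The document binds an identifier to keys and
# services -- that is, to repository control -- not to a person.
did_document = {
    "id": "did:plc:example1234abcd",
    "alsoKnownAs": ["at://alice.example.com"],  # a handle, not a legal name
    "verificationMethod": [{
        "id": "did:plc:example1234abcd#atproto",
        "type": "Multikey",
        "controller": "did:plc:example1234abcd",
        "publicKeyMultibase": "zQ3sh...",  # signing key (truncated)
    }],
    "service": [{
        "id": "#atproto_pds",
        "type": "AtprotoPersonalDataServer",
        "serviceEndpoint": "https://pds.example.com",
    }],
}

# Everything here concerns repository control: keys, handle, host.
# There is no field a platform could consult to verify that the human
# behind the DID is the data subject of any post in the repository.
print(sorted(did_document))
```

Running the last line lists only `alsoKnownAs`, `id`, `service`, and `verificationMethod`: identity in atproto is cryptographic, and a Russmedia-style verification duty would have to be bolted on outside the protocol.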

Pseudonymity is one of the core features that make open social networks valuable, and many users participate only because they can do so without being identifiable to general audiences. The reasons for pseudonymity range from necessity, think LGBTQ+ users in places where their identities are criminalised, or political dissidents, to the more frivolous, like people simply enjoying being anonymous. Russmedia treats anonymous posting as a factor triggering controller status, and then requires the controller to verify poster identity when sensitive data is involved. Keller calls the logic circular: allowing anonymity makes the platform a controller; being a controller means it cannot allow anonymity. Neither atproto nor ActivityPub has a mechanism for identity verification, and compliance with Russmedia would require adding identity layers that contradict both protocols' architectures.

What might limit Russmedia's reach

Whether Russmedia actually shapes how platforms operate is a different question from what the ruling says. Court rulings do not apply themselves. Regulators have to interpret them, national courts have to put them into specific cases, and new questions inevitably get referred back to the CJEU for clarification. Each of those points is somewhere the ruling can be made narrower than it currently reads.

Daphne Keller has catalogued seven theories courts could use to constrain it, such as that it only applies to severely harmful content, or only to high-risk services or marketplaces. Her own view is that none of them is individually convincing, though some combination might persuade motivated courts. Her excellent two-part analysis on TechPolicy Press is recommended for the full arguments. Keller notes that, in her review of law-firm commentary on Russmedia, none of the published takes argues that its impact could be limited to ads or marketplaces. German data protection authorities (DPAs) in Berlin and Hamburg have already signalled they will read the ruling broadly, which is a strong early signal because German enforcement tends to set precedent that other DPAs follow.

The most direct pathway from marketplace doctrine to social-platform doctrine runs through the Renate Künast case in Germany. The German Federal Court of Justice (BGH) has been waiting for Russmedia before deciding whether Facebook has to prevent a misquote of the former minister from being repeatedly republished on the platform. Niko Härting, a Berlin law professor specialising in internet and data protection law, predicted that the BGH will now award Künast both an injunction and GDPR damages against Meta, on the basis that Meta is a joint controller for personal data on its platform and cannot rely on the DSA's host-provider privilege. Härting's reading is that Russmedia applies straightforwardly, and he is the leading German commentator tracking the BGH's pending application of the ruling.

The core problem is that applying the Russmedia ruling to social networks produces completely incoherent results. Russmedia's verification requirement assumes a posting structure in which the poster is the data subject of the data in the post: a marketplace ad about the person posting it. On a social network, posts routinely contain sensitive data about people other than the poster, and the controller has no mechanism for identifying those third parties or contacting them to verify their consent. The same goes for pseudonymity: the controller often cannot identify the poster either, and Russmedia treats anonymous posting as a factor triggering controller status while requiring the controller to verify poster identity.

This suggests at least one potential way that the reach of Russmedia can be limited: applying the ruling to social networks does not just destroy these networks' ability to function, it produces plain nonsense. Requiring a social networking platform to verify the consent of everyone involved every time someone posts sensitive data about someone else is absurd, and that absurdity might theoretically function as an argument against a wide reading of Russmedia. It does not stop there, either: the ruling also directly undermines the EU's push to become less dependent on the Big Tech platforms. Only centralised Big Tech platforms can afford the compliance and liability schemes that the Russmedia ruling assumes. The European Commission is a vocal supporter of federated and open social networks, and runs its own Mastodon server. The EU loves to talk a big game about digital sovereignty, and that we can truly seriously definitely totally be independent of Big Tech. That the CJEU's ruling assumes a world in which only Big Tech platforms can exist, and in which open federated networks certainly cannot, is maddening, absurd, and frankly, fucking hilarious.

Russmedia is potentially as significant as a major Section 230 reform, possibly more so for the parts of the internet that depend on federated or distributed architectures, but it arrived through a court ruling without the public attention an American equivalent would have drawn. Following the protocol-policy conversation as it continues means engaging with how EU law actually works.

After cataloguing her seven theories for limiting Russmedia, Daphne Keller posted on Bluesky that there was one she had not included and probably should have: "Russmedia isn't that bad because Member State courts won't follow the CJEU's ruling." Let's hope she's right on this, because, otherwise, well, lol.
