
On coordination for better moderation systems

Recently, IFTAS shut down their community forum, IFTAS Connect, a place for fediverse moderators to collaborate, share resources and support each other. IFTAS (Independent Federated Trust and Safety) is the main non-profit organisation in the fediverse supporting moderators and other volunteers who are working on making the open social web a safer place. While the organisation has done incredible work in the space, it has struggled to gain traction and find the necessary funding to continue its larger operations. That problem is also visible in IFTAS Connect, which “has not provided the kind of active collaboration we had hoped for.” It’s not that there are alternatives to places like IFTAS Connect either; when it comes to community spaces for fediverse moderators, there is a lack of options altogether.

In a decentralised network without any hierarchical power, growing and evolving the network depends on grassroots initiatives and the collaboration of individuals who want to contribute to the health of the network. But while many people seem to hold the idea that moderation systems in the fediverse can and should be better, initiatives for collaboration struggle to gain traction.

The basic premise of the network is simple: every fediverse server is its own social networking site, and it can connect with other fediverse servers to join the larger super-network called the fediverse. This design, where every server is its own social networking site, also means that every server is responsible for its own moderation. That understanding of moderation, however, is mostly focused on bad behaviour that originates from the server itself, while a lot of bad content originates from other servers and gets sent to the server instead.

This mismatch, where you control your own server while harm originates from other servers, is clearly visible with CSAM. CSAM doesn’t (hopefully) originate from your server, but it can still end up in your server’s database. IFTAS found a high rate of CSAM (4.29 per 100k images) when they scanned eight medium-sized Mastodon servers for a pilot project. Such a pilot project self-selects for well-moderated servers, and does not include well-known bad actors. The second and third largest Mastodon servers are well known to have issues in this regard, and it was as simple as checking the local timeline of one of these servers to see an advertisement for CSAM trading. For further context: when you set up a new fediverse server, you connect (federate) with these servers by default, and it requires active choices (and knowledge) by the administrator to block them. There is no form of automated CSAM scanning in the fediverse. This exposes volunteer moderators to CSAM while moderating, and leaves a significant amount of CSAM on the network undetected. Server moderators have only limited tools to deal with content that arrives from elsewhere: they can remove the content from their own server, but not from the server where it originates.
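To make concrete what “automated CSAM scanning” would mean in practice: such systems typically compare hashes of uploaded media against lists of known-abusive material maintained by child safety organisations. The sketch below is a hypothetical illustration of that hash-matching pattern, not a description of any real fediverse or IFTAS implementation; the empty hash list, the function name and the quarantine behaviour are all assumptions for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list: in a real deployment this would be populated from a
# vetted hash-sharing programme, and would use perceptual hashes rather than
# exact SHA-256 digests so that re-encoded copies of an image still match.
KNOWN_BAD_HASHES: set[str] = set()


def should_quarantine(media_path: Path) -> bool:
    """Check an uploaded file against the known-hash list.

    Returns True when the file matches and should be held for human review
    instead of being stored and federated onwards.
    """
    digest = hashlib.sha256(media_path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

The point of the sketch is that the check has to run somewhere with access to the media as it arrives, which is exactly the shared infrastructure that individual volunteer-run servers currently lack.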

The same dynamic applies to harassment. If the moderator of server A finds that a person on server B is harassing a person on server A, they have some options to protect their own community: they can block the harassing account, or defederate from server B entirely. However, this still gives the harasser on server B all the space to continue harming people on all the other servers. The moderator on server A has no structural way of sharing their findings with moderators on other servers; the only option is announcing it publicly via a hashtag. And if you defederate from that server, harassers can easily set up another server instead.

The sovereignty of servers gives people the power to be proactive about problems that originate from their own server, but by its nature it also means that responses to problems that originate from other servers are always reactive. These are fundamentally coordination problems that require shared infrastructure, such as automated CSAM scanning systems and structured ways to share information about bad actors. But building this infrastructure requires collaboration across servers and software projects. So why, after years of knowing these problems exist, have initiatives to build better coordination tools struggled to gain traction?

Moderation and coordination

There are multiple explanations as to why new moderation systems for the fediverse struggle to gain traction. One crucial lens is race and gender. As Mekka Okereke said this week: “we often prioritize white feelings over Black safety”. For more in-depth analysis of these dynamics in the fediverse, Jon Pincus at The Nexus of Privacy has written multiple articles that I can recommend. This article focuses on a complementary, structural reason: even when people are largely in agreement that changes should be made, the decentralised architecture of the fediverse makes the coordination to actually implement such changes difficult.

The core challenge is that, in the way the fediverse has become decentralised, a lot of system improvements depend on other actors. The fediverse gives you sovereign control over your own place on the network with your own server. It gives people the ability to set any moderation policy they want, and run any software they prefer. This is a meaningful version of autonomy, but it is limited to your own server. The difficulty, however, is that the hardest parts of moderation don’t respect server boundaries; they are cross-network problems.

One of the prime features of a decentralised system is that any individual node in the network often has limited to no influence over the other nodes. A decentralised system gives individual users agency, but the main problems often don’t happen on an individual level; they are systemic in nature instead. Coordination on social networks is a famously hard problem, best illustrated by how many people are still using X, and how challenging it is to get people to transition to other platforms. But for improved moderation systems on the fediverse there is another layer to the challenge: the coordination is now not only between people, but also between the much more abstract entities of servers and software projects.

The fediverse makes this coordination problem even more challenging, as the coordination is one step removed from most fediverse users: it is their server admins who have to coordinate with other server admins. On a centralised platform, there is a single step between the users and the company in power (although it is doubtful that company will take your concerns seriously). For users who feel that the fediverse needs better moderation tooling, the way to effect change is now an additional step away. Better moderation systems depend on your server admin, and/or on the developer who makes the software that your server runs on. But a user does not have a direct relationship with a software developer such as Mastodon; that relationship is between the server admin and the software maker instead.

When someone joins Mastodon, they experience it as a social network where they follow people and post updates. The fact that their ability to shape the network depends on coordination between server admins isn’t visible or explained; people largely experience the fediverse as a single network they join. But underneath that singular network there are multiple layers of abstraction that prevent a direct path to change.

The fediverse has no natural communication channels between admins. There is no central forum, no mailing list that reaches all admins, no natural gathering place. This makes the already-difficult task of coordinating abstract entities (servers with their own governance and decision-making processes) even harder. The shutdown of IFTAS Connect illustrates that this is a dual problem: not only are there no real communication channels for coordination that cover a large part of the fediverse, but even when people attempt to build such channels, the interest is limited.

This coordination problem is also visible in efforts to improve moderation systems for the fediverse. A system for coordinating moderation depends on other people participating in it. For example, both the FediMod FIRES and Fediseer projects are systems designed to help servers share information about other actors and their (un)trustworthiness. Both tools work on the problem of coordination in the fediverse: if a server moderator identifies a bad actor, how do they inform moderators on other servers? The problem is that these tools are only valuable if many servers participate. If only a few servers take part, the signals they provide are too sparse to be useful; they become valuable once a significant portion of fediverse servers participates. This is the classic coordination problem: individual servers have little incentive to adopt such a tool until it has critical mass, but it can’t reach critical mass until many individual servers participate.
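As a rough illustration of what such a shared signal could look like: a server publishes a small, structured advisory about another actor, and other servers subscribe to it and decide locally what to do. The sketch below is a hypothetical data model, not the actual FediMod FIRES or Fediseer schema; every field and function name here is an assumption made for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ModerationAdvisory:
    """Hypothetical advisory one server publishes about another actor.

    Field names are illustrative assumptions, not the FIRES or Fediseer API.
    """
    subject: str        # e.g. "badactor.example" or "@user@badactor.example"
    severity: str       # e.g. "silence", "suspend", "reject_media"
    reason: str         # short human-readable justification
    issued_by: str      # domain of the server publishing the advisory
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def apply_policy(advisory: ModerationAdvisory, trusted_sources: set[str]) -> str:
    """Each subscribing server still decides locally: it only acts on
    advisories from sources it has chosen to trust."""
    if advisory.issued_by not in trusted_sources:
        return "ignore"
    return advisory.severity


# Example: a moderator on server A flags a harassing instance; server B,
# which trusts A, picks up the advisory and applies its own policy.
advisory = ModerationAdvisory(
    subject="badactor.example",
    severity="suspend",
    reason="coordinated harassment",
    issued_by="server-a.example",
)
print(apply_policy(advisory, trusted_sources={"server-a.example"}))  # -> "suspend"
```

The sketch also shows why participation matters: the `trusted_sources` set is only worth maintaining if enough servers are publishing advisories into it.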

Another example is the #FediBlock hashtag. It was created by Marcia X when they were a moderator of the now-defunct PlayVicious instance, the first dedicated instance for Black and POC Mastodon users. As Marcia X explains it: “It started as a tool made by queer femmes to put the spotlight on a sexual harasser. Then people started framing it as, “this is just a tool for mods and admins to use to spotlight bad behavior,” which isn’t totally incorrect.” All those years later, #FediBlock is still the predominant way for admins and moderators to share information about bad actors.

This shows two things. First, it is an illustration of how Black people have contributed to and influenced fediverse culture for a long time. Second, it also shows that institutions and culture in the fediverse have a hard time changing and adapting to the needs of the ecosystem. When Marcia X created #FediBlock years ago, it was born out of necessity, a simple solution for a problem their community was dealing with right at that moment. But the network has grown, and the way admins and moderators coordinate information on harmful behaviour has not grown with it. A practice built around a hashtag is a prime example of a coordination problem: everyone is now using #FediBlock, and there is no clear mechanism to coordinate a switch to a system more suitable for the current state of the network.

Better systems for moderation are not only a collective action problem; the ways to effect change are also limited. If people feel that the fediverse should have better moderation systems, they need either their server admins or the software makers to participate. And if petitions to the admins or the devs fall on deaf ears, the only real option is to move to a server that does have the preferred systems. But this individual agency does not solve the coordination problem, as moving to a different server does not create network-wide change.

For the fediverse, there is a clear need for better moderation systems. But how people on the fediverse will work together on building and adopting these moderation systems is a whole lot less clear.

This article was sponsored by a grant from the NLnet foundation. 
