The Signpost

News from the WMF

The EU Digital Services Act: What’s the Deal with the Deal?


Policymakers in the European Union (EU) have finally concluded their negotiations over the Digital Services Act (DSA), a regulation that aims to address the spread of illegal content online, and have largely agreed on the rules that will govern online content moderation. Some technicalities still have to be ironed out, but the cornerstones of the regulation are known.

The Wikimedia Foundation has been tracking the development of the DSA since the consultation phase, before the European Commission had even introduced its draft proposal. We have always supported the core aim of the DSA: to make content moderation more accountable and transparent. At the same time, we have cautioned that designing regulatory structures that only fit the operating models of big, for-profit websites could have devastating consequences for not-for-profit websites like Wikipedia. The legislation will fundamentally shape how online platforms operate in Europe, and it will also have an impact on the internet in the rest of the world. It is also an opportunity to protect the community-governed, public interest internet, which is why we asked policymakers to adopt four essential measures:

  1. Rules that address the algorithmic systems and business models that drive the harms caused by illegal content.
  2. Requirements for transparent and equitable terms of service without overly-prescriptive rules on how they are created and enforced.
  3. Rules on the processes for identifying and removing “illegal content” that allow user communities to participate.
  4. Rules that do not force platforms to substitute the work of people with algorithms when it comes to moderating content.

While the DSA, to a certain degree, distinguishes between centralized platforms and those that are community-governed, some concerns remain. Here is how the final outcomes stack up to our requests.

1. Does the DSA address the business models and algorithmic systems that drive the harms caused by illegal content?

The DSA framework was largely designed to address the rapid and rampant spread of illegal content online by defining some of the processes through which online content is removed. It is our position that regulations need to target the causes, not the symptoms, of the spread of illegal content: i.e., the targeted advertising business model and algorithms driving profit for ad placement. Focusing on these aspects both gets at the root of the problem and avoids regulating not-for-profit websites like Wikipedia as if they were operated according to the same model.

The outcomes of trilogue negotiations fall short of what we desired, but are still worth celebrating. The text bans targeted advertising that is based on sensitive information such as political, sexual or religious preferences. The EU is also banning “dark patterns”: i.e., deceptive design tactics that trick users into accepting, rather than refusing, tracking options. The DSA mandates that options to reject and accept tracking must be equally easy to select.

2. Does the DSA leave room for community-developed policies for content?

The information on Wikipedia and other Wikimedia projects is produced, maintained, and debated according to volunteer-developed rules, which are fully transparent. This volunteer-led model of content moderation has its imperfections, but it has also helped make Wikipedia a global source of neutral and verifiable information. To protect this community-governed internet, laws should not require platforms to enforce rules that are set by their users or oblige service providers to engage in top-down content moderation.

We are pleased to see that the DSA will focus only on the rules set by the service providers and their moderation obligations, leaving room for Wikimedia communities to develop and refine their own policies for content and conduct as well as to enforce them. The DSA will not prevent volunteer editors from taking care of our public interest information ecosystem.

3. To what extent does the DSA recognize user communities as part of the process for identifying and removing “illegal content”?

It is not enough for regulations like the DSA to just permit community-based content moderation: the law should explicitly promote that people, as members of our information society, play a more significant role in creating digital public spaces. While we applaud EU policymakers for recognizing that the rules of the DSA must not be articulated with only one type of platform in mind, we would have hoped for rules around the process for removal of illegal content that explicitly support community-governed content moderation systems. Even so, the regulation’s new notice-and-action regime has been vastly improved when compared to the original proposal, which could have led to Wikimedia getting constantly “trolled” by bad-faith actors. First, the service provider can determine whether or not to take action after a notice has been submitted. This is particularly important because so many of the notices that the Wikimedia Foundation receives are not about content that is actually illegal. Second, service providers retain the right to ignore notices from individuals or organizations if they consistently provide false or misleading information.

We are, however, concerned about the “crisis mechanism”, because it allows the European Commission to mandate that Very Large Online Platforms (VLOPs) tackle certain content that contributes to a threat to public health or safety. Through this mechanism, should Wikimedia projects such as Wikipedia be designated as VLOPs, the DSA essentially gives the Commission the executive power to override content moderation decisions made by the Wikimedia communities. The safeguards, added after civil society organizations voiced grave concerns, limit the potential for abuse to a certain degree, for instance through a sunset clause and a high standard of transparency about the Commission’s demands to platforms.

4. Does the DSA enable human oversight of content moderation?

Wikimedia’s open, volunteer editing model is grounded in the belief that people, not machines or automated systems, should decide what information is included in the projects and how it is presented. Although the latest version of the DSA does not explicitly rule out automated tools, we find it encouraging that their use is neither explicitly mandated nor imposed de facto through very short removal deadlines. The explicit prohibition of general monitoring obligations further alleviates a persistent concern we have had: i.e., that short removal timeframes and the threat of being held liable for user-uploaded information would compel service providers to deploy algorithmic tools in order to swiftly identify and remove any and all allegedly illegal content.

What comes next for the DSA?

We look forward to seeing the complete text, with any outstanding details clarified. The Parliament will vote on the consolidated version in July 2022, and once the regulation is published in the Official Journal of the European Union — the official EU gazette of record — it will come into force 20 days later. Online platforms and websites will then have 15 months to prepare before the rules start to apply.

Once it becomes law, the DSA will shape communication and life online for everyone in Europe, and most likely for everyone around the world as well. The Wikimedia communities have always emphasized transparency about how their projects are built and developed. Now the DSA will make content moderation processes on many other platforms more transparent and predictable as well, which will also benefit Wikimedia editors and readers. Free knowledge as a whole and the Wikimedia projects in particular are an important part of people’s online experience. For that reason, we will continue to advocate public policy that protects and promotes them, and that empowers everyone to participate in the kind of decentralized decision making and content moderation that makes Wikipedia so successful and popular in Europe and the rest of the world.



Discuss this story

  • It's always hard to tell how much impact our public policy team has on things like this (I suspect it's not crystal clear for them, either) - they are fairly limited on editor communication to avoid poisoning the well and obviously they're the ones reporting it. That all said, communities will ultimately be happy if the one line summary of any legal changes in this vein reads "You're fine, keep acting as you are, the Foundation can buffer any changes without your notice". In this instance, this good article (with thanks to @FPutz (WMF) and JGerlach (WMF):, by the way - a nicely clear read) seems to do just that, barring the crisis circumstances. Nosebagbear (talk) 12:31, 30 May 2022 (UTC)
  • Going forward, I just thought I'd ask - will we be getting a similar article for the online safety bill in the UK? That one looks like it's going to have some really problematic bits in, Wikipedia-speaking. Nosebagbear (talk) 12:31, 30 May 2022 (UTC)
    Hi Nosebagbear! The team published an article on this, if you're interested. Cheers, 🐶 EpicPupper (he/him | talk) 22:47, 30 May 2022 (UTC)
    Thanks for flagging @EpicPupper, you beat me to it! Eventually we will also produce a deep dive analysis of the bill, as part of this series on online safety bills. These are a little more 'in the weeds', the Australia deep dive we published today lets you know what level of analysis to expect. FPutz (WMF) (talk) 15:07, 1 June 2022 (UTC)
  • Really good to hear this update, which is not something I would expect to be saying about EU legislation. The key for me is: It is our position that regulations need to target the causes, not the symptoms, of the spread of illegal content: i.e., the targeted advertising business model and algorithms driving profit for ad placement. Focusing on these aspects both gets at the root of the problem and avoids regulating not-for-profit websites like Wikipedia as if they were operated according to the same model. A very insightful and well-considered position from the WMF. Wikipedia should not be used as an excuse for laissez faire policy about the internet, as if it is impossible to distinguish between a non-profit volunteer-run encyclopedia and a for-profit corporate dictatorship (like Musk seeks to create with Twitter). To use an extreme example, we are not on the side of websites that knowingly host videos of sex trafficking victims being raped, but nor are we safe without some guarantee that, as long as we work to quickly remove such content, the WMF can't be held legally responsible for a bad faith actor uploading such pornography to Commons. — Bilorv (talk) 21:47, 30 May 2022 (UTC)
    Considering the progress of legislation in the rest of the West, the EU could only have done worse in legislating the Internet. (Following the notorious GDPR and Article 13, which also caused consternation among those proclaiming to uphold the rights of internet users and consumers) I'm hoping the pattern of applying different criteria to for-profit monopolists on one side, and non-profits and small businesses on the other, will spread to other countries in coming years. Daß Wölf 07:18, 5 June 2022 (UTC)[reply]
    While I certainly share your position on Article 13, GDPR is flawed but brings a set of positives - it's by no means a one-way set of laws. However, its relevance to Wikimedia are more things like the right to be forgotten, which the GDPR recitals don't set a great deal of detail on. So it's looking like it's being defined by data commissioner pseudo case-law and actual case law. On that front I suspect we'll see more and more (attempted) incursions on the France and Germany side, and fewer from others (including the UK, where though no longer EU, it's obviously in the DPA2018) Nosebagbear (talk) 08:17, 6 June 2022 (UTC)[reply]