Marcel Winatschek

One Saturday in Berlin, Against the Algorithm

The sad truth about internet activism is that nothing changes because you refreshed your feed for the seven hundredth time. Article 13—the EU copyright reform that would have forced most platforms to implement upload filters on user-submitted content—was the rare issue that felt worth actual physical presence, which is how I ended up standing near the Axel-Springer building on a Berlin street on the morning of March 2, 2019, holding a sign.

The march, organized by the coalition Berlin gegen 13, ran from the corner of Rudi-Dutschke-Straße and Lindenstraße through the government district to the EU Commission’s Berlin offices near the Brandenburg Gate. The symbolic geography was legible: publishing conglomerate, national government, European institution. The argument was more complicated than the meme-centric framing often suggested, but the core of it was simple enough: upload filters can’t read context.

YouTube’s Content ID system, which operates on similar logic and which the law would have effectively mandated for every platform above a certain size, had already demonstrated the failure mode: tens of thousands of videos documenting war crimes were blocked because they contained IS flags. The filter sees the flag. It doesn’t read the sentence the flag appears in. It can’t distinguish between a copyright violation and a creative adaptation, between terrorist footage and journalism about terrorism. Automated review at the scale of the internet produces automated error at the same scale.
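The context-blindness is easy to make concrete. A toy sketch, nothing like the real Content ID pipeline and with every name invented for illustration, but faithful to the shape of the problem: the filter matches fingerprints, and a fingerprint carries no information about why it appears.

```python
# A deliberately naive upload filter: block anything whose content
# fingerprints intersect a blocklist. All identifiers here are
# hypothetical placeholders, not real system names.
BLOCKLIST = {"is_flag_fingerprint"}

def review(upload_fingerprints: set) -> str:
    """Return a verdict based only on which fingerprints are present.
    The filter never sees the surrounding context."""
    if upload_fingerprints & BLOCKLIST:
        return "blocked"
    return "allowed"

propaganda = {"is_flag_fingerprint", "recruitment_audio"}
war_crimes_report = {"is_flag_fingerprint", "journalist_voiceover"}

print(review(propaganda))        # blocked
print(review(war_crimes_report)) # blocked -- identical verdict, because
                                 # the journalism differs only in context,
                                 # which the filter cannot represent
```

Both uploads get the same verdict because the only signal the filter consumes is the fingerprint set; distinguishing them would require understanding what the footage is *for*, which pattern matching at this layer cannot do.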

The creators the law claimed to protect weren’t going to benefit. The financial logic of the related press publishers’ right directed revenue toward the large publishing conglomerates, not toward the writers and artists who actually made the work. Smaller platforms—unable to afford compliant filter infrastructure—would either outsource to Google, consolidating the power the law ostensibly opposed, or shut down entirely. The net effect would be fewer independent platforms and more concentrated control. Which is precisely the outcome everyone involved claimed to want to prevent.

What stuck with me then, and still does, is the infrastructure argument. A filter built to enforce copyright is still a filter. What it filters is always a political question, and that question gets asked again. The EU’s planned regulation against terrorist content was already in draft. The pipes are neutral; what flows through them is not. The people who built systems of automated content removal in the name of intellectual property don’t get to be surprised when those systems find new applications.

Article 13 passed anyway—modified in the final text, rebranded as Article 17 in the Copyright Directive, signed that April. Some Saturdays you go outside knowing it probably won’t fix everything. You know it won’t, and you go anyway.