Every Upload Is Guilty Until Proven Otherwise
Someone always wants to break the internet. This time it's the European Union, which was moving to vote on whether every upload to social networks, YouTube, and blogs would need to pass an automated copyright check, and be deleted if it failed. The proposal centered on Article 13 of a new Copyright Directive, and its scope was considerable.
Under the revised Article 13, sharing someone else's image on Facebook or Twitter becomes legally fraught. Memes are suspect. Photos of current events that you didn't personally shoot are suspect. Any video using music, footage, or ideas that might belong to a rights holder: gone. Remixes are off the table entirely. The liability would fall not just on the individual poster but on the platforms themselves: sites like YouTube, WordPress, or Bandcamp would be required to build automated filters to catch violations before upload, or face the consequences alongside their users.
The knock-on effects reach into unexpected places. Open-source developers contributing to projects like Firefox, Kodi, or VLC through GitHub or GitLab could find their code flagged and removed by automated systems that can’t distinguish between a copyright violation and collaborative software development. Free software stops being free in a fairly concrete sense.
The ostensible original aim of the upload filter proposal was anti-terrorism—which is always the entry point for internet control legislation. Terrorism and child exploitation are the two subjects that guarantee nobody votes against you publicly. But the scope crept, as scope always creeps, until it stopped being about terrorist content and started being about everything. This is how these things go.
And before anyone says that creators deserve control over their work: yes, some do, and some exercise that control reasonably. But some don't. There are companies that have filed copyright claims against recordings of white noise, and others against literal silence. Automated systems enforcing rights claims don't have judgment; they have pattern matching, and the false-positive rate in a world where everything gets filtered would be enormous. Photographers, journalists, musicians, writers, developers: all operating under the constant possibility that something they made gets caught in a net and deleted, with the burden of proof reversed.
For anyone making things and putting them online—and that’s what this journal has been for twenty years—this isn’t abstract policy. It’s a direct threat to the conditions that make the work possible. That’s the version of the internet Article 13 was trying to build: one where the default is suspicion, and you publish at your own risk.