The People Who Watch What You Refuse to See
Somebody has to watch it. Every piece of child pornography, every torture video, every beheading posted with gleeful commentary, every animal crushed under a boot for the entertainment of people you don’t want to believe exist—somebody has to look at it, assess it against a 48-page rulebook, and decide whether it stays or goes. That person earns slightly above minimum wage. In a lot of cases, that person is a Syrian refugee who couldn’t get their qualifications recognized in Germany and said yes to the only office job that would have them.
The Süddeutsche Zeitung published its investigation Inside Facebook: Im Netz des Bösen (In the Net of Evil) in late 2016, and what it described was a content moderation operation running out of bare offices in northern Berlin. The moderators were employed not by Facebook directly but by Arvato, a subsidiary of the German media conglomerate Bertelsmann. They sat in shifts—early and late, forty hours a week—working through a queue of flagged posts that refreshed continuously and offered no preview of what the next item would be. Reported animal abuse. Swastikas. Penises. A child.
"The rules were barely comprehensible," one worker told the paper, anonymously, as they all did. "I said to my team leader: this image is totally bloody and brutal, no human being should have to see this. But he just said: that's your opinion. You have to try to think the way Facebook wants you to think. We were supposed to think like machines."
The rules produced results that read like a dark satire of corporate logic. Decapitated bodies: usually fine. Violence against women, children, or animals: often fine. A naked breast: gone immediately, account suspended, possibly permanently. The people applying these rules weren’t monsters. They were workers on €1,500 gross per month who didn’t fully understand the framework they were implementing and knew they’d catch heat from their supervisors for over-deleting. So when something hovered in the grey zone, the incentive was to let it through.
As Max Hoppenstedt reported at Motherboard: "Some were excited before their first shift about working for the world's largest social network—today they're complaining that they weren't adequately trained. After they logged into a Facebook-owned processing platform, thousands of reported posts waited in a queue, but the workers never knew in advance what content would be behind the next ticket from the internal system. It was a random selection from the queue. Animal cruelty. A swastika. Penises."
These were the FNRP teams—the lowest tier in Arvato’s hierarchy. And many of them were Syrian refugees specifically because Syrian credentials weren’t being accepted by German employers. What do you say when Facebook—or the company contracted to Facebook—offers you a steady office job? You say yes. Even if you find out on the first shift what the job actually involves.
One worker described the moment she walked out. "There was a man with a child," she said. "A child of about three. The man sets up the camera. He picks up the child. And a butcher's knife. I have a child myself. Exactly the same age. It could have been mine. I'm not going to destroy my brain for this shit job. I turned everything off and just walked out. I picked up my bag and walked to the tram stop, crying."
The deletion rules Carsten Drees reported at Mobilegeeks captured the particular madness of trying to legislate human hatred at scale. A post calling for violence against refugees: deleted. A post comparing refugees to animals or calling them subhuman: deleted. But "Whites only"—framed as inclusion rather than exclusion—passes. "No Black people here" would be exclusion, and therefore not acceptable to Facebook. The 48-page rulebook couldn't account for every variation of human cruelty, and the people paid to apply it were told to think like machines in situations that would break most machines.
The deeper problem—the one that no moderation team, however large, can actually solve—is that Facebook spent years engineering exactly this outcome. Its algorithms rewarded the loudest, most extreme, most emotionally activating content. They built feedback loops that made outrage the dominant currency of the platform. The people shouting hardest got the most engagement, which told the algorithm they were producing valuable content, which pushed that content to still more people. The moderators in Berlin were deployed to manage the overflow from a system specifically designed to produce overflow.
Trying to clean that up with content teams is like fighting a forest fire with buckets of water. You put out a few flames. The fire doesn't notice. Mark Zuckerberg said his mission was to make the world more open and to connect people. What he built was the most efficient hate-amplification machine in human history, and then he hired refugees at barely above minimum wage to skim the worst of it off the top so the rest could keep running.
Simon Hurtz wrote in the Süddeutsche that the network currently contributes to reinforcing echo chambers and turning people against each other, and that Facebook needs to become more transparent, give users more control, and start discussing its problems publicly. Otherwise, he warned, 2017 will be even worse than 2016, and Zuckerberg's vision will fail.
That was the optimistic take—that there was still time. I don’t think there was, even then. Facebook had already made its choices about what kind of platform it wanted to be, and it had made them by building systems that made those choices automatically, at scale, with no human in the loop except the traumatized workers in a grey office in northern Berlin, trying to decide whether a particular image of a bleeding child fell within the guidelines.
At some point Zuckerberg is going to have to admit—publicly, not in a year-end reflection post—that he built something that made the world measurably worse. Not accidentally, not through negligence, but through the logical operation of systems designed to maximize engagement at the expense of everything else. The content moderation operation was a confession written in labor costs: we know what’s in here, and we’ve calculated that a team of traumatized low-wage workers is cheaper than fixing it. Facebook won’t be rid of the ghosts it called up. Guaranteed.