Marcel Winatschek

Someone Has to Watch

The Süddeutsche Zeitung investigated Facebook’s content moderation operation in Berlin and talked to the people doing the work. Most are Syrian refugees taking the only work they can get, because their qualifications don’t count here. They make just above minimum wage to spend shifts looking at reported posts they can’t preview. Beheaded bodies. Child abuse material. Nazi content. Whatever comes in the queue.

One moderator described a video she had to watch. A man, a small child, a butcher’s knife. She has a kid that age. She couldn’t finish the shift. Grabbed her bag and left in tears to catch the streetcar. That detail stays with me, not as a rhetorical point about why Facebook is bad, but as a fact about what one person had to carry with them.

The moderation rulebook is 48 pages long and contradicts itself, either deliberately or, worse, because no one thought through what the rules would do in practice. Violence against refugees gets deleted, but calling refugees animals usually doesn’t. Decapitated bodies often stay up. A breast gets your account suspended permanently. Moderators learn quickly that a wrongful deletion gets you blamed more than letting something terrible stay. So they let it stay.

The system isn’t broken. It’s working exactly as designed. You pay people desperate enough that they can’t refuse. You give them rules complicated enough that they’ll make predictable errors. You optimize those errors toward keeping content up. Everyone involved knows this. Everyone accepts it. For Facebook there’s no cost: a moderator who leaves just gets replaced. For the moderators, the alternative is no job, or something worse.

There’s a kind of exhaustion in understanding this clearly. Not anger, exactly. More like watching a machine function with absolute precision and realizing there’s no version of this that ends differently. The company benefits from the setup. The algorithm optimizes for engagement, which means amplifying rage and tribalism. The moderators need work. So the system persists, rotating new people through the Berlin offices to watch the same videos and make the same impossible choices.

What I can’t stop thinking about is the moment she decided it wasn’t worth it. When a video of a child and a knife made her realize some things cost more than money. That’s the human limit the system wasn’t designed to account for. Except it was, because once she quit, Facebook just hired someone more desperate. The system accounts for people breaking. It’s built in.