What Facebook Keeps
In late December 2016, a twelve-year-old girl named Katelyn Nicole Davis livestreamed her suicide for forty minutes on Live.me, a livestreaming app. In the video she described abuse by a family member. Her family had the stream taken down, but copies were already spreading to other platforms, including Facebook. YouTube removed them when asked. Facebook refused.
For weeks the video circulated on Facebook: shared, reposted, tallied in likes and emoji reactions. Her parents kept asking Facebook to take it down. Other users reported it. Facebook said the video didn't violate its community standards. It wasn't technically illegal, the company said, as if that were the question.
What stays with me is the contradiction. Facebook bans topless pictures. It will remove a woman's breasts in the name of community standards. But a video of a girl killing herself? That's content Facebook is comfortable with. That can spread. That gets counted as engagement.
I keep thinking about the people who made that decision. They weren't being cruel. The system had simply organized itself so that nudity registered as a bigger problem than suicide footage. By the policy's own logic, the decision was consistent. That's what's unsettling about it.
The police posted on Facebook asking people to stop uploading the video, appealing to respect for Katelyn. But the platform actively hosting it didn't have to listen.