What Facebook Chooses to Keep
For two weeks in January 2017, a video of a twelve-year-old girl hanging herself circulated online. Katelyn Nicole Davis had streamed it live on a platform called Live.me on December 30, 2016, a forty-minute clip that captured her death. In the stream, she said she had been sexually abused by a family member. The video went viral.
YouTube took it down when asked. Facebook didn’t. The company’s position was that the video didn’t violate its community standards, and technically they were right, because Facebook’s standards at the time could accommodate a child’s death but not, say, exposed nipples. The Polk County Police Department eventually posted on its own Facebook page asking users to remove any copies out of basic respect for Katelyn and her family.
The question that keeps nagging me isn't legal. It's structural. Facebook had built a system where you could react to a child's death with a like, a smiley face, a comment. Where the death of a twelve-year-old could accumulate engagement metrics. And when asked to take the video down, they pointed to the rulebook and shrugged.
There’s a particular ugliness in a corporation deciding that a child’s suicide doesn’t conflict with its values. It’s not quite indifference—it’s something more calculated. The shocking content keeps people on the platform. The grief and outrage keep people on the platform. Everything generates engagement. Katelyn Nicole Davis became, briefly, content.
The internet doesn’t forget. Facebook, in that moment, decided that was fine. That’s what the rulebook said. That’s what they went with.