Marcel Winatschek

Fifty Layoffs Later, Web 2.0 Is Finally Dead

For about a decade, a whole class of people told anyone who would listen that social media was the future of everything—that your own website was an embarrassing anachronism, that the only sensible use of editorial skill was producing content optimized for Facebook’s algorithm, Twitter’s feed, and Snapchat’s story format. These people had job titles. They had conference slots. They had graphs.

In early 2018, Vox Media—the American media group that publishes The Verge, Polygon, and Eater—laid off 50 social media employees. CEO Jim Bankoff cited “significant changes in the industry over the past few months.” The changes he meant: Facebook had stopped surfacing publisher content in favor of posts between actual people; Twitter had become functionally a Donald Trump solo performance with comments enabled; and Snapchat had launched a redesign so universally despised it was accelerating the platform’s death rather than preventing it. Snapchat’s mistake was the same one Digg made in 2010—forcing a redesign that alienated the core user base, then acting surprised when those users left and didn’t come back. Web 2.0 was officially over.

The Facebook change was the structural one. Zuckerberg’s decision to deprioritize publisher pages wasn’t magnanimity. It was an admission that Facebook had become a content delivery system rather than a social network, and that this had made the platform worse. Publishers who had rebuilt their entire traffic model around Facebook’s reach discovered—some of them not for the first time—that building on someone else’s platform is not a strategy. It’s a loan with no fixed terms and no warning before the margin call.

Vox’s layoffs were not going to be the last. Every media organization that had restructured around social distribution was going to have to take the same hit. The careers built on being a “social media expert” at a digital publisher were suddenly very exposed. The people who had spent years telling journalists that their personal websites were irrelevant were about to find out that the audience they’d herded onto someone else’s platform wouldn’t follow them anywhere once the platform changed its mind.

I’ve been writing online for a long time—longer than Facebook in its current form, longer than Twitter, longer than Snapchat. I watched every one of these platforms get crowned as the inevitable future of human communication, and then watched each of them eat itself from the inside. The pattern is always the same: explosive growth, platform lock-in, monetization pressure, degraded experience, exodus. Repeat. You’d think people would stop appointing saviors.

The immediate alternatives weren’t inspiring. YouTube was dealing with collapsing creator revenue after a brand-safety crisis, with monetization thresholds suddenly raised in ways that wiped out smaller channels without warning. Twitch was dealing with an increasingly arbitrary enforcement culture that was pushing streamers away. Medium, the platform that was supposed to reinvent long-form writing for the internet age, had already closed offices and laid off staff. Blogs were dead again, apparently. They’re always dead again.

The younger audiences, the ones these platforms spent years claiming to serve, had already scattered—into Instagram Stories, private Discord servers, WhatsApp group chats. Or they’d done something more radical: genuine digital withdrawal, streaming-only attention, a phone with three apps that aren’t social media. Hard to blame them. The proposition of being a captive audience for a platform that will change its terms whenever convenient isn’t attractive once you see it clearly.

There’s no reliable prediction to make about what comes next, and that’s probably the right outcome. We’ve exhausted our supply of platforms to anoint as inevitable. The sensible move now is just to do whatever you actually want to do—stream games, write about obscure topics, build something for a small audience—without the illusion that you’re riding a wave rather than just making something. The death of Web 2.0 felt like relief from where I was standing. The internet was briefly, again, a place where making something for its own sake was reason enough. I just hope we don’t spend the next decade running the same experiment in a different app with a new set of graphs.