Five Clicks
Five clicks. According to Matt Watson—a YouTuber who spent weeks mapping what he found—that’s all it takes to go from an ordinary search to deep inside a network where men leave timestamped comments on videos of children, marking the exact seconds at which a girl is in an exposed position or makes a movement they find useful. Not on a dark-web forum, not behind a paywall. On YouTube, running alongside ads for household brands.
Watson published a twenty-minute video in early 2019 laying out how the recommendation algorithm functions as a wormhole. Search certain terms, click a video of a young girl doing gymnastics or dancing in her bedroom, and the autoplay queue begins adjusting. Each click narrows the feed. The comments on these videos operated as a distribution network—men sharing timestamps, cutting the clips, uploading compilations to more obscure platforms. The children and their parents, who ran the channels, had no idea their footage was being circulated this way. The video was viewed nearly two million times before the wider press caught up with it.
Disney, Nestlé, and Epic Games—the studio behind Fortnite—pulled their advertising within days. It was the second such exodus. In 2017, the same pattern had surfaced—sparsely clothed children, predatory comment threads—and Adidas, Amazon, and Deutsche Bank paused their spend, YouTube pledged action, and the news cycle moved on. The problem evidently had not been fixed.
YouTube’s statement offered the standard language: abhorrent content, clear policies, immediate action, illegal activity reported to authorities, comments disabled on millions of videos featuring minors. Comments disabled. Not the videos removed. Not the recommendation logic restructured. The comments were the visible symptom, and they treated the symptom.
The deeper issue is structural. The recommendation algorithm is designed to maximize watch time, and watch time is watch time—the system has no mechanism for distinguishing between a predator spending four hours in a queue and anyone else spending four hours in a queue. Both register as good numbers. Fixing this requires accepting worse numbers for non-commercial reasons, and everything about how Alphabet has handled this, twice now, suggests it won’t do that until advertiser pressure makes the alternative more expensive than the fix.
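To make the structural point concrete, here is a minimal sketch of what an engagement-only ranking objective looks like. This is an illustration of the general technique, not YouTube's actual system; the function and field names are hypothetical. The point is visible in the code itself: the objective consumes a single number, so two sessions that produce the same watch time are indistinguishable to it, whatever their intent.

```python
# Illustrative sketch only: a toy engagement-ranking objective.
# Not YouTube's real system; all names here are hypothetical.

def rank_by_watch_time(candidates):
    """Order candidate videos by predicted watch time alone.

    The objective sees only the number; nothing about who is
    watching, or why, ever enters the calculation.
    """
    return sorted(
        candidates,
        key=lambda video: video["predicted_watch_seconds"],
        reverse=True,
    )

# Two four-hour sessions -- one benign, one predatory -- present
# identical inputs, so the objective ranks them identically.
session_a = [{"id": "v1", "predicted_watch_seconds": 14400}]
session_b = [{"id": "v1", "predicted_watch_seconds": 14400}]
assert rank_by_watch_time(session_a) == rank_by_watch_time(session_b)
```

Any fix that distinguishes the two sessions has to add a signal this objective does not consume — which is exactly the change that makes the headline numbers worse.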