YouTube is losing advertising from video game maker Epic Games, Disney, and other companies because of ads appearing alongside videos of children being exploited by pedophiles in the comments.
YouTube told Ars that it has taken action against users violating its policies this week, including by terminating more than 400 channels, deleting accounts, and disabling comments on tens of millions of videos.
“‘All Nestle companies in the US have paused advertising on YouTube,’ a spokeswoman for the company said Wednesday in an email,” Bloomberg reported yesterday. “Video game maker Epic Games Inc. and German packaged food giant Dr. August Oetker KG also said they had postponed YouTube spending after their ads were shown to play before the videos. Disney has also withheld its spending, according to people with knowledge of the matter, who asked not to be identified because the decision hasn’t been made public.”
“Wormhole into a soft-core pedophilia ring”
The companies pulled advertising days after YouTuber Matt Watson posted a video detailing what he calls “a wormhole into a soft-core pedophilia ring on YouTube.”
“YouTube’s recommended algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual CP [child pornography] in the comments,” Watson reported. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”
To access the “wormhole,” Watson first searched for “bikini haul” and then clicked on recommended videos in YouTube’s “Up next” section. Within a few clicks, he was led to videos of young girls. The initial videos weren’t meant to be pornographic, but users in the comments section posted timestamps to “the points in the video where little girls are in compromising positions,” Watson explained.
“Once you enter into this wormhole, there is no other content available,” Watson said in his video, which has racked up nearly two million views. “YouTube’s algorithm is glitching out to a point that nothing but these videos exist. So this facilitates the pedophiles’ ability to find this content, but more importantly… trade social media contact info, and I have also found links to child pornography. And of course, there is advertising on some of these videos.”
An extensive story published yesterday also provided details on the problem.
“Videos of children showing their exposed buttocks, underwear and genitals are racking up millions of views on YouTube—with the site displaying advertising from major cosmetics and car brands alongside the content,” the report said. “Comments beneath scores of videos appear to show pedophiles sharing timestamps for parts of the videos where exposed genitals can be seen, or when a child does the splits or lifts up their top to show their nipples… The videos are also being monetized by YouTube, including pre-roll adverts from Alfa Romeo, Fiat, Grammarly, L’Oreal, Maybelline, Peloton and SingleMuslims.com. Banner advertising for Google and the World Business Forum also appeared alongside some of the videos.”
YouTube details actions against users
When contacted by Ars, a spokesperson for YouTube said, “Any content—including comments—that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly.”
YouTube, which is owned by Google parent Alphabet, said that in the past 48 hours it has removed thousands of inappropriate comments on videos of minors, terminated more than 400 channels for comments they left on videos, reported illegal comments to the National Center for Missing & Exploited Children, and removed dozens of videos that put young people at risk despite being posted with innocent intentions.
In the meantime, YouTube continues to face pressure from advertisers. “Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service,” Epic Games said, according to CNBC.