First, YouTube promised to work on demonetizing extremist and hate-filled videos. Now, the popular video platform has to deal with tens of thousands of active predatory accounts preying on innocent children.
Earlier this year, following a bombshell report by The Times of London, major companies pulled their advertising money from YouTube. The Times discovered that advertisements from the UK government, L’Oreal, and the Royal Navy, among many others, played alongside extremist, hate-filled videos.
Facing a severe backlash over the company’s failure to flag and take down questionable content, YouTube introduced several notable updates. One of them was a change in how the platform monetized videos.
Immediately following the change, hundreds of content creators complained about losing revenue. The platform, which reportedly pays around $0.0007 per video play, started demonetizing a lot of videos. Top YouTube stars dubbed the change ‘Adpocalypse’ and jumped ship to other websites, including Patreon. While still maintaining a channel on YouTube, PewDiePie opened a channel on Amazon’s Twitch platform. Controversial Singapore YouTube star Amos Yee told his followers,
“Now more than 50 percent of the current videos on my channel have been demonetized, and because future videos of mine will obviously deal with controversial political subjects and [have] vulgar language, there’s a very high chance that more than 50 percent of my videos will continue to be demonetized and I won’t make money [off] them.”
Despite the flood of complaints from advertisers and creators, YouTube said the company had made “considerable progress.” YouTube’s EMEA Video Strategy Director, Dyana Najdi, explained in an interview,
“Since August, 83% of the content that was removed for violent extremism was taken down before a single human flag was raised. This is eight percentage points better than July. We’re making progress quickly. Our systems have been refined to be more precise and more surgical in enforcing our policies.”
Now, following another devastating report, content creators have braced for a second ‘Adpocalypse.’ This time, it may be worse than the original.
Adpocalypse Now: Part II.
For months, parents have complained about questionable content on the company’s YouTube Kids app.
Unsuspecting children would watch videos in which popular cartoon characters like Peppa Pig were subjected to a violent dental session. In other videos, children would learn how to tie each other up and wear revealing clothing. They would also watch famous cartoon characters drinking bleach.
In another bombshell report, The Times reported last week that YouTube had monetized videos of young children by placing ads next to them. In one video, a scantily-clad girl would roll on a bed filled with teddy bears and other plush dolls. The videos would draw predatory comments from pedophiles.
One commenter wrote,
“Little girl, you are a wonder. I would like to kiss your fragrant panties.”
In addition, The BBC and The Times both found that the video platform has yet to shut down tens of thousands of predatory accounts. YouTube’s volunteer moderators told the BBC that “between 50,000 to 100,000 active predatory accounts still [exist] on the platform.”
Pedophiles would reportedly search for “certain keywords in Russian that can bring up hundreds of young Slavic girls.”
Advertisements for BT, Adidas, Cadbury, Deutsche Bank, eBay, Amazon, Mars, Diageo, and TalkTalk would appear right before the videos. Pedophiles would search for videos of young girls filming themselves “in their underwear, doing the splits, brushing their teeth, or rolling around in bed.”
Following the reports, major advertisers lambasted the popular video platform.
A spokesperson for Mars said,
“We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content. We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally. Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.”
In a statement, supermarket chain Lidl said,
“It is completely unacceptable that this content is available to view and it is, therefore, clear that the strict policies which Google has assured us were in place to tackle offensive content are ineffective. We have suspended all of our YouTube advertising with immediate effect.”
Deutsche Bank added,
“We take this matter very seriously and suspended the advertising campaign as soon as we became aware of it.”
A Cadbury spokesperson said,
“Whilst we investigate this matter we have suspended all advertising on the channel until we have clarity from YouTube on how this situation occurred and are satisfied that an acceptable solution has been put in place.”
YouTube has since commented about the massive advertiser fallout. Vice President of Product Management, Johanna Wright, promised that the platform would better police questionable comments.
“Comments of this nature are abhorrent and we work with NCMEC to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
With the huge fallout over sexualized videos of minors, expect YouTube to introduce yet more changes. As with the changes following the first ‘Adpocalypse,’ the move will likely harm content creators once again.
Featured image by _Gaspard_ (CC BY 2.0)