Despite its promises, YouTube has failed to take down hate-filled and extremist content in an adequate time-frame.
In a heated exchange with another gamer, YouTube star PewDiePie shouts the n-word. An underage girl is forced to undress in front of the camera. A model describes being bullied for having dark skin.
What’s the difference between these three YouTube videos? The first two were monetized; the third was not.
Several months ago, YouTube vowed to demonetize videos with hate-filled content. Last month, the company also promised to introduce a system that would make it easier for content creators to contest decisions on videos deemed offensive.
YouTube has yet to deliver on these promises.
While the company demonetizes not-for-profit projects, child exploitation videos still earn ad revenue.
In an exclusive report, The Sun found that the platform has placed ads next to child exploitation videos.
One video shows a young Russian girl forced to undress. An adult dressed in a killer clown costume then proceeds to inject her with an oversized needle.
In another video, the clown continually drugs a child before tying him up.
According to The Sun, the clips were dramatized. Yet, they still appeared “extremely disturbing.”
Google took down the videos only after The Sun reported them.
Matan Uziel brought the videos to The Sun’s attention. He founded the Real Women Real Stories channel on YouTube. He designed the not-for-profit project to promote awareness of the “often unseen hardships women face in different professions and places.”
On Matan’s YouTube page, you can find videos of women discussing various hardships they face daily.
In one video, a black model speaks out against skin bleaching. The channel also features a video from famous Mexican actress Kate del Castillo, who discusses physical abuse in marriage.
As part of its demonetization campaign, YouTube removed ads from Real Women Real Stories’ videos.
Uziel told The Sun Online,
“To say that sexual abuse and rape is ‘out’ is terrible for women who want to bring their stories forward and can’t because we simply don’t have the money.
“YouTube was our only source of income and now our channel is nearly dead.”
In March 2016, the channel earned around $2,149. Last June, it received an estimated $11.
In a blog post, a spokesman for Google wrote,
“To help creators better understand how they can make money from their stories, we have publicly available guidelines which explain the types of videos where we don’t allow ads to run.
“If a creator feels their videos should be monetized, we also provide a simple form for them to appeal the decision.”
It appears that YouTube has yet to follow through on this, at least with Matan Uziel.
You can flag terrorist and neo-Nazi content on the platform. Yet YouTube may not take it down, at least not right away.
To fight online terror, Google’s General Counsel, Kent Walker, pledged that the company would take four steps against extremist content.
“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution.”
He also made a bold claim.
“We have used video analysis models to find and assess more than 50% of the terrorism-related content we have removed over the past six months.”
Yet, you can still find hundreds of extremist and neo-Nazi videos on the platform. That’s according to The Henry Jackson Society, a UK think tank.
Researchers found one video titled ‘Adolf Hitler Was Right.’ The video readily praised the Nazi leader. It also showed Jewish families taken to concentration camps.
Researchers also discovered a video of a child singing in the background as footage exalted terrorism. Another video shows a Muslim teenager being attacked. The man who attacked the teen called him “Isis scum.”
In addition, The Henry Jackson Society easily found Taliban propaganda on YouTube.
More troubling still, users had already flagged many of the terrorist and neo-Nazi videos. The platform simply hadn’t taken most of them down.
Labour MP Yvette Cooper, chair of the Home Affairs Select Committee, commissioned the study. She denounced the content on the video platform and called the delay in taking down offensive videos “unacceptable.”
Speaking with The Independent, Cooper said,
“Whether that’s Islamic extremism or far Right extremism, the reality is that this material is far too easy to access.
“We know social media can play a role in the radicalization of young people, drawing them in with twisted and warped ideology.
“YouTube have promised to do more, but they just aren’t moving fast enough.”
According to The Independent, the UK think tank came across sixty-one flagged far-right videos and sixty Islamist videos. As of this story’s publication, YouTube had removed dozens of them.
Speaking with The Independent, Dr. Alan Mendoza, executive director of the Henry Jackson Society, underscored the gravity of allowing terror and neo-Nazi content on the site. He said that the internet had shaped offenders’ engagement with extremism.
“These ideologies can be freely disseminated and amplified online and there is room for improvement by technology firms to provide spaces to expose and debate their inconsistencies.”
So, should platforms like YouTube receive punishment for not quickly taking down extremist content? Yvette Cooper believes so. She called on the UK government to introduce “proper penalties and fines for social media companies who do not act swiftly enough to remove dangerous and illegal content.”
Cooper isn’t alone. The Policy Exchange, another UK think tank, found that over 75% of the British public want internet companies to do more “to find and delete content that could radicalize people.”
The Policy Exchange also found that people in the UK accessed jihadi content more frequently than elsewhere in Europe. Globally, the country ranked in fifth place behind Turkey, the US, Saudi Arabia, and Iraq.
YouTube defended itself against these claims. A spokesperson for the company said,
“Through new uses of technology, the majority of videos we removed for violent extremism over the past month were taken down before receiving a single human flag. We’re doing more every day to tackle these complex issues.”
The company also vowed to be “part of the solution” to extremism.
Image by The Israel Project (CC BY 2.0)