YouTube Advertisers Were Warned About Extremist Videos Months In Advance – And Did Nothing

Advertisers knew that their ads would appear next to extremist videos. Did they also know that uploaders were receiving payments from YouTube?

Several months ago, The Times published a damning report that would set off a firestorm. According to an investigation by the UK newspaper, ads from top advertisers appeared alongside extremist videos on YouTube.

The fallout was immediate. Advertisers, including Toyota, Johnson & Johnson, Verizon, and AT&T, boycotted the popular video platform. The UK government demanded an immediate response from parent company Google. The web giant bled millions of dollars in lost revenue. Content creators, in turn, quickly lost much-needed ad income supporting their channels.

No one could believe that Google and YouTube would allow extremist and hate-filled videos to remain active on the platform. Even worse, the uploaders of those videos were earning ad revenue from them.

But did you know that advertisers had been warned about this problem months earlier?

A massive fallout was predicted. So why didn't anyone listen?

Speaking at a Google press event, Stephan Loerke, Chief Executive of the World Federation of Advertisers (WFA), confirmed that advertisers had known about the problem a year and a half earlier.

According to Loerke, an expert had told the global trade association that ads "might run against extremist content" on YouTube. That warning came six months before The Times broke the story.

We brought an expert, the most knowledgeable person on the subject at the time, and after his presentation, there was just silence. The only question from the audience was 'what can we do so when this breaks we can say what we did?'

The WFA then released a report and briefed member companies on the warning. While many advertisers "took action," Loerke said that the issue ultimately "lost momentum."

The companies that were aware took action. When The Times story broke, none of the companies named were among those we briefed. When you don't have it in the newspapers, it seems quite abstract.

Loerke didn’t say which companies had received the warning and taken action.

Even prior to The Times' piece, Google and YouTube had long known about the problem and had actively worked on solutions. Yet YouTube's EMEA Video Strategy Director, Dyana Najdi, acknowledged that the company hadn't solved it "fast enough."

The volume of impressions [of ads that ran against extremist content] was so small… but it takes just one impression to lose an advertiser's trust. This was something we learned, and it is non-negotiable that we get this right.

Najdi added that Google and YouTube have since made “considerable progress.”

Since August, 83% of the content removed for violent extremism was taken down before a single human flag was raised. That's eight percentage points better than July. We're making progress quickly. Our systems have been refined to be more precise and more surgical in enforcing our policies.

Yet, not everyone is ready to advertise on YouTube again.

WFA members have actively worked with Google. Loerke admitted, however, that not every advertiser is ready to trust the company again.

We [WFA] have had frequent contact with Google, briefing us on steps they have taken in a tone of humility that was very welcomed by brands… I still know a number of companies that have not gone back.  Some others returned to YouTube but are demanding their agencies do all they can to control risk.

Last July, multiple companies and organizations told the Financial Times that they had yet to return to the platform.

Loerke concluded,

Ultimately, the question is, 'Is the risk totally eliminated?' It's hard to say that's even possible, and some companies are not prepared to take any risk.

Featured image by PragerU (YouTube screengrab)