Google Is Now Blocking 100,000 Search Queries Related to Child Pornography…


Under heavy governmental pressure, Google has agreed to drastically modify its search algorithms to block child pornography.  Which of course raises the question: why can’t Google also modify their algorithms to block infringing content?

Google, as part of a joint initiative announced today with Bing, will systematically block 100,000 search terms linked to child pornography.  But this is more than just a list of blocked terms; it’s a massive change to the core search algorithm: according to Google chairman Eric Schmidt, Google has deployed 200 engineers to identify and systematically block offending content.  That includes finding search terms that aren’t known to the general public, and developing unique algorithms and technologies to prevent offending content from being indexed in the first place.

Beyond that, Google will use a team of human editors to manually sift through questionable content and make judgment calls.  “There’s no quick technical fix when it comes to detecting child sexual abuse imagery,” Schmidt told the Daily Mail.   “This is because computers can’t reliably distinguish between innocent pictures of kids at bath time and genuine abuse. So we always need to have a person review the images.”

“Once that is done – and we know the pictures are illegal – each image is given a unique digital fingerprint.”
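The “digital fingerprint” Schmidt describes is hash matching: once a human reviewer confirms an image is illegal, its fingerprint is stored, and any future copy can be matched and blocked automatically.  As a rough sketch only — production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, not the exact byte-level hash shown here, and all names below are hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Naive exact fingerprint: a SHA-256 digest of the raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of fingerprints confirmed illegal by human reviewers.
blocklist = {fingerprint(b"reviewed-offending-image")}

def should_block(image_bytes: bytes) -> bool:
    """Block any exact byte-for-byte copy of a previously reviewed image."""
    return fingerprint(image_bytes) in blocklist
```

An exact hash breaks the moment an image is recompressed or resized, which is precisely why real-world systems rely on perceptual hashing instead.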

Results for offending searches on both search engines will prominently feature warnings and resources for getting help, with the remaining results reserved for related news stories (including prosecutions of offenders).  The changes, which are happening alongside a massive monitoring initiative by both British and American authorities, will initially roll out in the UK before expanding rapidly to other English-speaking Google sites.

After that, Google will roll out the blocks in roughly 150 different languages, which means a combination of automated algorithms and human review will ultimately be deployed across the entire global network.

This is a massive effort that also involves the proactive blocking of questionable images, as well as videos on YouTube.  “It will be much harder to find that content on both Bing and Google,” Microsoft executive Nicola Hodson told the BBC.  “We are blocking content, removing content, and helping people to find the right content and also sources for help if they need it.”

On the topic of videos, Schmidt also discussed an overhaul of detection and automatic takedowns related to problematic content.  “But pedophiles are increasingly filming their crimes,” the chairman noted.

“So our engineers at YouTube have created a new technology to identify these videos.”

On the topic of why this is not happening with other forms of questionable content, like blatantly infringing material, Google communications director Peter Barron noted that child abuse imagery is unquestionably illegal, and therefore not in a gray area like other content such as general photographs, music, film, or adult films.


“We’re agreed that child sexual imagery is a case apart, it’s illegal everywhere in the world, there’s a consensus on that. It’s absolutely right that we identify this stuff, we remove it and we report it to the authorities.”


23 Responses

  1. Anonymous

    “Which of course raises the question: why can’t Google also modify their algorithms to block infringing content?”

    And the answer is:

    It can! And it’s going to — if it wants to survive. The discussion is over, there are no excuses left. Google has proven once and for all how easy it is for them to block obvious criminal content.

    It is wonderful that the company finally takes the first step away from its past as the world’s leading portal to organized crime.

    Now it has to go all the way.

    • Anonymous

      … and the good news for Google is that it will be much easier to stop the pirates than it was to stop the pedophiles:

      Where Google had to hire human editors to evaluate the huge amount of gray area material related to child abuse, almost every move against piracy can be automated.

      Because Google already knows every single pirate site in the world from the millions of takedown notices it receives from its victims each month.

  2. phil

    The criteria to be used to classify words and pictures as pedophilic should be published and time allowed for the public to comment.

  3. TuneHunter

    It is very nice of them – this will drastically reduce new arrivals in American jails!

    • TuneHunter

      “…image is given a unique digital fingerprint.”

      …and yes we also need that custom digital license plate for every tune in the air!

      Actually, off the air will be equally productive – the tune hog/wacko can become a mini-music market and peddle tunes to his social network for free-music credit that will extinguish his current immoral behavior!

      Discovery Moment Monetization with real human in the loop!

      Wow always something new – I hope it is covered under Discovery Moment Media Monetization patent pending.

  4. jw

    Which of course raises the question: why can’t Google also modify their algorithms to block infringing content?

    Don’t play dumb, Paul. You’re smarter than this kind of crap.

    These photos are flagged, then reviewed & certified “pedophilia,” then given a fingerprint & removed from Google search results. There is no such thing as a fair use or licensed use of pedophilia, so they can be cataloged & removed. You can’t just catalog & remove songs or videos from the internet, it’s a licensing issue. You can’t hire people to determine whether the use of a song or video is licensed or unlicensed, nor can you automate that process. It’s far more complicated.

    This is very simple. It’s the difference between the content itself being illegal & specific uses of content being illegal.

    • David

      Google would not be ‘removing infringing material from the internet’, any more than they will be removing child porn from the internet. In both cases they would just be removing search results. Google can remove any search results they want to. They do not have to be satisfied ‘beyond a reasonable doubt’ that the material is illegal. As I’ve pointed out before, in the case of spam Google are quite willing to delete search results for known spam-producing sites even if only a ‘large fraction’ of their content is spam. They could perfectly well do the same with notorious pirate sites if they wanted to, using uncontested DMCA takedowns to identify them. Not difficult at all – they just don’t want to do it.

      There might perhaps be a misconception arising from the DMCA process for counter-notices. If a service provider receives a counter-notice, they are immune from any liability for having taken down the material that is the subject of the counter-notice, provided they restore it within a certain period. This might create the misconception that Google is somehow legally *obliged* to restore such material. But this does not follow at all. Nobody has any legal (statutory or contractual) entitlement to have material included in search results in the first place (unless of course it is an advert they have paid for), so Google is at no risk if it removes search results in error, or decides not to restore contested search results. It is purely a matter of Google policy that where copyright infringement is concerned, they will only do the bare minimum necessary to give them ‘safe harbor’ protection.

      • jw

        I’m simply making the point that removing pedophilia & removing copyright infringement are two completely different things. Pedophilia is content, whereas copyright infringement is a behavior, & the two require different methods of policing. The argument that if they can do one then they can therefore do the other holds no water, & no reasonable debate about copyright infringement can be had if the subject of pedophilia is going to be a part of it.

        You’re correct in that Google isn’t legally obligated to list any site, but it’s also not legally obligated to do anything more than respond to DMCA takedown notices as it currently does. If you’re going to justify policy based on legal technicalities, Google can just the same justify the path of least resistance. There has to be more thought behind it than that.

        You seem to be advocating a guilty until proven innocent doctrine, & I’m not sure that’s the way to handle it. Anyways, the content owners have proven time & again that they can’t handle that kind of responsibility. Remember root kits? Remember payola? There has to be some sort of intermediary keeping them honest.

        But this really isn’t the place for that conversation. The only point that needs to be made here is that pedophilia is content & copyright infringement is behavior, & you can’t automate the policing of behavior.

        • Anonymous

          “it’s [Google] also not legally obligated to do anything more than respond to DMCA takedown notices as it currently does”

          Nonsense — Google limits the number of daily takedown requests of stolen material to 250,000.

          This is an obvious violation of the DMCA and probably Google’s biggest mistake ever.

        • David

          I’m not arguing that pedophilia and copyright infringement are on a par, just that Google’s own stated reasons for not doing anything about copyright infringement are spurious. Everyone will agree that pedophilia is more serious than copyright infringement, but many people would also agree that copyright infringement is more serious than spam, where Google takes a much tougher line. And incidentally, almost every country in the world has laws against copyright infringement, so there is in fact ‘a consensus on that’.

          Your distinction between ‘content’ and ‘behavior’ isn’t clear to me. In the case of both pedophilia and copyright infringement there is ‘behavior’ which leads to the existence of ‘content’ on the internet.

          Do I want an approach of ‘guilty until proven innocent’? Yes, in cases where a website (etc) has already been found ‘guilty’ (through the DMCA process) many times. And I note also (I’m not sure how often I will have to repeat this!) that in the case of spam it is the approach Google itself takes, except that there is no DMCA process, just a private decision by Google (or its algorithms) that a site is ‘spammy’.

          I’m not justifying an approach by ‘legal technicalities’. Google itself has strongly resisted changes to legislation, such as SOPA, partly by arguing that it would be better to look for voluntary solutions. So they accept in principle that they should take action which goes beyond what is strictly necessary under current law, but when it comes to actually doing anything they don’t deliver.

    • Anonymous

      “There is no such thing as a fair use or licensed use of pedophilia, so they can be cataloged & removed”

      No, but there’s a huge amount of gray zone material consisting of family related material, art, news media etc. — which is why Google hired human editors.

      “You can’t hire people to determine whether the use of a song or video is licensed or unlicensed”

      Sure you can, 🙂 but the good news is that Google won’t have to spend a single cent on that. Because everybody — including Google — has had the required information for years. Check Google’s ‘transparency report’, lol.

  5. bjkiwi

    It seems to be all about PR and simple marketing equations to me.
    the public wants child porn removed, so google removes it .. the public wants spam removed, so google removes it .. the public wants free music and movies and doesn’t give a shit about content owners rights, so google leaves it alone.

    They only purport to care about what the public thinks because that’s what their advertising clients care about .. whatever the public finds acceptable they’ll run with, and vice-versa .. no different than old school print, radio or television .. morals and ethics don’t come into it, just like the banking system, food production or politics.

    It’s just the ruthless nature of capitalism .. hey, don’t blame me I’m just the piano player.

    • Visitor

      That’s a good point. The PR battle. It’s difficult to get the general public concerned enough about copyright infringement to really pressure Google or the government to do much about it. In many people’s minds, copyright infringement is sort of like double parking. If you ask them, they’ll say it’s wrong and people should be fined, but they won’t call for the big guns to come out.

      Most parents of young teens I know would freak out if they found some sort of obnoxious pornography on their kids computer but probably breathe a sigh of relief if all they found were some infringing mp3’s of a band they’ve never heard of.

      At least little Johnny is just smoking grass and downloading some Justin Bieber mp3’s. I was worried he might be hanging around with pedophiles, shooting heroin and stealing cars. He’s a good boy.

      • Anonymous

        The good news is that people don’t want piracy.

        If they did, they would vote for the pirate parties. But they don’t.

        • bjkiwi

          “The good news is that people don’t want piracy”

          hmmm … not sure what rock you’ve been hiding under, but actions speak far louder than words .. people may SAY they’re against piracy, especially when it’s THEIR stuff being pirated, but the history of public use of the internet tells a much different story.

          people say they don’t want theft and crime either .. but that doesn’t stop mass looting after a storm, or insider trading or corruption in sport etc etc etc…

          • Anonymous

            “actions speak far louder than words”

            Exactly. Elections are the most fundamental actions that shape society, and the vast majority of voters around the world vote against piracy.

            If they wanted piracy, they would vote for their local pirate. That’s how stuff works in a democracy.

            Now, you may not like democracy, but you know what they say about the alternatives.

        • Visitor

          Most people don’t vote at all. The average citizen in the US thinks a pirate party is dressing up with an eye patch and wooden leg and drinking rum and diet coke until they puke on themselves.

          • Anonymous

            “Most people don’t vote at all”

            Wrong — most (more than 50%) eligible voters in the US do vote. The number is higher in a lot of other countries.

            Still, the vast majority just don’t vote for pirates.

            Wonder why… 🙂

  6. Daniel

    Wait. Why is it Google’s responsibility to remove pages that have copyright infringement? And how would they do that across the board? Every time someone plagiarizes someone else, they should wipe that too, right? All examples of illegal activity should be removed from search. Google shouldn’t link to pages that defame someone. And they should know how to find this content before there is a complaint. Perhaps we need to make sure no content that encourages trespassing or jaywalking comes up in Google searches. And certainly, Google should make certain that no threats are linked to in searches.

    Oh yeah, and Google should remove sites that censor people.

    Tools. So many music industry tools.

    It’s child abuse, and it’s very damaging. And it’s not on par with your complaints. It’s disturbing that anyone would fail to see this.

    • Anonymous

      “Why is it Google’s responsibility to remove pages that have copyright infringement? And how would they do that across the board?”

      Don’t be silly, it’s easy for Google to go legit.

      How long did it take YouTube? A year?

      After 15 years of international protests, and a long time after their competitors, Google finally decided to stop their service for pedophiles. Don’t think for a second that the world is going to wait another 15 years for them to stop their piracy.

      The world just can’t afford the current version of Google anymore.

      • bjkiwi

        I agree with Daniel.

        as for Anonymous, wake up and smell the rubbish! .. it’s easy to stop the fighting in the Middle East too, just put the guns down and be nice to each other.
        “The world just can’t afford the current version of Google anymore” .. really?? I think you’ll find the opposite is true

  7. Kelly

    This is heading in the right direction. Isn’t there some way for the internet to recognize child porn, automatically contact the police, and trace where it came from?

