Under heavy governmental pressure, Google has agreed to drastically modify its search algorithms to block child pornography. Which of course raises the question: why can't Google also modify its algorithms to block infringing content?
Google, as part of a joint initiative announced today with Bing, will systematically block 100,000 search terms linked to child pornography. But this is more than just a bunch of blocks; it's a massive change to the core search algorithm. According to Google chairman Eric Schmidt, Google has deployed 200 engineers to identify and systematically block offending content. That includes finding search terms that aren't known to the general public, and developing unique algorithms and technologies to prevent offending content from being indexed in the first place.
Beyond that, Google will use a team of human editors to manually sift through questionable content and make judgment calls. “There’s no quick technical fix when it comes to detecting child sexual abuse imagery,” Schmidt told the Daily Mail. “This is because computers can’t reliably distinguish between innocent pictures of kids at bath time and genuine abuse. So we always need to have a person review the images.”
“Once that is done – and we know the pictures are illegal – each image is given a unique digital fingerprint.”
Results for offending searches on both of these sites will prominently feature warnings and resources to get help, with the remaining results reserved for related news stories (including prosecutions of offenders). The changes, which are happening alongside a massive monitoring initiative by both British and American authorities, will initially roll out in the UK before expanding rapidly to other English-speaking Google sites.
After that, Google will roll out the blocks to roughly 150 different languages, which means a combination of automated algorithms and human review will ultimately be deployed across the entire global network.
This is a massive effort that also involves the proactive blocking of questionable images, as well as videos on YouTube. “It will be much harder to find that content on both Bing and Google,” Microsoft executive Nicola Hodson told the BBC. “We are blocking content, removing content, and helping people to find the right content and also sources for help if they need it.”
On the topic of videos, Schmidt also discussed an overhaul in detection and automatic takedowns related to problematic content. “But pedophiles are increasingly filming their crimes,” the chairman noted.
“So our engineers at YouTube have created a new technology to identify these videos.”
On the topic of why this is not happening with other forms of questionable content, such as blatantly infringing content, Google communications director Peter Barron noted that child imagery is unquestionably illegal, and therefore not in a gray area like other forms of content such as general photographs, music, film, or adult films.
“We’re agreed that child sexual imagery is a case apart, it’s illegal everywhere in the world, there’s a consensus on that. It’s absolutely right that we identify this stuff, we remove it and we report it to the authorities.”