Google, Microsoft child pornography search omissions only 'a part of the fight'

Tech giants Google and Microsoft have made headlines for cleaning up search results to deter searchers from child pornography sites. However, the move highlights the often contentious choice to censor online content, and is likely only a small part of the fight against child exploitation.


November 19, 2013

Google and Microsoft took unprecedented steps this week to combat child exploitation, announcing that both companies would scrub results for more than 100,000 search terms that could lead to child pornography. The two companies will also show a warning message and helpline number for an additional 13,000 terms, in hopes of deterring further searches for illicit content.

"While no algorithm is perfect – and Google cannot prevent pedophiles adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids," writes Google executive chairman Eric Schmidt in a Daily Mail op-ed.

Despite widespread public disdain for child pornography, omitting search terms is a marked step away from the free Internet Mr. Schmidt has often championed. However, this isn't the first time that Google or Microsoft has edited its algorithms to filter offensive content, and it follows a pattern of whack-a-mole responses to savvy "dark Web" content creators.


In 2010, Google released Google Instant, the now standard search function that predicts what users are likely to search and offers a results page as they type their query. However, since a word such as 'assimilate' brings up wholly different results depending on whether you type just the first three letters or all ten, Google set aside a set of contentious words for which users must hit 'search' before seeing any results.
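In rough terms, that holdback amounts to checking the partially typed query against a blocklist before showing instant results. The sketch below is illustrative only: the list contents and function name are invented, and Google's actual implementation is not public.

```python
# Hypothetical blocklist of contentious partial queries. Google's real
# list is not published; this single entry mirrors the article's example,
# where the first three letters of 'assimilate' form an offensive word.
BLOCKED_TERMS = {"ass"}

def show_instant_results(partial_query: str) -> bool:
    """Return True if instant results may be shown for the text typed
    so far. If the partial query matches a blocked term, results are
    held back until the user explicitly hits 'search'."""
    return partial_query.strip().lower() not in BLOCKED_TERMS
```

Typing the full word 'assimilate' would pass the check, while stopping after three letters would suppress the instant results page.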

Other websites have pushed their "not safe for work" (NSFW) content away from search engine results altogether. In July, the notoriously adult-content-friendly blogging website Tumblr began limiting the searchability of its NSFW and adult-content blogs. The site said that adult blogs, which "contain substantial nudity or mature/adult-oriented content," would not show up on Google, Yahoo, or Bing searches. The move came soon after Yahoo, whose chief executive said not all of Tumblr's content was "brand safe," bought the site.

However, after backlash from Tumblr users, later in July the site re-merged adult and NSFW content and restored the blogs to search engine indexes.

Aside from pornography, tech companies have had to edit search results and available material depending on geography and policy. China has long had a contentious relationship with Google, requiring terms such as "Tiananmen Square 1989" to be omitted from search results and eventually pushing Google to move its search engine domain to Hong Kong (which isn't subject to mainland China's censorship rules). In Germany and France, where there are strong laws against hate speech and religious persecution, Google omits pro-Nazi and Holocaust-denial websites (but links at the bottom of the search results page to the website Chilling Effects, which documents censored Internet content). And eBay decided in May to disallow the sale of any Nazi or hate material, even if it has historical significance.

The most recent changes at Google and Microsoft, however, put more of the power in the hands of the search engines themselves. Schmidt was clear that changing search terms is far from the only step needed to combat child pornography. In his op-ed, he pointed out that many photos can easily circumvent keyword filters and must be reviewed individually by people who can judge the content.


Take, for example, the National Center for Missing and Exploited Children (NCMEC). John Shehan, executive director of the exploited child division at NCMEC, says analysts review each individual image sent to its database, CyberTipLine.com, to determine whether it constitutes child pornography. They then record the photo's hash value (essentially a picture's digital fingerprint), which NCMEC can send to search engines such as Google and Microsoft to omit from searches. So far, it has provided Internet companies with more than 20,000 hash values to avoid.
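The fingerprint-matching workflow described above can be sketched in a few lines. Note the simplifying assumption: this example uses an ordinary cryptographic hash (SHA-256), whereas the Microsoft-developed technology NCMEC relies on uses perceptual hashing, which also matches images that have been resized or re-encoded. The function names and blocklist values here are invented for illustration.

```python
import hashlib

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest of a file -- a simplified stand-in for
    the 'digital fingerprint' a real perceptual hash would produce."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large image files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Fingerprints of known illicit images, as NCMEC might distribute them
# to search engines. The value below is a placeholder, not real data.
known_bad_hashes = {"0000000000000000000000000000000000000000000000000000000000000000"}

def should_omit(path: str) -> bool:
    """Return True if the file's fingerprint matches the blocklist,
    meaning a search engine should exclude it from results."""
    return file_hash(path) in known_bad_hashes
```

Because only hash values change hands, companies can match against the blocklist without NCMEC ever transmitting the images themselves.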

This technology was developed by Microsoft, which last week opened the doors to its 16,800-square-foot Cybercrime Center at its Redmond, Wash., headquarters, a facility focused on using technology to combat digital crime.

“Microsoft has a zero tolerance approach to child sexual abuse content. If society is to stamp it out, then together we need to tackle the core problems of creation, distribution and consumption," says a Microsoft spokesperson in an e-mail.

However, Mr. Shehan points out that more sophisticated child pornography viewers tend to use peer-to-peer file sharing, so keyword changes will only catch a certain type of Internet user.

“Some users who are in the beginning stages may use keywords,” he says. “But it certainly will not be the silver bullet to solve this problem.”

Despite tech companies’ efforts to police controversial content, he says access is only one part of the battle against any social ill, whether it be anti-Semitism or child pornography.

“The World Wide Web is just a small piece of the fight, this will more have an effect on how search engines do their business,” Shehan adds. “This is a step in the right direction, but [the fight] will be more than just Google and Microsoft.”