What happens when Google thinks for you?
The Google Books Settlement marks an end of self-directed inquiry and the beginning of self-referred intelligence, where our thoughts become just artifacts of Googlified experience.
Brookville, N.Y.
When our 7-year-old wants to express the idea of infinity, he uses the word Googol. “I love you,” he says. “I love you more,” I say. “I love you a Googol,” he says; and then, trumping any further adulation, adds, “No! A Googol times a Googol.”
With all that Googoling, I can't help wondering what a “Googol times a Googol” would look like.
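As a mathematical aside, the answer is knowable: a googol is 10^100, so a googol times a googol is 10^200 — a 1 followed by 200 zeros. A quick illustrative check in Python, whose integers have arbitrary precision:

```python
# A googol is 10 to the 100th power.
googol = 10 ** 100

# Multiplying exponents adds them: 10**100 * 10**100 == 10**200.
product = googol * googol
print(product == 10 ** 200)          # True
print(len(str(product)) - 1)         # 200 zeros follow the leading 1
```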
Perhaps US District Court Judge Denny Chin knows. He is expected to rule on whether Google has committed copyright infringement sometime this summer.
For now, the Google Books Settlement over Google's effort to digitize the world’s books has authors, publishers, and lawyers reeling, but the enduring, afflictive dangers of this decision extend well beyond the rights of access, fair use, and reader privacy.
Google is paving the way toward a new definition of thought, engaging more complex dimensions of human cognition, perception, and aesthetic preference than ever before.
Forget for a moment the many perils inherent in entrusting a huge corpus of the world’s recorded knowledge to a single corporation and registry.
Think about how you, personally, experience information.
Chances are – if you’re reading this – you go to Google’s simple interface (the epitome of usable elegant design) and type in a word or part of a word.
It’s likely your spelling is off, and the linguistic map (“Did you mean: _____”) pops up in red. You’ll click on the suggested term (because of course that is what you meant) and retrieve about 84,500 results in 0.29 seconds.
The top 10 results display first, but you’ll rarely navigate to page 2. If the answer isn’t a couple of clicks away, you’ll ask a different question.
As a librarian, I see this all the time. The act of research – literally, “re-search” – is dying.
We search only once. We bring very little to the table. We’re not so proficient at input, and we don’t remember specifics. We aim for proximity and trust the technology to reconcile our query with the infinite repository of information and ideas we now call Google.
Even with Judge Chin’s decision pending, this habit is terribly dangerous.
Why? The human mind is a field of information.
When we encounter a piece of information, our mind remembers and correlates it with our experience. This is what scientists call “intelligence.” Intelligence is, basically, information.
Google is fast becoming the single lens through which we perceive intelligence, and it displays extraordinary qualities that the human intellect cannot approach. One of these is infinite correlation: the ability to do an infinite number of things at once and correlate them all with each other.
When we pursue an idea online, correlation takes place below the level of our conscious awareness. A search in Google Books for “A Tale of Two Cities,” for example, renders associative trails of ubiquitous bread crumbs that extend far beyond literary experience.
Each unit of text – from Darnay to guillotine – is scanned, deconstructed, and remixed into Google’s universal computational cloud. Dickens’s story is parsed and recalibrated to abysmal enterprise: think commercial (buy a knitted scarf), ethereal (storm the Bastille), colloquial (chat with Jacobins), financial (invest at Tellson’s), etc.
Infinite correlation is just one specific cognitive role of Google’s technology. It dislodges time and place as key aspects of organizing and finding information. These dislocations are a real bummer for those of us concerned with posterity and public trust, because Google is not a benign tool – it both shapes and uses us to create information.
With its advertising, data mining, and codified persuasions, the Google Books Settlement marks an end of self-directed inquiry and the beginning of self-referred intelligence, where our thoughts (and fictions) become just artifacts of Googlified experience.
“The aim of science is not to open the door to infinite wisdom,” said German poet Bertolt Brecht, “but to set a limit to infinite error.”
By redefining readership, Google has slammed the door on the fundamental idea of authorship.
Perhaps, in time, these identifiable literary entities are destined to be just fragments of the One Big Global Book. Yes, if we are to cherish democracy in its purest form, we must swiftly get over this $125 million copyright quibble and make a concerted effort to think (and read) outside the cloud. Because a “Googol times a Googol” is a kind of OneBox omniscience – uniform, massive, redoubtable – from which our stories will either fade or flourish.
Emily Walshe is a librarian and professor at Long Island University in New York.