Is Google's new chip a game changer for AI?

Possibly, but likely not yet. While AI has worked its way into everything from cars to language, companies still rely on existing chips for much of the work.

South Korean professional Go player Lee Sedol reviewed a match against Google's AI program AlphaGo in Seoul, South Korea, in March. On Wednesday, Google revealed that its software's victory over Mr. Lee was aided by new chips it has developed to enhance its machine learning software in everything from searches to self-driving cars.

Lee Jin-man/AP/File

May 19, 2016

In the arms race between Silicon Valley giants to develop faster and more complex artificial intelligence capabilities, Google has a secret weapon: It's developing its own chips.

At a conference for developers on Wednesday, chief executive Sundar Pichai said the tech giant had designed the chip, which the company says it has been using for more than a year, specifically to improve its deep neural networks.

These networks are the brains that "learn" over time to power features such as Gmail's "Smart Reply," the ability to tag people in photos, and voice search. The chips were also in place when Google's AlphaGo computer program beat Go champion Lee Sedol in March, although the company didn't announce it at the time.


As companies have increasingly focused on building tools that use machine learning as a backbone, they've also branched out into creating their own chips instead of purchasing them from major vendors, such as Nvidia.

Nvidia's own GPUs, or graphics processing units, are often used to render high-resolution video images, but have also been used for machine learning, including in early tests of the AlphaGo program, The Wall Street Journal reports.

Google's own chip, which it calls a Tensor Processing Unit (TPU), moves its technology "seven years into the future," distinguished hardware engineer Norm Jouppi wrote in a blog post. That's roughly three generations of Moore's law, the observation that computers' processing power doubles about every two years.

But companies such as Google aren't wholly replacing existing processors made by chipmakers such as Intel, according to Mark Horowitz, a professor of electrical engineering at Stanford University.

Instead, he told the Journal, these custom chips give companies such as Google and Microsoft additional computing power as they grow. Microsoft has been using a type of programmable chip called a field-programmable gate array (FPGA) to improve its hardware for machine learning, which typically requires a large amount of computing power.


The tools are then used for everything from powering search engines to guiding Google's self-driving cars.

"TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation," Mr. Jouppi wrote on Wednesday. "Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly." 

Last year, Google released TensorFlow, the software engine that powers its machine learning systems, free to the public under an open-source license. Jouppi told the Journal the company eventually hopes to make the chips available as part of its Google Cloud service, through which it has previously released some of its machine-learning applications.
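
For a sense of what TensorFlow actually exposes to developers, the fragment below runs a single dense-layer step on two tiny tensors. It is a minimal sketch that assumes a current TensorFlow installation running in its default eager mode; it does not depend on, or demonstrate, the TPU hardware described above.

```python
import tensorflow as tf

# Two small tensors, standing in for a layer's weights and an input batch.
weights = tf.constant([[1.0, 2.0],
                       [3.0, 4.0]])
inputs = tf.constant([[0.5, 0.5]])

# One dense-layer step: a matrix multiply followed by a ReLU activation.
outputs = tf.nn.relu(tf.matmul(inputs, weights))
print(outputs)  # tf.Tensor([[2. 3.]], shape=(1, 2), dtype=float32)
```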

Apple also began creating its own chips in 2009 to improve the processing power and efficiency of its devices and add new features.

But while Google's chip is helping improve its machine learning tools, the company likely isn't in a position to abandon GPUs and processors made by other companies entirely, Patrick Moorhead, an analyst at Moor Insights & Strategy, told PCWorld.

"It's not doing the teaching or learning," he said of Google's TPU. "It's doing the production or playback."

Google's chip is an application-specific integrated circuit, or ASIC: a hard-wired chip that can't be reprogrammed if its user's needs change, but that delivers bigger performance gains in exchange. That design makes ASICs costly to develop, typically restricting them to governments and other organizations with the budgets to afford them, PCWorld notes.

But the chips, which can fit into a standard hard drive slot on Google's servers, have helped the company make large-scale improvements. It began using the TPU last April to help its Street View software better process images, Jouppi told the Journal, cutting the time needed to process all of that imagery to just five days.