Google Says Its Tensor-Powered AI Supercomputer Is Faster Than Nvidia

Alphabet's Google on Tuesday released new details about the supercomputers it uses to train its artificial intelligence models, saying the systems are both faster and more power-efficient than comparable systems from Nvidia.

Google has designed its own custom chip called the Tensor Processing Unit, or TPU. It uses these chips for more than 90 percent of the company's work on artificial intelligence training, the process of feeding data through models to make them useful at tasks like responding to queries with human-like text or generating images.

The Google TPU is now in its fourth generation. Google on Tuesday published a scientific paper detailing how it has strung more than 4,000 of the chips together into a supercomputer using its own custom-developed optical switches to help connect individual machines.

Improving these connections has become a key point of competition among companies that build AI supercomputers, because the so-called large language models that power technologies like Google's Bard or OpenAI's ChatGPT have exploded in size, meaning they are far too large to store on a single chip.

The models must instead be split across thousands of chips, which then have to work together for weeks or more to train the model. Google's PaLM model – its largest publicly disclosed language model to date – was trained by splitting it across two of the 4,000-chip supercomputers over 50 days.
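Google has not published the code it used, but the basic idea of splitting a model's weights across many accelerators can be illustrated with a minimal, hypothetical sketch in JAX, a framework Google uses for TPU workloads. The layer sizes, mesh axis name and single weight matrix below are assumptions for illustration only, not details from the paper:

import jax
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Arrange all visible accelerators (TPU cores, GPUs, or a lone CPU) into a 1-D mesh.
mesh = Mesh(np.array(jax.devices()), axis_names=("model",))

# Hypothetical layer sizes chosen only for illustration.
d_in, d_out, batch = 1024, 4096, 8
k1, k2 = jax.random.split(jax.random.PRNGKey(0))
w = jax.random.normal(k1, (d_in, d_out))   # a weight matrix we pretend is too big for one chip
x = jax.random.normal(k2, (batch, d_in))   # activations, replicated on every device

# Split the weight's output dimension across the "model" axis; replicate the activations.
w_sharded = jax.device_put(w, NamedSharding(mesh, P(None, "model")))
x_repl = jax.device_put(x, NamedSharding(mesh, P()))

@jax.jit
def layer(x, w):
    # With sharded inputs, the compiler assigns each weight shard's matmul to its own device.
    return x @ w

y = layer(x_repl, w_sharded)
print(y.shape)  # (8, 4096); the columns of the result live on different devices

In a real training run this partitioning is applied to every layer of the network, and it is the interconnect between the chips that carries the intermediate results back and forth, which is why the switching hardware Google describes matters so much.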

Google said its supercomputers make it easy to reconfigure connections between chips on the fly, helping avoid problems and tweak for performance gains.

"Circuit switching makes it easy to route around failed components," Google Fellow Norm Jouppi and Google Distinguished Engineer David Patterson wrote in a blog post about the system. "This flexibility even allows us to change the topology of the supercomputer interconnect to accelerate the performance of an ML (machine learning) model."

While Google is only now releasing details about its supercomputer, it has been online inside the company since 2020 in a data centre in Mayes County, Oklahoma. Google said that startup Midjourney used the system to train its model, which generates fresh images after being fed a few words of text.

In the paper, Google said that for comparably sized systems, its supercomputer is up to 1.7 times faster and 1.9 times more power-efficient than a system based on Nvidia's A100 chip that was on the market at the same time as the fourth-generation TPU.

Google said it did not compare its fourth generation to Nvidia's current flagship H100 chip because the H100 came to the market after Google's chip and is made with newer technology.

Google hinted that it might be working on a new TPU that would compete with the Nvidia H100 but provided no details, with Jouppi telling Reuters that Google has "a healthy pipeline of future chips."

© Thomson Reuters 2023




