Google said Tuesday that one of the few alternatives to Nvidia's popular AI chips is now available to developers, and it took shots at Microsoft and Amazon, too.
The tech giant’s latest AI chip, Cloud TPU v5p, was first announced in December, on the same day as its Gemini chatbot. The new TPU, or tensor processing unit, can train large language models nearly three times faster than its predecessor, Google’s TPU v4, the company claims. Large language models, or LLMs, power AI chatbots such as ChatGPT.
“Now in their fifth generation, these improvements [to Google’s TPUs] have helped customers train and serve state-of-the-art language models,” Google CEO Sundar Pichai said at the company’s annual Google Cloud Next conference on Tuesday in Las Vegas.
Google’s announcement marks another milestone in the Big Tech AI arms race. Nvidia is the dominant supplier of AI chips, known as graphics processing units, or GPUs, and Alphabet, Google’s parent company, is one of Nvidia’s largest customers, behind Microsoft and Meta, the parent of Facebook.
“[Google’s] investments [to develop new AI hardware] have put us at the forefront of the AI platform shift,” Pichai said.
Google’s rivals Microsoft, Amazon and Meta have also developed their own AI chips.
Still, it’s clear that Nvidia remains very important to Google. In the same blog post announcing its latest AI chip, Google mentioned Nvidia 20 times. Just below the TPU v5p details, the company said it is updating its A3 supercomputer, which runs on Nvidia GPUs. And Google reminded customers that it will use Nvidia’s latest chip, Blackwell, in its AI Hypercomputer.
Read more: Intel says its new AI hardware ‘significantly outperforms Nvidia’
After discussing Google’s new AI chip, Google Cloud CEO Thomas Kurian gave a flashier reveal of the company’s new Arm-based central processing unit, called Google Axion. In a game show-style display, Kurian took the stage during his keynote with Axion in hand, flashing the chip to applause.
Google Axion is a new rival to processors from Microsoft and Amazon, which have already produced their own Arm-based computing chips. British technology company Arm licenses its chip architecture designs to chipmakers, which build actual chips on top of them. Google’s release of Axion marks the first time the company has used Arm’s chip architecture in a processor.
Google claims that Axion delivers “up to 30% higher performance than the fastest general-purpose Arm-based instances currently available in the cloud” and “up to 50% higher performance and up to 60% higher energy efficiency” than comparable x86-based instances.
Google customers can use Axion through the company’s cloud services, which essentially means those users can run their cloud workloads on a more powerful processor inside Google’s physical data centers. Google also told Reuters that “Arm customers anywhere can easily deploy Axion without having to rearchitect or rewrite applications.”
Credit: qz.com