Nvidia CEO Jensen Huang unveiled the AI chipmaker's long-awaited new processor on Monday, saying tech giants like Microsoft and Google are already eagerly awaiting its arrival.
Huang made the announcement at the company's closely watched GPU Technology Conference, or GTC, which has been dubbed a "Woodstock of AI" by employees and analysts alike. The annual conference in San Jose, California came as Nvidia rode Wall Street's frenzy for AI stocks into 2024: beating high earnings expectations, becoming the first chipmaker to reach a market capitalization of $2 trillion, and passing major companies, including Amazon, to become the third-most valuable company in the world.
Nvidia's growth has been driven by demand for its $40,000 H100 chips, which support the so-called large language models needed to run generative AI chatbots like OpenAI's ChatGPT. The chips are known as graphics processors, or graphics processing units.
On Monday, Huang unveiled Nvidia's next-generation "Blackwell" graphics processor, named after mathematician David Blackwell, the first Black scientist admitted to the National Academy of Sciences. The Blackwell chip packs 208 billion transistors and will be able to run AI models and queries faster than its predecessors, Huang said. The Blackwell chips succeed Nvidia's highly sought-after H100 chip, which was named after computer scientist Grace Hopper. Huang called Hopper "the world's most advanced graphics processor currently in production."
“Hopper is fantastic, but we need bigger GPUs,” he said. “So ladies and gentlemen, I would like to introduce you to a very large graphics processor.”
Huang said Microsoft, Google parent Alphabet and Oracle are among the tech giants preparing for Blackwell. Microsoft and Google are Nvidia's two biggest customers for its H100 chips.
Nvidia shares were flat on Monday, but are up more than 83% this year and more than 241% over the past 12 months.
Read more: Is Nvidia stock in a bubble that is about to burst? Wall Street cannot make up its mind
During his keynote speech at Nvidia's conference on Monday, Huang announced new partnerships with computer software makers Cadence, Ansys and Synopsys. Cadence, Huang said, is building a supercomputer with Nvidia graphics processors. Huang said Nvidia's AI foundry is working with SAP, ServiceNow and Snowflake.
Huang also gave a shoutout to Dell founder and CEO Michael Dell, whose company works with Nvidia, in the audience. Dell is expanding its AI offerings for customers, including new enterprise data storage built on Nvidia's AI infrastructure.
“Every company will have to build artificial intelligence factories,” Huang said. “And it turns out Michael is here and will be happy to take your order.”
Huang also announced that Nvidia is creating a digital model of Earth to predict weather patterns using a new generative artificial intelligence model, CorrDiff, which can generate images with 12.5 times higher resolution than current models.
Huang said Nvidia's Omniverse computing platform now supports streaming to Apple's Vision Pro headset, and that Chinese electric vehicle manufacturer BYD is adopting Nvidia's next-generation Thor in-vehicle computer.
Huang concluded his two-hour speech accompanied by two robots, the orange and green BD Star Wars droids, which he said are powered by Nvidia's Jetson computer systems and learned to walk with Nvidia's Isaac Sim.
Throughout the week at GTC, Nvidia researchers and executives will be joined by influential players in the artificial intelligence industry, including Brad Lightcap, OpenAI's chief operating officer, and Arthur Mensch, CEO of OpenAI's French rival Mistral AI, to host sessions on topics ranging from innovation to ethics.
Microsoft and Meta are Nvidia's biggest customers for the H100 chip, with each tech giant spending $9 billion on the chips in 2023. Alphabet, Amazon and Oracle were also among the biggest spenders on the chips last year.
But the frenzy around Nvidia's H100 chips has raised concerns about shortages, and competitors trying to get ahead in the AI race have begun building their own versions of the chips. Amazon has been working on two chips, called Inferentia and Trainium, while Google has been working on its Tensor Processing Units.
Credit: qz.com