Nvidia Unveils Enhanced AI Chip H200 with Major Upgrades
Nvidia (NVDA.O) on Monday announced upgrades to its flagship artificial intelligence (AI) chip.
The new chip, named the H200, is set to launch next year, and tech giants Amazon.com (AMZN.O), Alphabet's Google (GOOGL.O), and Oracle (ORCL.N) are among the first lined up to use it.
The H200 chip represents a notable leap forward from Nvidia's existing top-tier H100 chip.
The primary enhancement comes in the form of increased high-bandwidth memory, a crucial component influencing the chip's data processing speed and efficiency.
As a dominant player in the AI chip market, Nvidia's technology underpins various services, including OpenAI's ChatGPT.
The additional high-bandwidth memory, along with a faster connection to the chip's processing elements, means AI services can generate answers more quickly.
The H200 has 141 gigabytes of high-bandwidth memory, a substantial upgrade from the 80 gigabytes in its predecessor, the H100. While Nvidia has not disclosed the memory suppliers for the new chip, Micron Technology (MU.O) said in September that it was working to become a supplier to Nvidia.
Nvidia also buys memory from South Korea's SK Hynix (000660.KS), which recently said AI chips were helping revive its sales.
Nvidia also said that leading cloud service providers, including Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle Cloud Infrastructure, will be among the first to offer access to H200 chips.
Specialized AI cloud providers CoreWeave, Lambda, and Vultr will also offer the H200.
This move solidifies Nvidia's position at the forefront of the AI hardware landscape, promising enhanced capabilities and faster AI processing for a diverse range of applications.
Stephen Nellis / Reuters