CNBC reports that Nvidia’s upcoming AI chip, named Blackwell, will be priced at more than $30,000 per unit, according to the company’s CEO, Jensen Huang.
The new-generation graphics processor, designed for artificial intelligence workloads, is expected to be in high demand for training and deploying AI applications such as ChatGPT. Huang said in an interview on “Squawk on the Street” that bringing Blackwell to market required significant technological innovation, with Nvidia spending roughly $10 billion on research and development.
The projected price of $30,000 to $40,000 per unit puts Blackwell in roughly the same range as its predecessor, the H100 (built on the Hopper architecture), which analysts estimated at $25,000 to $40,000 per chip.
The introduction of the Hopper series in 2022 marked a notable price increase for Nvidia’s AI chips compared to previous generations.
Huang explained that the cost of Nvidia’s new AI chips covers more than the chip itself: it also includes the work of designing the surrounding data-center systems and integrating them into other companies’ data centers.
Since the AI boom took off in late 2022 with the launch of OpenAI’s ChatGPT, Nvidia’s AI chips have driven a threefold increase in the company’s quarterly sales. Major AI players have relied heavily on the H100 to train their models; Meta, for instance, has disclosed plans to purchase hundreds of thousands of Nvidia H100 GPUs this year.
Nvidia recently unveiled at least three variants of the Blackwell AI accelerator: the B100, the B200, and the GB200, which pairs two Blackwell GPUs with an Arm-based CPU. The models feature slightly different memory configurations and are slated for release later this year.