The hottest AI commodity right now isn’t ChatGPT, it’s the $40,000 chip fueling a spending frenzy.

The Rise of Hopper: The $40,000 Workhorse Chip Powering AI Models


Move aside, ChatGPT. There’s a new hot commodity in the world of AI: a $40,000 workhorse chip known as Hopper. Named after the trailblazing computer scientist Grace Hopper, this ultra-expensive little chip, officially called the H100 Tensor Core GPU, has become the most sought-after product from Silicon Valley giant Nvidia.

So, what is all the fuss about? The answer is simple yet profound: the H100 supplies the computing power needed to train and run large language models like ChatGPT, whose outputs are shaped by billions of parameters, making it the engine of the generative AI revolution. Without these processors, progress on AI models risks stalling. That explains the mad dash to acquire H100s, as fear of a shortage has spurred buyers into action.
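To see why a single chip can’t carry a frontier model, consider a rough, back-of-envelope calculation. The figures below are public specs and illustrative assumptions, not drawn from this article: an H100 carries roughly 80 GB of on-board memory, a GPT-3-scale model has about 175 billion parameters, and half-precision (fp16) weights take 2 bytes each.

```python
# Back-of-envelope sketch: why one H100 can't even hold a large model's weights.
# Figures are public specs / illustrative assumptions, not from the article.

H100_MEMORY_GB = 80  # approximate HBM capacity of a single H100


def model_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to store the weights (fp16 = 2 bytes per parameter)."""
    return n_params * bytes_per_param / 1e9


# A GPT-3-scale model: ~175 billion parameters
weights_gb = model_memory_gb(175e9)              # 350.0 GB of weights alone
gpus_needed = -(-weights_gb // H100_MEMORY_GB)   # ceiling division -> 5 GPUs

print(f"{weights_gb:.0f} GB of weights -> at least {gpus_needed:.0f} H100s")
```

Even this understates demand: the estimate ignores activations, optimizer state, and the far larger fleets needed for training, which is why buyers are chasing thousands of units rather than a handful.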

Even in the Gulf region, the appetite for Nvidia’s processors has reached a fever pitch. According to the Financial Times, Saudi Arabia and the UAE are scrambling to secure thousands of H100 units as they ramp up their AI ambitions. In the startup ecosystem, where much of the pioneering work in AI is taking place, the scarcity of these GPUs has left wealthy venture capitalists concerned that their startups will fall behind without them, prompting them to buy units whenever they can.

However, the high demand for H100 chips has also given rise to some unusual arrangements. The Verge reported that CoreWeave, a startup backed by Nvidia, has pledged a collection of H100 GPUs as collateral to secure a whopping $2.3 billion loan, enabling it to amass an even larger inventory of the chips.

Such is the frenzy for these processors that an underground market has sprung up, aided in part by US sanctions that bar the export of Nvidia’s top chips to China and Hong Kong, as reported by ANBLE in June. Elon Musk has also made extraordinary moves to acquire chips for generative AI: Insider reported in April that Musk purchased approximately 10,000 GPUs as part of his effort to transform his company, X, into a super app, and he has since announced his own AI company, xAI, to compete with OpenAI.

The magnitude of these chip-buying maneuvers is remarkable. While OpenAI’s ChatGPT has been the face of the new AI era since its launch, Nvidia’s processors, a key driver of the company’s ascent to a $1 trillion market capitalization in May, may prove to have more staying power. Nvidia has also unveiled a next-generation successor to the H100, the GH200, slated to ship in the second quarter of next year, a sign that the craze for these chips will only intensify.

In conclusion, Hopper, the H100 Tensor Core GPU, has become the hottest commodity in AI. Its computing power is driving the generative AI revolution and enabling ever-larger language models, and everyone from tech giants to startup founders is scrambling to secure a supply, whether to fuel national AI ambitions or to protect a competitive edge. The race for H100s has produced extraordinary buying sprees, collateralized loans, and even an underground market, and as Nvidia releases new iterations like the GH200, the chip frenzy is set to reach new heights.