AI chipmaker Nvidia, the world’s most valuable company by market cap, remains dependent on a handful of anonymous customers who collectively contribute tens of billions of dollars in revenue.
The AI chip darling once again warned investors in its quarterly 10-Q filing with the SEC that it has customers so crucial that their orders each exceeded 10% of Nvidia’s global consolidated revenue.
An elite trio of particularly deep-pocketed customers, for example, bought between $10 billion and $11 billion worth of goods and services in the first nine months ended in late October.
Fortunately for Nvidia investors, this won’t change anytime soon. Mandeep Singh, global head of technology research at Bloomberg Intelligence, agrees with founder and CEO Jensen Huang’s prediction that the spending will not stop.
“The data center training market could reach $1 trillion without any real pullback,” he said, at which point Nvidia’s share will almost certainly drop from its current 90%. Even then, it could still generate hundreds of billions of dollars in revenue each year.
Nvidia remains supply-constrained
Outside of defense contractors living off the Pentagon, it’s highly unusual for a company to have such a concentration of risk among just a handful of customers, let alone one poised to become the first company worth an astronomical $4 trillion.
A close look at Nvidia’s accounts for the three-month period shows that four anonymous whales together accounted for nearly every second dollar of sales in the fiscal second quarter. At least one of them has dropped out this time around, with only three now meeting that criterion.
Singh believes the anonymous whales likely include Microsoft, Meta and possibly Super Micro. Nvidia declined to comment on the speculation.
Nvidia refers to them only as Customers A, B and C; all told, they purchased a collective $12.6 billion in goods and services. That was more than a third of Nvidia’s overall $35.1 billion in revenue recorded in the fiscal third quarter through the end of October.
Their share was also split equally, with each accounting for 12%, an indication that they were likely receiving the maximum number of chips allotted to them rather than as many as they would have liked.
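The figures above hang together arithmetically. As a quick sanity check (a minimal sketch using only the rounded numbers reported in the filing):

```python
# Rounded figures from Nvidia's fiscal Q3 as reported above.
total_revenue_bn = 35.1    # total quarterly revenue, in billions of dollars
share_per_customer = 0.12  # Customers A, B and C: 12% of revenue each

combined_bn = 3 * share_per_customer * total_revenue_bn
print(round(combined_bn, 1))                         # 12.6 -- matches the $12.6bn figure
print(round(combined_bn / total_revenue_bn, 2))      # 0.36 -- i.e. more than a third
```

Three customers at 12% apiece is 36% of revenue, which is where the "more than a third" characterization comes from.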
This would tally with Huang’s comments that his company is supply-constrained. Nvidia cannot simply produce more chips: it has outsourced the manufacturing of its industry-leading AI chips entirely to Taiwan’s TSMC and has no production facilities of its own.
Middlemen or end users?
Importantly, Nvidia’s designation of key anonymous customers as “Customer A”, “Customer B” and so on is not fixed from one fiscal period to the next. The labels can and do change, with Nvidia keeping the identities a trade secret for competitive reasons; those customers would certainly not want investors, employees, critics, activists and rivals to see exactly how much money they are spending on Nvidia chips.
For example, the party designated “Customer A” purchased approximately $4.2 billion in goods and services during the latest fiscal quarter. Yet it appears to have bought less in the past, since it did not cross the 10% threshold for the first nine months taken as a whole.
Meanwhile, “Customer D” appears to have done the exact opposite, reducing its purchases of Nvidia chips in the latest fiscal quarter while still accounting for 12% of year-to-date revenue.
Since their names are secret, it is difficult to say whether they are middlemen, like troubled Super Micro Computer, which supplies data center hardware, or end users, like Elon Musk’s xAI. The latter, for example, came out of nowhere to build its new Memphis compute cluster in just three months.
Among Nvidia’s longer-term risks is a shift from training to inference chips
Ultimately, however, there are only a handful of companies with the capital to compete in the AI race, since training large language models can be prohibitively expensive. Typically, they are cloud computing hyperscalers such as Microsoft.
Oracle, for example, recently announced plans to build a zettascale data center with 131,000 of Nvidia’s latest-generation Blackwell AI training chips, which would be more powerful than any site in existence today.
It is estimated that the electricity needed to run such a large computing cluster would be roughly equivalent to the output of two dozen nuclear power plants.
Bloomberg Intelligence analyst Singh sees just a few longer-term risks for Nvidia. For one thing, some hyperscalers will likely reduce orders eventually, diluting its market share. One such candidate is Alphabet, which has its own in-house training chips called TPUs.
Second, its dominance in training does not carry over to inference, which runs generative AI models after they have been trained. The technical requirements there are not nearly as state-of-the-art, which means there is much more competition, not only from rivals such as AMD but also from companies with their own custom silicon, such as Tesla. Eventually, inference will become a much more meaningful business as more and more companies put AI to use.
“A lot of companies are trying to focus on that inference option because you don’t need a high-end GPU accelerator chip for that,” Singh said.
Asked if this longer-term shift to inference was a greater risk than eventually losing market share in training chips, he replied: “Absolutely.”