On September 20, an article by VC giant Sequoia pushed Nvidia, and the AI industry as a whole, into the spotlight. David Cahn, a partner at Sequoia, argues that even by conservative estimates, Nvidia's $50 billion in GPU sales corresponds to $100 billion in data center spending by other companies. Assuming a 50% profit margin, the AI industry would need $200 billion in revenue to cover that spending; it currently generates only about $75 billion a year, leaving a gap of $125 billion.
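Cahn's arithmetic can be reproduced directly from the figures quoted above (a sketch only; the 2x capex multiplier, the 50% margin, and the $75 billion revenue estimate are all his assumptions, not independently verified):

```python
# Sequoia (David Cahn) revenue-gap arithmetic, using the figures quoted above.

gpu_sales = 50e9                    # Nvidia's GPU sales (USD)
datacenter_spend = 2 * gpu_sales    # GPUs taken as roughly half of data center cost

gross_margin = 0.50                 # assumed profit margin of AI products

# Revenue needed so that a 50% margin covers the $100B of spending
required_revenue = datacenter_spend / gross_margin

current_revenue = 75e9              # Cahn's estimate of current annual AI revenue
gap = required_revenue - current_revenue

print(f"Required revenue: ${required_revenue / 1e9:.0f}B")  # $200B
print(f"Revenue gap:      ${gap / 1e9:.0f}B")               # $125B
```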
Cahn also argued that GPU production capacity is in oversupply, and he expects the "money incinerator" pattern of past bubbles to repeat itself in the AI field.
After the debate fermented for three days, Guido Appenzeller, special advisor at Silicon Valley venture capital giant A16Z and founder of AI startup 2X, posted nearly ten tweets that not only challenged Sequoia's estimate of AI's profitability but also argued that Sequoia's most fundamental error was underestimating the historic scale of the AI revolution.
Appenzeller: AI will subvert all software, and the revenue gap does not exist
In a series of tweets, Appenzeller pointed out three major errors in Cahn's article.
First, Cahn leads the article with an attention-grabbing $200 billion figure, but Appenzeller argues the calculation behind it is flawed.
Appenzeller pointed out that Cahn arrived at the seemingly alarming $200 billion by adding together the GPU purchase cost (capital expenditure), annual operating costs, cumulative revenue over the GPU's life cycle, and annual revenue from AI applications.
A more appropriate calculation, however, would be based on the annual return that GPU buyers earn after deploying their capital. In other words, what should be computed is the buyers' return on investment.
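The ROI framing Appenzeller prefers can be sketched as follows. All figures here are illustrative placeholders, not numbers from either author; the point is only that one-time capital costs and recurring annual flows are kept separate instead of being summed into one headline figure:

```python
# Illustrative annual-ROI calculation for a GPU buyer.
# All figures are hypothetical placeholders, not from Cahn or Appenzeller.

capex = 30_000           # one-time purchase cost of one H100-class GPU (USD)
annual_opex = 1_000      # yearly running cost (power, cooling, hosting)
annual_revenue = 12_000  # yearly revenue the buyer attributes to the GPU

annual_profit = annual_revenue - annual_opex
roi = annual_profit / capex             # annual return on invested capital

payback_years = capex / annual_profit   # time to recoup the purchase price

print(f"Annual ROI: {roi:.1%}")
print(f"Payback period: {payback_years:.1f} years")
```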
Second, Appenzeller believes that GPU electricity costs are also overstated.
According to Appenzeller, an H100 PCIe GPU costs about $30,000 and consumes about 350 watts of power. Taking into account servers and cooling, the total power consumption is likely to be around 1 kilowatt.
If the electricity price is US$0.1/kWh, this H100 GPU will incur roughly $876 in electricity costs per year (1 kW x 8,760 hours x $0.1/kWh), a small fraction of its $30,000 purchase price.
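Working through this back-of-the-envelope power arithmetic, using the 1 kW and $0.1/kWh figures above:

```python
# Annual electricity cost of one H100 server slot, per Appenzeller's figures.

total_power_kw = 1.0        # GPU (~350 W) plus server and cooling overhead
price_per_kwh = 0.10        # assumed electricity price, USD per kWh
hours_per_year = 24 * 365   # 8,760 hours

annual_electricity_cost = total_power_kw * hours_per_year * price_per_kwh
gpu_price = 30_000          # approximate H100 PCIe purchase price (USD)

print(f"Annual electricity: ${annual_electricity_cost:.0f}")                # $876
print(f"Share of GPU price: {annual_electricity_cost / gpu_price:.1%}")     # 2.9%
```

Electricity, in other words, is a rounding error next to the capital cost of the card itself, which is why Appenzeller considers it overstated in Cahn's analysis.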
Neither of these two estimation errors, however, is the most fatal one in Appenzeller's view.
Appenzeller said that AI models are an infrastructure component, just like CPUs, databases, and networks. Today, almost all software uses CPUs, databases, and networks; in the future, almost all software will use AI models in the same way.
Therefore, AI models will profoundly affect all software and IT systems, and their scope of influence goes far beyond the narrow areas analyzed in the article. The article ignores the status of AI models as future software infrastructure and therefore underestimates the true significance of the AI revolution.
Can startups fill this gap? Cahn believes there is a "big opportunity" here: the technological leap in AI and the unprecedented wave of GPU purchases are, in the end, good news for humanity.
In past technology cycles, overbuilding infrastructure has tended to burn capital, but it has also unlocked future innovation by lowering the marginal cost of new product development.
So the question is: can the AI industry actually earn that $200 billion?
Appenzeller's answer runs as follows: more than $200 billion is spent annually on network infrastructure, but does a matching $200 billion show up as "network software" revenue? No. Google uses network infrastructure to sell ads, yet the resulting revenue is booked as advertising revenue, not "network software" revenue; likewise, the revenue from Microsoft Office 365 is never labeled "network software" revenue.
In other words, revenue from infrastructure will be labeled as different revenue categories depending on the sector.
Finally, Appenzeller concluded that the supposed revenue gap simply does not exist.
NVIDIA's customers have been slow to make money, and with hundreds of billions of dollars at stake, the patience of capital is running out
It is worth noting that Sequoia's concerns about AI's ability to monetize are not unfounded.
A previous article by Wall Street Insights noted that the huge investment in each GPU must ultimately be converted into value for end customers if the industry is to develop sustainably over the long run.
At present, Nvidia, the core beneficiary of the "sell shovels in a gold rush" logic, has posted impressive results in the first two quarters of this year. At the downstream application layer, however, AI investment has kept rising while results have yet to follow.
Benefiting from the huge demand generated by large-model training, the orders and results of AI infrastructure vendors have been repeatedly confirmed. B-side applications, however, are still at an early stage: most AI application vendors have not yet reached commercialization, and judging by delivery timelines they are expected to lag the infrastructure layer by two to three quarters.
If the gold diggers cannot make money, the boom in shovel sales naturally will not last. Over the past month, Nvidia's stock price has fallen more than 11%, returning to its level of June this year.
With cost-cutting and efficiency gains still the dominant theme for global technology stocks, the capital market's patience is running out.