Ivana Delevska | Oct 31, 2024 09:47AM ET
As Nvidia (NASDAQ:NVDA) stock price was approaching new highs during the summer, we dove deep into the Blackwell order book to understand if the stock price move was justified and if there is further upside ahead.
We were surprised to find that the order book was even stronger than we had modeled and a step change from the previous generation of chips (H100 Hopper). Orders for Hopper averaged in the 10-30Ks in '23 when it was introduced, while orders for Blackwell are coming in at the 100Ks.
Over the past two months, myriad data points have confirmed our view. However, we also learned something incremental: it now appears that the mix of orders will be much more heavily weighted towards the GB200 NVL72 configuration, which carries significantly higher margins.
This is great news for Nvidia but also very positive for the rest of the value chain, including networking plays: connectors, cables, switches, interconnects, etc.
For Nvidia, the GB200 NVL72 is important because it is margin accretive, as it contains many Nvidia networking products.
While Hopper (the prior generation GPU) took Nvidia's Data Center revenues from $20B to >$100B, Blackwell has the potential to take them above $200B.
Another key milestone from the past few weeks is very well-known in the tech/AI world but under-appreciated by investors.
OpenAI introduced a different type of model, o1, which runs in parallel to GPT-4 but is fundamentally distinct. o1 is important because it is not just a one-off model but a new paradigm for training models.
The key difference between GPT-4 and o1 is that the latter can reason. The model spends a few minutes thinking before generating an answer. To do this, an additional step of Inference Compute is required, which increases the demand for compute significantly.
Current AI models are trained on a data set and infer or generate conclusions based on that data set. While this works well for many applications e.g., search, the models hit a plateau and don't improve. The only way to progress is to train a new generation of models.
In the future, as compute costs decline, companies will start introducing models that can reason and continuously learn. The applications for these will be anything that requires real-time response and accurate action. The o1 model is just one step in that direction. Continuous learning would take thinking to the next level.
Nvidia's CEO, Jensen Huang, highlighted this paradigm shift in a keynote at Stanford this summer: "Today, we learn and apply (train => inference); in the future, we will have continuous learning."
Note that inference today is a large market (we estimate >50%), but Inference Compute is a whole new use case.
In our recent webinar, we explored the evolving landscape of the AI data center market. For those who missed it, here’s a concise overview of our findings.
The AI data center value chain is split into two parts: inside the rack and outside the rack. Processors, Networking, and Memory are key areas inside the rack. Thermal Management, Power Management, and Manufacturing Equipment are key areas outside of the rack.
The key driver for Data Center investment has been capex spending by the Cloud Service Providers (CSPs). Capex growth surprised to the upside this year and we expect it to surprise again next year as companies are starting to provide qualitative commentary this quarter.
In addition to the cloud service providers, enterprises and governments are now coming to the market with incremental demand. As an example, xAI's data center cluster, Colossus (a 100K H100 GPU cluster), just came online and is expected to double in size in a few months! Moreover, per Elon Musk's comments, xAI has 300K B200s on order for 2025.
This is just one example. Every other model provider will have to follow if they want to stay competitive.
Who stands to benefit? Let's dive into the value chain.
Processors are the largest market segment, representing >60% of the total market, and are expected to grow at a 40% CAGR from 2023 to 2027. There are companies that design the chips (e.g., Nvidia, AMD (NASDAQ:AMD)), which is an asset-light business model, and companies that manufacture them (e.g., TSMC), which is an asset-heavy business model.
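To put that growth rate in perspective, here is a minimal sketch of what a 40% CAGR implies over the 2023-2027 window. The base value of 100 is purely an index for illustration, not a dollar figure from this article.

```python
# Hypothetical sketch: what a 40% CAGR from 2023 to 2027 implies.
# The base value (100) is an illustrative index, not a figure from the article.

def project(base, cagr, years):
    """Compound a starting value at a constant annual growth rate."""
    return [round(base * (1 + cagr) ** t, 1) for t in range(years + 1)]

# A market indexed to 100 in 2023, growing at 40% per year through 2027.
sizes = project(100.0, 0.40, 4)
print(dict(zip(range(2023, 2028), sizes)))
# A 40% CAGR roughly 3.8x's the market over four years (1.4^4 ≈ 3.84).
```

In other words, a segment compounding at 40% nearly quadruples over the forecast period, which is why processors remain the anchor of the AI data center value chain.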
Nvidia is the leader on the chip design side with over 80% market share, but we expect, over time, both custom chips (i.e., chips developed by the hyperscalers) and AMD to be able to capture market share.
The main point that people don't appreciate is that software is not automatically accelerated with a GPU, and accelerated computing requires algorithms to be re-designed in order to use a GPU effectively. This is where Nvidia differentiates itself, with over 400 CUDA libraries that deliver dramatically higher performance than alternatives.
Consequently, although Nvidia's and AMD's chips look comparable at face value, most customers are often unable to get similar utilization out of the AMD or custom chips. There is a subset of more sophisticated customers that build their own software and can get an even more attractive total cost of ownership (TCO) from competitors than from Nvidia's chips, but they have to do the heavy lifting.
Examples are established tech companies like Meta (NASDAQ:META) and Microsoft, which are AMD customers, and Databricks, a customer of AWS's custom chips.
In addition, with Nvidia's product innovation cycle shortened to one year, it is challenging for competitors to keep up with the development timeline. At the current utilization that customers are getting, AMD is considered to be a year behind Nvidia.
Despite Nvidia's product being superior, there is room for competitors due to capacity constraints and supply security. Some thoughts on the competitive landscape:
Networking is one of the areas with the highest growth potential and solid business models. Networking is expected to be a key driver behind achieving performance improvement for each generation of GPUs.
The key to networking growth is that it runs at a multiple of the growth in compute units, since the units all need to be connected to each other.
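A simplified sketch of why connections outpace compute units: in a full-mesh topology, every pair of nodes needs a link, so link count scales roughly with the square of node count. Real clusters use switched fabrics rather than full meshes, but the super-linear intuition is the same. This example and its numbers are illustrative, not from the article.

```python
# Illustrative sketch (not from the article): why networking demand can grow
# faster than the number of compute units. In a full mesh, every node pair
# needs a link, so links scale ~n^2 while compute units scale ~n.

def full_mesh_links(n):
    """Point-to-point links in a full mesh of n nodes: n*(n-1)/2."""
    return n * (n - 1) // 2

for gpus in (8, 72, 1024):
    print(f"{gpus:>5} GPUs -> {full_mesh_links(gpus):>7} full-mesh links")
# Doubling the node count roughly quadruples the link count. Real AI clusters
# use switched fat-tree/Clos fabrics instead of full meshes, but switch,
# cable, and interconnect counts still grow super-linearly with cluster size.
```

This is the mechanical reason networking revenue can compound at a multiple of GPU unit growth as clusters scale from thousands to hundreds of thousands of chips.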
Here are some highlights.
We expect this market to reach $100B by 2027. Key components of the networking market are:
While there are many specialized networking products, relatively few players compete in each specific technology, and they are therefore able to capture high margins.
Memory has historically been a more challenging and cyclical market due to competitive dynamics. AI requires High Bandwidth Memory (HBM), which is more complex to make, but the question is whether this time will be different from a competitive standpoint. The silver lining is that HBM takes up capacity from traditional Memory, tightening the market.
Key players in the memory market are SK Hynix, Micron (NASDAQ:MU) and Samsung (KS:005930). SK Hynix has a leadership position in HBM, but both Micron and Samsung are scaling their capacity in '25.
Power Management is an important area both outside of the rack and as it extends to the grid. It is a relatively consolidated market, with companies like Eaton (NYSE:ETN) and Schneider commanding significant market share.
Due to the increased focus on thermal management, liquid cooling has emerged as a very attractive market.
Further back in the value chain, there are the Semi-Equipment providers and Test and Measurement providers. People often miss that these sub-segments follow their own cycle (different from AI GPUs' cycle).
For example, manufacturing equipment had a strong capex cycle ('21-'23), and customers are now digesting the previously added capacity. In many cases, as in memory, the capacity for traditional products can be converted to make AI products, which, given the muted economy, results in more limited incremental demand. Even TSMC, a company that is leading the way in manufacturing, sees a muted capex cycle and guided to the lower end of its prior capex guidance.
As we get further into the AI opportunity and the $500B near-term opportunity, there will be a need for incremental manufacturing capacity, which will be positive for the equipment providers. But we are not there just yet.
One of the most significant medium-term bottlenecks for the Data Center build-out is power availability. One large data center consumes the equivalent of a small city, and by 2027 we expect Data Centers to consume the equivalent of 40 million homes, or roughly 1/3 of the US residential power market.
We are projecting that data centers will increase their share of power demand from approximately 3% in 2023 to over 10% by 2027.
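The arithmetic behind that share shift can be sketched as follows. The simplifying assumption here (ours, not the article's) is that total power demand stays roughly flat over the period, so the share change maps directly to data center demand growth.

```python
# Rough arithmetic behind the 3% -> >10% share projection, assuming
# (hypothetically) that total power demand is roughly flat over the period.
# Figures are illustrative.

def implied_cagr(start_share, end_share, years):
    """Annual growth rate needed to move from start_share to end_share
    of a flat total."""
    return (end_share / start_share) ** (1 / years) - 1

g = implied_cagr(0.03, 0.10, 4)  # ~3% of demand in 2023 -> ~10% in 2027
print(f"Implied data-center power demand CAGR: {g:.1%}")
# Going from 3% to 10% of a flat total in four years implies roughly
# 35% annual growth in data-center power consumption.
```

If overall demand also grows, the implied data center growth rate is even higher, which underscores why power availability is the bottleneck rather than chip supply alone.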