The AI rally has reignited: NVIDIA has logged five straight days of gains, leaving its stock price just a step away from a new high. Behind the move is a "new AI narrative": surging demand for inference compute is opening up fresh demand for NVIDIA's chips.
This week, beginning Tuesday, NVIDIA held a three-day AI roadshow in New York, attended by CEO Jensen Huang, CFO Colette Kress, and other members of the management team. As expected, management was highly enthusiastic about AI's short- and long-term prospects, especially innovation and expansion in AI computing, emphasizing that substantial growth lies ahead.
NVIDIA's management said the industry is still early in the AI cycle. With the release of OpenAI's o1 model, a new AI narrative is unfolding: workloads are beginning to shift toward solving more complex inference problems, which will increase demand for combinations of hardware, and NVIDIA's upcoming rack-scale products are positioned as the best solution.
The long-term vision for AI is that "deep thinking" will allow every company in the world to hire large numbers of "digital AI employees" capable of performing challenging tasks.
In this regard, Morgan Stanley pointed out in its report:
The complexity of and demand for inference compute are growing exponentially, especially for task-oriented inference, which creates new growth opportunities for NVIDIA. NVIDIA's full-stack solutions hold significant advantages in solving such complex problems.
In the short term, the progress of the Blackwell product line is on schedule, and the products for the next 12 months have already been sold out, indicating strong market demand. NVIDIA expects a strong performance in 2025 and views 2026 as the early stage of a long-term investment cycle.
Inference compute will grow exponentially with "deep thinking"

Morgan Stanley noted that the management team frequently mentioned OpenAI's new model o1, which requires a longer "thinking" time during inference:
During the event, Jensen Huang repeatedly referred to OpenAI's recently released o1 model, which can generate a chain of thoughts before responding to a query. The model's output is not bound by latency constraints, allowing it to "think" for as long as needed before responding. While OpenAI has not explicitly stated the cost difference of o1 inference, some data sources suggest the cost may be around ten times that of GPT-4.
For NVIDIA's growth story, the bigger picture is that NVIDIA is excited about what models trained on Blackwell or Rubin systems will be able to achieve over the next two to three generations. On the inference side, serving a GPT-6-level model with these advanced capabilities at low latency could require an order of magnitude more compute than what counts as leading-edge computational intensity today.
Looking further out, Morgan Stanley said NVIDIA's long-term vision is that within the next decade companies will employ thousands of "digital employees" performing complex tasks, such as programmers, circuit designers, marketing project managers, and paralegals. Serving them will require a major step up in inference compute and more sophisticated hardware, and NVIDIA's Blackwell system, especially the rack-scale version, is considered a breakthrough technology.
NVIDIA's market share is expected to continue growing in 2025.
The Morgan Stanley report also pointed out that inference compute will grow exponentially, implying a significant increase in investment in inference hardware, which benefits NVIDIA's business over the long term:
NVIDIA positions Blackwell, especially the rack-scale system, as a breakthrough technology for addressing these problems. Blackwell brings a more capable processor to the AI market, but the most significant innovation may come from the GB200 system. By introducing the Grace CPU and more complex NVLink chip interconnects, it greatly enhances the ability to treat an entire rack as one massive GPU: every GPU in a 36- or 72-GPU rack sits within the same NVLink domain and can collaborate simultaneously with all the others.
In the short term, the Blackwell product line is on schedule, with products for the next 12 months already sold out, indicating strong market demand and pointing to continued high shipment growth throughout the year. In 2024 and 2025, NVIDIA's AI processor market share is likely to increase, with shipments expected to keep growing.

Regarding NVIDIA's recent stock performance, Morgan Stanley remains bullish on the company's long-term prospects, rating it "overweight" with a target price of $150. However, it also acknowledges that with the share price having rebounded, further short-term upside now rests largely on profit expectations being raised.
With consensus already embedding very high expectations for fiscal year 2025, the debate from here shifts toward fiscal year 2026 and beyond. Although we are optimistic about the long-term prospects, those debates are more difficult to resolve.
Quarterly results significantly exceeding the company's guidance, with gross margins coming in a percentage point or more above guidance, has come to be expected. At some point the size of those beats may grow, and there are some signs there could be more room for upside this quarter.