February 13, 2025
“We remain committed to our ‘middle lane’ approach, seeking growth at a reasonable price while carefully assessing risks such as customer concentration and the sustainability of margins in an increasingly uncertain and evolving AI landscape.”
In artificial intelligence (AI), inference is the process of using a trained model to make predictions or decisions. Once a model has been trained on vast amounts of data, inference enables it to apply what it has learned to new, unseen data, generating outputs such as classifications, predictions, or recommendations. Stock picking is similar: it involves analyzing historical data, market trends, and financial indicators to forecast a company’s future cash flows and stock performance. While AI inference relies on algorithms to make data-driven predictions, stock picking combines quantitative analysis with human judgment to interpret data and predict potential outcomes, with skilled analysts often considering factors beyond what is directly captured in the data.
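To make the analogy concrete, here is a minimal sketch of the train-then-infer pattern in Python. The toy data and the scikit-learn model are illustrative assumptions only, not anything specific to the companies discussed below:

```python
# A minimal sketch of "training" versus "inference" (illustrative toy data).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Training: the model learns patterns from historical (seen) data.
X_train = np.array([[1.0], [2.0], [3.0], [4.0]])  # one input feature
y_train = np.array([0, 0, 1, 1])                  # known outcomes
model = LogisticRegression().fit(X_train, y_train)

# Inference: the trained model is applied to new, unseen data to
# generate an output such as a classification or a probability.
X_new = np.array([[2.5]])
print(model.predict(X_new))         # predicted class for the unseen input
print(model.predict_proba(X_new))   # predicted class probabilities
```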
With that in mind, let’s apply some “human” inference to judge whether the AI boom is truly transformative or merely hype.
Nvidia (NVDA), the face of AI and now part of the illustrious Magnificent 7 (the seven dominant U.S. tech giants, Apple, Microsoft, Alphabet, Amazon, Nvidia, Meta, and Tesla, which together comprise roughly 30% of the S&P 500), is named after the Latin word invidia, meaning ‘envy’ in English. Given NVDA’s meteoric rise, it’s safe to say many companies are feeling exactly that. While AI hype is high, NVDA’s stock price isn’t driven by sentiment alone. Sales have soared over 420% since OpenAI’s ChatGPT was released in the fall of 2022, reaching over $113 billion in the trailing twelve months (see Chart 1). This massive growth has been driven by a ramp in its data center segment, which now accounts for 86% of total revenue, reaching nearly $100 billion over the past year (the segment had just $4.6 billion in sales in the summer of 2020).
At the heart of this expansion are NVDA’s graphics processing units (GPUs), particularly the H100 and the newly designed B200, which provide the high-performance computing necessary for machine learning and data center workloads, including applications like ChatGPT. As the first to market with these high-demand GPUs, NVDA has gained significant pricing power, pushing its profit margins (a key measure of how efficiently a company turns sales into earnings) sharply higher. Trailing twelve-month operating profit margin reached 63%, up from under 20% just six quarters earlier, far exceeding peers like Intel and Advanced Micro Devices (see Chart 2).
Despite the remarkable rally in the stock price, earnings and earnings expectations have kept pace, bringing its price-to-earnings ratio down compared with previous years. After peaking at 65x earnings in late 2021, NVDA now trades in line with many of its Magnificent 7 peers (see Chart 3). However, this apparent discount is largely driven by sky-high expectations of 56% earnings growth over the next twelve months (NTM), a continuation of the same trajectory we’ve seen in recent years.
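To see why lofty growth expectations can make a valuation look cheaper, consider the back-of-envelope illustration below. The 56% NTM growth figure comes from the text above; the 50x trailing multiple is purely hypothetical, chosen only to show the mechanics:

```latex
% Forward P/E compresses mechanically when high earnings growth is assumed.
% The 56% NTM growth rate is from the text; the 50x trailing multiple is hypothetical.
\[
\text{Forward P/E} \;=\; \frac{\text{Price}}{\text{NTM earnings}}
\;=\; \frac{\text{Trailing P/E}}{1+g}
\;\approx\; \frac{50\times}{1+0.56}
\;\approx\; 32\times
\]
```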
Using human inference, it’s clear this has been an impressive run, but the real question is: will it last?
NVDA’s recent success is largely driven by its major customers. While the company has a retail presence in gaming, its data center business follows an enterprise model, supplying high-performance chips directly to large corporations. NVDA doesn’t name its key clients but provides some insight, labeling them Customers A, B, C, and D and disclosing their respective revenue contributions. In the latest quarter, these four customers collectively accounted for 45% of NVDA’s total revenue. Identifying these mysterious customers isn’t too difficult, as they are likely among the Magnificent 7, more specifically the hyperscalers (the cloud service giants that power massive data centers): Microsoft, Amazon, Alphabet, and Meta.
Beyond the hyperscalers, Tesla and Oracle are also likely key customers, given their significant investments in machine learning and AI infrastructure. Tesla has been ramping up its AI-driven autonomous driving capabilities, while Oracle has been expanding its cloud services and AI offerings.
The good news for NVDA is that even though many of its major customers are developing their own AI chips, including GPUs, they have stated they will continue purchasing NVDA GPUs. Collectively, the six companies listed above plan to invest around $300 billion in 2025 to expand their advanced technology platforms.
That said, the scale of these investments is becoming increasingly material. Take Microsoft, for example. Its Azure platform is the second-largest cloud computing platform, behind only Amazon Web Services (AWS). In 2021, Microsoft spent just 27% of its operating cash flow on capital expenditures. Over the next year, the company plans to spend $66.3 billion, or nearly half of its estimated cash flow, on AI infrastructure. Such a significant investment will certainly have investors paying close attention, as they will expect the company to start delivering meaningful returns on these expenditures.
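As a rough back-of-envelope check, the roughly $133 billion of operating cash flow below is simply what the two figures above imply, not a reported number:

```latex
% Capex as a share of operating cash flow, implied by the figures in the text.
\[
\text{Capex share} \;=\; \frac{\text{Capital expenditures}}{\text{Operating cash flow}}
\;\approx\; \frac{\$66.3\text{B}}{\$133\text{B}}
\;\approx\; 50\%
\quad\text{versus roughly } 27\% \text{ in } 2021.
\]
```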
Adding even more pressure on NVDA’s key customers is the rapid advancement of AI from companies like DeepSeek in China. DeepSeek has developed AI models that achieve efficiency and performance similar to ChatGPT but at a fraction of the cost. While this breakthrough should benefit end users, as AI tools like ChatGPT may already be moving toward commoditization, it could push companies in the AI arena to reevaluate their capital spending strategies. If their investments fail to generate meaningful returns, they may be forced to reconsider their approach.
The broadband overbuild of the early 2000s is a cautionary example of how massive investment doesn’t always lead to sustainable profitability. Between 1996 and 2000, annual investment in communications equipment surged from approximately $62 billion to over $135 billion (or ~$252 billion in 2024 dollars), only for the industry to lose a collective $50 billion between 1999 and 2001 as overcapacity crushed pricing and led to widespread bankruptcies.
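For reference, the 2024-dollar figure is just an inflation adjustment of the nominal 2000 figure; the roughly 1.87x factor below is implied by the two numbers above rather than independently sourced:

```latex
% Converting the nominal 2000 investment figure into 2024 dollars.
\[
\$135\text{B (2000 dollars)} \times \frac{\text{price level}_{2024}}{\text{price level}_{2000}}
\;\approx\; \$135\text{B} \times 1.87
\;\approx\; \$252\text{B (2024 dollars)}
\]
```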
Today, companies are pouring billions into AI infrastructure, but the pressure to deliver meaningful returns is mounting. While artificial intelligence is advancing rapidly, the leap from today’s models to true general intelligence, widespread automation, and broad adoption may still be a long road.
The AI revolution has propelled Nvidia to new heights, solidifying its role as the backbone of AI infrastructure. With surging demand, triple-digit revenue growth, and industry-leading margins, NVDA’s dominance in high-performance computing is undeniable. However, questions remain about the pace of AI adoption, the future competitive landscape, and the long-term viability and profitability of the end applications.
At GHPIA, we recognize the potential of AI and have exposure to companies in this space. However, we remain committed to our “middle lane” approach, seeking growth at a reasonable price while carefully assessing risks such as customer concentration and the sustainability of profit margins in an increasingly uncertain and evolving AI landscape.