At its GTC conference, Nvidia (NASDAQ: NVDA) gave investors 1 trillion potential reasons to buy its stock. That came in the form of CEO Jensen Huang projecting that data center infrastructure capital expenditure (capex) would hit $1 trillion or more by 2028.
Investors, nonetheless, largely shrugged off the robust forecast and other upbeat news from the event. That said, if Nvidia’s projections come to fruition, the stock has a lot more upside from here.
$1 trillion in data center infrastructure capex by 2028 would be a continued acceleration of spending in the space, which would be great news for Nvidia. The company’s graphics processing units (GPUs) have become the backbone of the artificial intelligence (AI) infrastructure buildout, due to their powerful data processing abilities and ease of use.
In a chart from the presentation, Nvidia estimated 2024 data center infrastructure spending at around $400 billion. For its past fiscal year (fiscal 2025, which ended in January), the company produced total revenue of $130.5 billion, of which $115.2 billion came from its data center segment. Meanwhile, research firm Dell’Oro Group recently estimated that 2024 data center infrastructure spending reached $455 billion. Those figures imply Nvidia currently captures around 25% to 30% of this spending.
If Nvidia can hold its current share of this spending, that would translate into between $250 billion and $300 billion in data center infrastructure revenue alone in 2028. The company plans to continue to lead the way with both its chips and its software. It introduced the new Blackwell Ultra GPU at the event, which will begin shipping in the second half of this year. The new Blackwell chips are more powerful than their predecessors, making them well suited to more time-sensitive workloads. Nvidia predicted Blackwell revenue would be much greater than the revenue generated by its earlier Hopper architecture.
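For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The revenue and spending figures come from the article itself; the variable names and the assumption that Nvidia’s share simply holds steady through 2028 are illustrative, not a forecast.

```python
# Back-of-the-envelope sketch of the market-share math above.
# Figures come from the article; the projection simply assumes
# Nvidia holds its current share of spending through 2028.

nvda_dc_revenue_fy2025 = 115.2e9        # Nvidia data center revenue, fiscal 2025
spend_estimates_2024 = (400e9, 455e9)   # Nvidia's and Dell'Oro's 2024 spending estimates
projected_spend_2028 = 1e12             # Huang's $1 trillion capex projection

# Implied current share of data center infrastructure spending
share_low = nvda_dc_revenue_fy2025 / max(spend_estimates_2024)   # ~25%
share_high = nvda_dc_revenue_fy2025 / min(spend_estimates_2024)  # ~29%

# Revenue if that share holds at $1 trillion of 2028 spending
# (the article rounds the share to 25%-30%, hence its $250B-$300B range)
rev_low = share_low * projected_spend_2028    # ~$253 billion
rev_high = share_high * projected_spend_2028  # ~$288 billion

print(f"Implied share: {share_low:.0%} to {share_high:.0%}")
print(f"Implied 2028 data center revenue: ${rev_low/1e9:.0f}B to ${rev_high/1e9:.0f}B")
```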
Continuing its chip innovation, the company is also set to introduce its Vera Rubin chip, which will pair a GPU based on its next-generation Rubin architecture with a custom-designed, Arm-based central processing unit (CPU). It said the CPU will be twice as fast as the off-the-shelf part used in its earlier Grace Blackwell chips. Meanwhile, it plans to double the number of GPU dies per chip from two in current Blackwell chips to four with the “Rubin Next” chip slated to launch in the second half of 2027.
Nvidia isn’t just innovating on the hardware side. It also revealed a new open-source software system called Nvidia Dynamo that will help increase inference throughput and reduce costs. The company said the new software will help orchestrate and accelerate inference communication across thousands of GPUs. It said that Dynamo is not just an operating system for a data center, but for an entire AI factory.
Nvidia doesn’t just have its sights set on data centers, though. It’s looking to tackle the robotics and autonomous driving markets as well. Huang proclaimed that “the age of generalist robotics is here” with the introduction of Isaac GR00T N1, which he called the world’s first “open Humanoid Robot foundation model.” The model can be trained on real or synthetic data to help humanoid robots master tasks. The company thinks these robots will be able to fill menial labor jobs and help address a projected global shortage of 50 million workers.
The company will also team up with General Motors to help the automaker develop its own autonomous driving system. The move is somewhat surprising, since GM scrapped its prior attempt at a robotaxi business last year. The unit became mired in controversy when one of its Cruise robotaxis dragged a pedestrian down the road after the person was originally hit by another vehicle.
Nvidia said that in addition to supplying GPUs, it will help GM build custom AI systems. GM will also use Nvidia’s GPUs and software to train AI manufacturing models in order to build next-generation factory robots. This follows Nvidia striking a deal with Toyota last month to provide chips and software to help run its advanced driver-assistance features.
While Nvidia has been the biggest winner of the AI infrastructure buildout, it still has a very large opportunity in front of it. AI infrastructure spending is still increasing, and Nvidia is not resting on its laurels. It continues to drive innovation and is looking to make sure it’s the winner in AI inference, not just AI training. Meanwhile, it’s looking for growth beyond the data center into other large potential markets.
At the same time, Nvidia’s stock remains attractively valued following the recent market sell-off. The stock trades at a forward price-to-earnings (P/E) ratio of under 26 times this year’s analyst estimates and a price/earnings-to-growth (PEG) ratio below 0.5. A PEG below 1 is typically the threshold for a stock to be considered undervalued, and Nvidia’s multiple is well below that mark.
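To make the relationship between those two multiples concrete: PEG is simply the forward P/E divided by the expected earnings growth rate, so the figures above imply analysts are modeling growth of roughly 50% or more. Here is a minimal sketch using the article’s numbers, with the implied growth rate treated as a derived illustration rather than a published estimate.

```python
# Illustrative PEG math; the forward P/E and PEG figure come from the article,
# while the implied growth rate is derived from them, not a published estimate.

forward_pe = 26   # forward P/E on this year's analyst estimates (article figure)
peg = 0.5         # PEG ratio cited in the article (actually "below 0.5")

# PEG = forward P/E / expected annual earnings growth (in percent),
# so the growth rate implied by those two numbers is:
implied_growth_pct = forward_pe / peg   # ~52% expected earnings growth

print(f"Implied expected earnings growth: about {implied_growth_pct:.0f}% per year")
```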
As such, Nvidia looks like a solid long-term buy at these levels.
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Nvidia made this list on April 15, 2005… if you invested $1,000 at the time of our recommendation, you’d have $721,394!*
Now, it’s worth noting Stock Advisor’s total average return is 839%, a market-crushing outperformance compared to 164% for the S&P 500. Don’t miss out on the latest top 10 list, available when you join Stock Advisor.
Geoffrey Seiler has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Nvidia. The Motley Fool recommends General Motors. The Motley Fool has a disclosure policy.