Here's What Nvidia Investors Can Look Forward to in 2026

Dow Jones
12/24

Nvidia's stock has been under pressure in recent weeks from growing fears around artificial intelligence spending and emerging threats to the company's dominance. These questions will undoubtedly follow the chip maker into the new year, but Wall Street is seemingly confident in what's ahead as the AI buildout continues.

Going into next year, Nvidia will still be fighting the bearish narrative that has emerged from the skepticism around AI spending, John Belton, a portfolio manager at Gabelli Funds, told MarketWatch.

The stock, which had lost as much as 17.4% last week from its Oct. 29 record close of $207.04, has recovered slightly but was still 8.6% below that record through Tuesday. While the stock has rallied 40.9% in 2025, that's less than the Philadelphia Semiconductor Index's advance of 44.3% and marks a sharp deceleration from the 171.2% run-up last year.

However, the market is still in an environment of strong demand for compute power, Jed Ellerbroek, a portfolio manager at Argent Capital Management, told MarketWatch. Nvidia is one of a handful of companies supplying most of the market, he said, and its supply chain has expanded significantly in recent years as a result.

In the coming year, Ellerbroek said he expects demand for compute to remain ahead of supply. He also doesn't see a worrisome AI bubble forming just yet.

Ellerbroek pointed to shortages in the supply chain as one reason.

"Obviously power in the U.S. is in a shortage, and memory is now also in a shortage," he said. "The bounce in those stocks over the last couple of months has just been incredible." He added that he doesn't see shortages in these sectors turning into excess in the near term.

Another reason he doesn't see a bubble is because Taiwan Semiconductor Manufacturing, which manufactures most of the world's advanced chips, is expanding capacity conservatively. That means the company will continue to have pricing power, which also grants pricing power to customers like Nvidia and Apple, Ellerbroek said.

Belton said the supply-and-demand imbalance in the memory market will likely continue for the foreseeable future, but he sees improvements there starting in the back half of next year. One positive sign from Nvidia's earnings report in November, he said, is that the company is guiding for gross margins in the mid-70% range in the coming year.

The fact that Nvidia feels comfortable with that gross-margin range shows the company has been preparing for a tight memory market and has locked in good pricing, Belton said.

Meanwhile, Ellerbroek is looking forward to seeing what capabilities new AI models will have after being trained on Nvidia's latest Blackwell chips and on its coming Rubin chips. The next version of OpenAI's GPT large language model is expected to come out in the first few months of next year, and Ellerbroek noted that it will likely be among the first Blackwell-trained AI models on the market.

"We need to see a big step up in the quality of these models and what they can do and how they can be rolled into products and impact businesses and consumers," Ellerbroek said.

Belton also thinks OpenAI's release of its new GPT model, and the performance reviews that follow, will be important for Nvidia.

"A very successful launch of the new OpenAI model could say a lot about where we are in the evolution of AI technology, and whether scaling laws are still intact," Belton said, referring to the rules that show how an AI system's performance improves with increasing resources such as training data and compute power. "Assuming it's successful, it could point to the lead that Nvidia still has."

He likened the market reaction he expects for OpenAI's next GPT model to the response to the recent release of Google's Gemini 3 AI model. Google spent much of the year viewed as an AI laggard, but its Gemini 3 launch in November sparked enthusiasm among investors that the company is actually not far behind. The tech giant said its Gemini 3 model outperformed OpenAI's previous GPT-5.1 model and Anthropic's Claude Sonnet 4.5 on a series of benchmarks.

Gemini 3 was trained on Google's custom chips co-developed with Broadcom, which raised questions in the market over the competitive advantage of Nvidia's graphics processing units over application-specific integrated circuits, or ASICs, which are custom-designed to carry out specific tasks.

But in Belton's view, the reaction from the market over Gemini 3 was less about Nvidia's chips competing with ASICs like Google's TPU, and more about Google's Gemini versus OpenAI's ChatGPT.

"I think there was this realization that OpenAI is not running away with this market like many have thought," Belton said. "That has implications for OpenAI suppliers which include Nvidia."

However, Belton said he doesn't see custom silicon programs posing a real competitive threat to Nvidia, given the difficulty of standing up a chip program and how closely each company's chip is tailored to its own systems. Merchant silicon from Nvidia is general-purpose and therefore works well in many different data centers.

"I think Nvidia is always going to be the first solution for any customer," Belton said. "Everything else is going to be for a small number of customers." He added that he believes the best alternative to the latest-generation Nvidia chip will always be its predecessor.

In the coming months, Belton said the market will be closely watching the progress of construction for OpenAI's next Stargate data centers. Through OpenAI, Belton said the market will get important data points around the financial dynamics of the AI data center buildout. One positive metric would be an update to OpenAI's 2026 annual recurring revenue targets, he said.

Meanwhile, OpenAI is expecting to raise up to $100 billion in its next fundraising round, which could value the company at $830 billion if it meets its full target, the Wall Street Journal reported. The funding is reportedly expected to be completed by the end of the first quarter.

"As long as you have all of those metrics heading in the right direction, any long-term investor is going to be fine and feel comfortable with any sort of delays or lack thereof" in the AI infrastructure buildout, Belton said.

In 2026, investors should be looking at whether OpenAI will turn out to be a good ecosystem partner, he said, or if it's an "albatross in the making and something that's going to cause some serious overbuilding and stress in the ecosystem for years to come."

Paul Meeks, managing director at Freedom Capital Markets, told MarketWatch that one area he'll be watching closely next year is the on-again, off-again export of Nvidia's GPUs to China. Earlier this week, Reuters reported that Nvidia has told Chinese customers that it plans to start shipping its H200 chips to the country in February.

The shipments could include between 40,000 and 80,000 of the advanced chips from the company's existing stockpile, Reuters reported, citing unnamed people familiar with the matter. The H200 has been used to train powerful AI models from companies including OpenAI and Meta Platforms, and is far more capable than the H20 chip Nvidia previously designed for the Chinese market to meet U.S. export control requirements.

If the chip maker is allowed to ship its chips to China with lower restrictions, "its sales in 2026 may rise 15% above [Wall] Street expectations," Meeks told MarketWatch in emailed comments.
