Nvidia's Jensen Huang Used To Be The Stock's Best Salesman. Now He Can Barely Make It Budge

Dow Jones
Yesterday

The stock reaction wasn't quite what Nvidia CEO Jensen Huang was hoping for after a keynote speech. Nvidia shares were down 0.7% on Tuesday after gaining just 1.7% the previous day.

Admittedly, the day after its last GTC conference in October 2025, the stock closed down 2%, according to Dow Jones Market Data. But after the two events prior to that, it closed up 3.15% in March 2025 and up 3.12% in March 2024.

Huang has been struggling to reignite Nvidia's rally amid a difficult environment for technology stocks, even after impressive hardware reveals.

The stock has largely been stuck in a range of $180 to $190 since last summer. Concerns about the sustainability of spending on artificial-intelligence infrastructure, which first surfaced late last year, have since been compounded by market worries over the Iran conflict, which have reduced expectations for rate cuts and raised fears of recession.

That's a tough environment for the world's largest publicly traded company to gain any momentum, and the company's GTC conference this week doesn't look like it will change that.

The biggest headline amid a flurry of announcements on Monday was Huang's statement that the company expects to sell at least $1 trillion worth of Blackwell and Rubin chips from 2025 through the end of 2027. He had previously forecast a revenue opportunity of $500 billion by the end of 2026.

Importantly, Nvidia executives clarified the estimate was only for the Blackwell and Rubin processors, suggesting its overall data-center revenue -- which includes other products -- could be ahead of Wall Street expectations of close to $1 trillion for the same period.

Still, analysts were cautious about the extent to which Huang's comments will move consensus forecasts.

"We felt Jensen touched on the key points being discussed by the investor community, with the updated backlog validating rather than raising current estimates," Stifel analyst Ruben Roy wrote in a research note.

Roy has a Buy rating and $250 target price on the stock. That's based on a price-to-earnings multiple of 25 times his forecast for Nvidia's earnings in 2027. Nvidia currently trades at a forward earnings multiple of less than 22 times, according to FactSet.
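As a quick sanity check of the valuation math above (the per-share earnings figure below is implied by the note's numbers, not stated in it), a price target built on a P/E multiple is simply the multiple times the earnings forecast:

```python
def target_price(eps_forecast: float, pe_multiple: float) -> float:
    """Price implied by an earnings-per-share forecast and a P/E multiple."""
    return eps_forecast * pe_multiple

# A $250 target at 25x 2027 earnings implies roughly $10 in forecast EPS.
implied_eps = 250 / 25
print(implied_eps)                      # 10.0
print(target_price(implied_eps, 25))    # 250.0
```

The same arithmetic shows why the current sub-22x forward multiple leaves room below Roy's assumed 25x, provided the earnings forecast holds.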

As expected, Nvidia unveiled a new hardware system aimed at inference, the process of generating output from AI models. The Nvidia Groq 3 LPX rack combines 72 of Nvidia's next-generation Vera Rubin servers with 256 chips called language processing units (LPUs), developed by start-up Groq. Nvidia is on track to deliver Vera Rubin systems in the second half of 2026, Huang said.

The new inference system should help Nvidia fend off the growing challenge from custom chips that offer cost-effective inference, such as Google's Tensor Processing Units, which were designed in collaboration with Broadcom. Analysts at Mizuho estimate that between 20% and 40% of AI workloads are currently dedicated to inference, and that the share will grow to between 60% and 80% over the next five years.

But independent analyst Richard Windsor, who publishes Radio Free Mobile, argues that better inference will only be a benefit if the companies building data centers can sustainably generate more revenue from their pricey Nvidia hardware.

"Given that Nvidia is clearly leading this sector, it will also cement its dominance in inference as well as training," Windsor wrote. "However, [Nvidia's current-generation] Blackwell was supposed to deliver similar economic benefits, but intense competition meant that the price of tokens fell, resulting in revenue remaining static."

Still, with the GTC conference continuing until Thursday, expect plenty more news as Nvidia looks to tighten its grip on the AI chip throne.

