Nvidia Restarting Manufacturing Of China AI Chip Variant, CEO Says

Reuters
Yesterday

SAN JOSE, California, March 17 (Reuters) - Nvidia NVDA.O is restarting manufacturing of one of the company’s chips that is designed to comply with U.S. export restrictions on China, CEO Jensen Huang said at a press conference on Tuesday.

The company had halted production last year of its H200 chip, which is based on its aging Hopper technology, because of increasing regulatory hurdles in the U.S. and China, according to a report at the time.

Since then, Nvidia has received licenses from the U.S. government to export the H200 and has taken orders, Huang said. As a result, Nvidia began restarting manufacturing several weeks ago.

“Our supply chain is getting fired up,” Huang said.

The China chip sales are not included in the forecast for more than $1 trillion in revenue that Huang made for the company's Blackwell and Rubin AI chips by the end of 2027.

Blackwell and Rubin are Nvidia's flagship AI chips, used to train the large language models that underpin chatbots such as OpenAI's ChatGPT. Blackwell chips are available for purchase, while Rubin chips are Nvidia's next-generation processors and are in full production.

The $1 trillion estimate Huang issued does not include a swath of the company's other products such as its central processing units, its range of networking chips or the forthcoming chips based on the technology it licensed from Groq. The estimate also does not include a Rubin variant known as Rubin Ultra.

In December, Nvidia signed a deal to license Groq's tech and hired many of the startup's executives.
