Cisco unveils new AI networking chip, taking on Broadcom and Nvidia

Reuters
02/10
By Stephen Nellis

SAN FRANCISCO, Feb 10 (Reuters) - Cisco Systems CSCO.O on Tuesday launched a new chip and router designed to speed information through massive data centers that will compete against offerings from Broadcom AVGO.O and Nvidia NVDA.O for a piece of the $600 billion AI infrastructure spending boom.

Cisco said its Silicon One G300 switch chip, expected to go on sale in the second half of the year, will help the chips that train and deliver AI systems talk to each other over hundreds of thousands of links.

The chip will be made with Taiwan Semiconductor Manufacturing Co's 2330.TW 3-nanometer chipmaking technology and will have several new "shock absorber" features designed to keep networks of AI chips from bogging down when hit with large spikes of data traffic, Martin Lund, executive vice president of Cisco's common hardware group, told Reuters in an interview.

Cisco expects the chip to help some AI computing jobs get done 28% faster, in part by automatically re-routing data around any problems in the network within microseconds.

"This happens when you have tens of thousands, hundreds of thousands of connections - it happens quite regularly," Lund said. "We focus on the total end-to-end efficiency of the network."

Networking has become a key competitive field in AI. When Nvidia unveiled its newest systems last month, one of the six key chips in the system was a networking chip that competes with Cisco's offerings. Broadcom is going after the same market with its "Tomahawk" series of chips.

(Reporting by Stephen Nellis in San Francisco; Editing by Jamie Freed)

((stephen.nellis@thomsonreuters.com))
