Cisco unveils new AI networking chip, taking on Broadcom and Nvidia


By Stephen Nellis

SAN FRANCISCO, Feb 10 (Reuters) - Cisco Systems CSCO.O on Tuesday launched a new chip and router designed to speed information through massive data centers that will compete against offerings from Broadcom AVGO.O and Nvidia NVDA.O for a piece of the $600 billion AI infrastructure spending boom.

Cisco said its Silicon One G300 switch chip, expected to go on sale in the second half of the year, will help the chips that train and deliver AI systems talk to each other over hundreds of thousands of links.

The chip will be made with Taiwan Semiconductor Manufacturing Co's 2330.TW 3-nanometer chipmaking technology and will have several new "shock absorber" features designed to keep networks of AI chips from bogging down when hit with large spikes of data traffic, Martin Lund, executive vice president of Cisco's common hardware group, told Reuters in an interview.

Cisco expects the chip to help some AI computing jobs get done 28% faster, in part by re-routing data around any problems in the network automatically, within microseconds.

"This happens when you have tens of thousands, hundreds of thousands of connections - it happens quite regularly," Lund said. "We focus on the total end-to-end efficiency of the network."

Networking has become a key competitive field in AI. When Nvidia unveiled its newest systems last month, one of the six key chips in the system was a networking chip that competes with Cisco's offerings. Broadcom is going after the same market with its "Tomahawk" series of chips.

(Reporting by Stephen Nellis in San Francisco; Editing by Jamie Freed)

((stephen.nellis@thomsonreuters.com))
