Cisco Unveils Advanced AI Networking Chip to Challenge Broadcom and NVIDIA

Deep News
02/10

Cisco Systems on Tuesday introduced a new networking chip and router designed to accelerate data transmission in large-scale data centers. The move positions the company to compete with Broadcom and NVIDIA in the booming artificial intelligence infrastructure market, estimated at $600 billion.

Cisco said its Silicon One G300 switching chip is expected to launch in the second half of this year. The chip is engineered to efficiently interconnect the processors used for training and running AI systems across hundreds of thousands of links.

In an interview, Martin Lund, Executive Vice President of Cisco's Common Hardware Group, revealed that the chip will be manufactured using TSMC's 3-nanometer process technology. It incorporates new "buffering and shock absorption" features aimed at preventing congestion and latency in AI chip networks during sudden, massive surges in data traffic.

Cisco projects that the chip can accelerate the completion of certain AI computing tasks by up to 28%. This performance gain is partly attributed to its ability to automatically bypass network failures and reroute data within microseconds.

"When you have tens of thousands, even hundreds of thousands of connections, this scenario occurs frequently," Lund said. "We are focused on the end-to-end overall efficiency of the network."

Networking technology has become a critical battleground in the AI sector. Last month, when NVIDIA unveiled its latest system, one of its six key chips was a networking component that directly competes with Cisco's offerings. Broadcom is also contending for the same market segment with its Tomahawk series of chips.
