Cisco Unveils Advanced AI Networking Chip to Challenge Broadcom and NVIDIA

Deep News
02/10

Cisco Systems on Tuesday introduced a new chip and router designed to accelerate data transmission in large-scale data centers. The move positions the company to compete with Broadcom and NVIDIA in the booming artificial intelligence infrastructure market, valued at $600 billion.

Cisco stated that its Silicon One G300 switching chip is expected to launch in the second half of this year. The chip is engineered to efficiently interconnect various chips used for training and operating AI systems across hundreds of thousands of links.

In an interview, Martin Lund, Executive Vice President of Cisco's Common Hardware Group, revealed that the chip will be manufactured using TSMC's 3-nanometer process technology. It incorporates new "buffering and shock absorption" features aimed at preventing congestion and latency in AI chip networks during sudden, massive surges in data traffic.

Cisco projects that the chip can accelerate the completion of certain AI computing tasks by up to 28%. This performance gain is partly attributed to its ability to automatically bypass network failures and reroute data within microseconds.

"When you have tens of thousands, even hundreds of thousands of connections, this scenario occurs frequently," Lund said. "We are focused on the end-to-end overall efficiency of the network."

Networking technology has become a critical battleground in the AI sector. Last month, when NVIDIA unveiled its latest system, one of its six key chips was a networking component that directly competes with Cisco's offerings. Broadcom is also contending for the same market segment with its Tomahawk series of chips.

Disclaimer: Investment involves risk, and this article is not investment advice. The above content should not be regarded as an offer, recommendation, or invitation to buy or sell any financial product, nor should any related discussion, comment, or post by the author or other users be treated as such. This article is for general reference only and does not take into account your personal investment objectives, financial situation, or needs. TTM assumes no responsibility for, and makes no guarantee of, the accuracy or completeness of the information; investors should conduct their own research and seek professional advice before investing.
