Broadcom (AVGO.US) Launches Tomahawk Ultra Network Chip to Challenge Nvidia

Stock News
2025/07/16

Broadcom Inc. (AVGO.US) has initiated mass production of its groundbreaking Tomahawk Ultra network processor, engineered specifically for high-performance computing and artificial intelligence applications. This advanced silicon solution targets tightly coupled, low-latency communication patterns prevalent in HPC systems and AI clusters. Leveraging ultra-low latency switching technology and adaptive Ethernet header optimization, the chip delivers predictable, high-efficiency performance for large-scale simulations, scientific computations, and synchronized AI model training and inference.

Senior Vice President Ram Velaga said the processor competes directly with Nvidia's (NVDA.US) NVLink switch chips while connecting four times as many chips. Unlike that proprietary alternative, Tomahawk Ultra relies on an enhanced version of the open Ethernet protocol. Both companies' technologies let data center builders pack chips densely within a confined space – an approach the industry calls "scale-up" computing.

Taiwan Semiconductor Manufacturing Company (TSM.US) will fabricate the Ultra-series processors using its 5-nanometer process technology. Concurrently, Broadcom introduced SUE-Lite, an optimized variant of its Scalable Unified Ethernet specification tailored for power-and-area-sensitive accelerator applications.

The new series maintains full compatibility with Tomahawk 5 products through identical interfaces, ensuring rapid market deployment. Initial shipments have commenced for rack-scale AI training clusters and supercomputing environments.

Disclaimer: Investing involves risk, and this article does not constitute investment advice. Nothing above should be regarded as an offer, recommendation, or invitation to buy or sell any financial product, nor should any related discussions, comments, or posts by the author or other users. This article is for general reference only and does not take into account your personal investment objectives, financial situation, or needs. TTM makes no representation or warranty as to the accuracy or completeness of the information; investors should conduct their own research and seek professional advice before investing.
