Micron Technology's First-Mover Advantage Takes a Severe Hit! NVIDIA Halts First-Generation SOCAMM Memory Development, Shifting to SOCAMM2

Deep News
2025/09/15

According to Korean media reports, NVIDIA has discontinued development of first-generation SOCAMM memory and shifted its R&D focus to SOCAMM2. SOCAMM was originally positioned as modular LPDDR memory, designed to deliver HBM-like high-bandwidth, low-power advantages in AI servers. However, technical obstacles during development have forced the plan to reset.

Previously, NVIDIA's GB300 NVL72 specifications had already listed SOCAMM1 support for up to 18 TB of LPDDR5X capacity and 14.3 TB/s of bandwidth, positioning it as an alternative memory solution for data-center computing.

Reports indicate that SOCAMM2's upgrades include raising data rates from 8,533 MT/s to 9,600 MT/s, with potential support for LPDDR6 down the line.
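The reported jump from 8,533 MT/s to 9,600 MT/s works out to roughly a 12.5% increase in per-pin data rate. A minimal sketch of that arithmetic (the figures come from the report; the generational bandwidth scaling is otherwise proportional only if bus width and channel count stay fixed, which is an assumption here):

```python
# Reported per-pin data rates (megatransfers per second)
socamm1_rate = 8533   # SOCAMM1, LPDDR5X
socamm2_rate = 9600   # SOCAMM2 target

# Relative uplift in data rate
uplift_pct = (socamm2_rate / socamm1_rate - 1) * 100
print(f"SOCAMM2 data-rate uplift: {uplift_pct:.1f}%")  # ~12.5%

# Illustrative only: if bus width and channel count were unchanged,
# aggregate bandwidth would scale by the same factor.
gb300_bandwidth_tbs = 14.3  # GB300 NVL72 figure cited for SOCAMM1
scaled_bandwidth = gb300_bandwidth_tbs * socamm2_rate / socamm1_rate
print(f"Proportionally scaled bandwidth: {scaled_bandwidth:.1f} TB/s")
```

This is back-of-the-envelope scaling, not an NVIDIA specification; actual SOCAMM2 bandwidth would also depend on module width, channel configuration, and whether LPDDR6 is adopted.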

Among suppliers, Micron Technology announced mass production of SOCAMM1 in March this year, becoming the first memory maker to bring the product into AI servers. By contrast, Samsung and SK Hynix only disclosed in earnings calls that they plan to begin mass production in the third quarter of 2025. If SOCAMM1 is indeed discontinued, competitors would have an opening to narrow the gap with Micron.

The market previously estimated that SOCAMM shipments could reach 600,000 to 800,000 units in 2025, underscoring NVIDIA's original push for rapid adoption. The direct transition from SOCAMM1 to SOCAMM2 may indicate that NVIDIA is accelerating upgrades to maintain its lead in the AI race.
