Micron Technology's First-Mover Advantage Severely Hit! NVIDIA Halts First-Generation SOCAMM Memory Development: Shifting to SOCAMM2

Deep News
2025/09/15

According to Korean media reports, NVIDIA has discontinued development of first-generation SOCAMM memory and shifted its R&D focus to SOCAMM2. SOCAMM was originally positioned as modular LPDDR memory, designed to deliver HBM-like high bandwidth and low power consumption in AI servers. However, technical obstacles during development have forced the plan to be reset.

Previously, NVIDIA had already listed SOCAMM1 in its GB300 NVL72 specifications, supporting up to 18TB of LPDDR5X capacity and 14.3TB/s of bandwidth, and positioning it as an alternative memory solution for data center computing.

Reports indicate that SOCAMM2's upgrades include raising transfer rates from 8533 MT/s to 9600 MT/s, with potential support for LPDDR6.
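To put those speed grades in perspective, a rough per-module bandwidth estimate can be derived from the transfer rate. The 128-bit interface width below is an assumption commonly cited for SOCAMM's LPDDR5X layout, not a figure from this report:

```python
# Rough peak-bandwidth estimate for a SOCAMM module from its transfer rate.
# Assumption (not from the article): a 128-bit data interface per module.

def module_bandwidth_gbs(transfer_rate_mts: int, bus_width_bits: int = 128) -> float:
    """Peak bandwidth in GB/s = transfers/s * bytes per transfer / 1e9."""
    return transfer_rate_mts * (bus_width_bits // 8) / 1000

socamm1 = module_bandwidth_gbs(8533)   # first-generation speed grade
socamm2 = module_bandwidth_gbs(9600)   # reported SOCAMM2 target
uplift = socamm2 / socamm1 - 1

print(f"SOCAMM1: {socamm1:.1f} GB/s, SOCAMM2: {socamm2:.1f} GB/s, uplift: {uplift:.1%}")
```

Under this assumption the jump from 8533 MT/s to 9600 MT/s works out to roughly a 12.5% increase in per-module peak bandwidth.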

Among suppliers, Micron Technology announced mass production of SOCAMM1 in March this year, becoming the first memory vendor to bring the product into AI servers. By contrast, Samsung and SK Hynix have only disclosed in earnings calls that they plan to begin mass production in the third quarter of 2025. If SOCAMM1's discontinuation is confirmed, competitors would have an opening to narrow the gap with Micron.

The market previously estimated that SOCAMM shipments could reach 600,000 to 800,000 units in 2025, highlighting NVIDIA's original intention for rapid adoption. This direct transition from SOCAMM1 to SOCAMM2 may indicate that NVIDIA is accelerating upgrades to maintain its leading position in AI competition.

Disclaimer: Investing involves risk. This article is not investment advice, and the content above should not be regarded as an offer, recommendation, or invitation to buy or sell any financial product; nor should any related discussions, comments, or posts by the author or other users be treated as such. This article is for general reference only and does not take into account your personal investment objectives, financial situation, or needs. TTM assumes no responsibility or guarantee for the accuracy or completeness of the information; investors should conduct their own research and seek professional advice before investing.
