SK Hynix Bets on "AI Memory Super Cycle," Plans 8-Fold Increase in 10nm DRAM Output Next Year

Deep News
Nov 20, 2025

SK Hynix is aggressively expanding production capacity for advanced memory chips, betting on market opportunities driven by AI applications shifting from training to inference.

According to South Korean media reports on November 20, the memory chip giant plans to increase monthly production of its sixth-generation 10nm DRAM (1c DRAM) from the current 20,000 300mm wafers to 160,000–190,000 wafers next year, a roughly 8- to 9.5-fold surge. This would account for over one-third of its total DRAM capacity.

Industry sources indicate the expanded 1c DRAM output will primarily supply GDDR7 and SOCAMM2 products to meet demand from major tech firms like Nvidia. This strategic shift reflects surging demand for cost-effective general-purpose DRAM in AI inference applications, as the company broadens its focus beyond high-bandwidth memory (HBM) to the wider AI memory market.

Recent reports reveal SK Hynix successfully negotiated over 50% price increases for HBM4 with Nvidia, securing rates above $500 per unit. The company has reportedly sold out next year's production capacity in advance, positioning itself favorably in both HBM and general DRAM markets.

Analysts project SK Hynix's facility investments will easily surpass 30 trillion won next year, up significantly from this year's estimated 25 trillion won. Market expectations suggest the company could achieve record operating profits exceeding 70 trillion won in 2025.

The capacity expansion focuses on cutting-edge 1c DRAM technology. Sources say SK Hynix plans to add at least 140,000 wafers monthly through process upgrades at its Icheon campus, which is considered the minimum increase; some industry observers suggest the addition could reach 160,000–170,000 wafers.

With current monthly DRAM wafer input averaging 500,000 units, over one-third will be allocated to advanced 1c DRAM production. The company has achieved over 80% yield rates for 1c DRAM, used primarily in DDR5, LPDDR, and GDDR7 products.
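As a quick sanity check on the figures above, a few lines of Python confirm the expansion multiple and the share of total wafer input. All numbers are the article's; the variable names and calculation are illustrative only:

```python
# Sanity check of the reported capacity figures (article's numbers,
# in 300mm wafer starts per month; names are illustrative).
current_1c = 20_000
planned_low, planned_high = 160_000, 190_000
total_dram_input = 500_000

growth_low = planned_low / current_1c     # 8.0x expansion
growth_high = planned_high / current_1c   # 9.5x expansion
share_low = planned_low / total_dram_input    # 32% of total input
share_high = planned_high / total_dram_input  # 38% of total input

print(f"Expansion: {growth_low:.1f}x to {growth_high:.1f}x")
print(f"Share of DRAM input: {share_low:.0%} to {share_high:.0%}")
```

The check shows the plan spans an 8x to 9.5x increase and roughly 32–38% of total monthly wafer input, consistent with the "over one-third" figure at the upper end of the range.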

This aggressive expansion reflects SK Hynix's confidence in sustained AI-driven memory demand. Analysts note that compared to HBM's complex stacking process, 1c DRAM offers higher production efficiency to meet explosive market demand.

The strategic pivot responds to shifting AI application trends. While HBM was previously prioritized, general DRAM demand is now expected to match HBM growth as AI models expand into inference applications. Energy-efficient advanced DRAM has become the mainstream choice for AI inference.

Nvidia's new Rubin CPX AI accelerator uses GDDR memory instead of HBM, deployed directly alongside processors. Tech giants like Google, OpenAI, and AWS are also developing custom AI accelerators integrating large amounts of general DRAM.

Nvidia's SOCAMM2 memory modules also utilize 1c DRAM. The SOCAMM standard offers lower bandwidth but higher energy efficiency than HBM. Industry watchers predict SK Hynix will secure supply orders for SOCAMM2 modules deployed with Nvidia's Vera CPU.

DRAMeXchange data shows DDR4 fixed contract prices exceeded $7 in September, hitting a 6-year-10-month high. This reflects supply bottlenecks as chipmakers prioritized HBM production in recent years.

SK Hynix's successful 50%+ price increase for HBM4—slated for Nvidia's Rubin AI chips launching late next year—reflects technological advancements. The new HBM4 doubles data channels to 2,048 compared to HBM3E and incorporates new logic processes for computational efficiency and power management.

After initial resistance, Nvidia accepted the price hike considering SK Hynix's technological lead. The company began supplying initial HBM4 batches in June after delivering the world's first 12-layer stack samples in March.

While expanding general DRAM capacity, SK Hynix continues advancing its HBM roadmap. Its new M15X facility in Cheongju, set for year-end operation, will produce 60,000 wafers monthly of 1b DRAM for HBM4.

With an estimated 60% margin on HBM4, analysts project 40–42 trillion won in HBM sales next year. Maintaining that margin would generate roughly 25 trillion won in HBM operating profit, a 50% increase from 2024.
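The profit arithmetic in the paragraph above can be verified in a couple of lines. The sales and margin figures are the article's, in trillion won; the calculation itself is a sketch, not the analysts' model:

```python
# HBM profit implied by the article's figures (trillion won).
hbm_sales_low, hbm_sales_high = 40, 42  # projected HBM4 sales next year
operating_margin = 0.60                 # estimated HBM4 operating margin

profit_low = hbm_sales_low * operating_margin    # 24.0
profit_high = hbm_sales_high * operating_margin  # ~25.2, matching the
                                                 # ~25 trillion won cited
print(f"Implied HBM operating profit: "
      f"{profit_low:.1f}-{profit_high:.1f} trillion won")
```

The implied 24–25.2 trillion won range lines up with the roughly 25 trillion won operating profit the article cites.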

As AI infrastructure investments drive up general DRAM prices, SK Hynix could achieve 50–60% operating margins in this segment. Having sold out next year's capacity in advance, the company is positioned to maintain high profitability.

Combining HBM4's premium pricing and general DRAM price surges, market watchers forecast SK Hynix could surpass 70 trillion won in operating profits next year—a historic high—supported by secured HBM4 margins, locked-in capacity, and robust AI memory demand.

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products; nor should any associated discussions, comments, or posts by the author or other users be considered as such. It is for general information purposes only and does not take into account your investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
