Surprise or Not! Micron Technology (MU.US), with a 180% Year-to-Date Surge, is a Rare "Value Play" Amid the Memory Super Cycle

Stock News
05/12

Micron Technology (MU.US), the U.S.-based memory chip manufacturing giant, has become one of the most critical players in AI computing infrastructure. The core reason lies in the insatiable demand from hyperscale cloud providers for high-bandwidth memory (HBM), a high-performance memory chip that only SK Hynix, Samsung Electronics, and Micron can produce at scale globally. This "memory chip super cycle" is unique because the supply shortage is more structural and less transient than in any previous demand boom. Beginning in the second half of 2025, the cycle has given Micron a significant opportunity to transform from a typical cyclical commodity producer into a strategic AI infrastructure supplier, markedly enhancing its earnings visibility and business durability. Crucially, compared with other global AI infrastructure suppliers such as NVIDIA, AMD, SanDisk, Western Digital, Seagate, and Intel, Micron's valuation appears considerably cheaper, making it more attractive for long-term institutional investment relative to other North American AI ecosystem leaders.

AI Is Changing the Rules of Memory Demand

The global frenzy for AI data center construction has created near-limitless demand for memory chips. For much of the period since 2023, AI GPUs/ASICs, high-performance networking, and data center power systems dominated the first two phases of the AI infrastructure build-out. Micron's management recently highlighted that HBM for AI training/inference systems, high-performance DRAM, and SSD capacity will become the most critical supply bottlenecks going forward. Modern AI infrastructure, driven by simultaneous advances in inference models, algorithms, robotic physical AI, and longer context windows, creates immense demand for bandwidth, memory, and storage capacity.
In its latest earnings call, Micron's management consistently emphasized that current AI computing demand far outstrips supply, with significant new memory chip capacity not expected to come online until 2028. This provides a historically strong precondition for sustaining high memory prices and capacity utilization, likely for much longer than in previous cycles. AI infrastructure demand is also less cyclical than traditional PC or smartphone demand: hyperscale cloud providers are no longer buying memory merely to expand cloud infrastructure but are racing to build out cloud-based AI inference resources as fast as possible. Possessing the best AI infrastructure confers a major competitive advantage in cloud services, AI application software, and even cloud-based national security services. This trend has significantly altered the procurement process.

Micron's management has repeatedly stressed in recent interviews that supply of DRAM and NAND memory chips remains far below demand, and that new capacity is unlikely to materialize before fiscal 2028. HBM manufacturing involves unique processes, including the industry's most complex advanced packaging, through-silicon vias (TSVs), and difficult yield challenges, while strict cleanroom constraints and higher energy efficiency requirements make it hard to expand capacity quickly in response to soaring prices. Traditionally, supply would ramp quickly once prices turned highly favorable for memory makers. The industry now faces numerous structural constraints: HBM, with its extremely complex manufacturing and packaging, consumes disproportionate capacity; supply elasticity for general-purpose DRAM/NAND remains insufficient; and AI-driven demand growth keeps exceeding expectations. Without sufficient memory, AI models must recompute from scratch.
HBM, DDR5, and SSD Jointly Drive the Memory Super Cycle

In its earnings call, Micron's management specifically cited booming demand for high-capacity data center SSDs for AI infrastructure, KV cache deployments, and PCIe Gen6 SSDs tied to NVIDIA's AI computing clusters. This indicates that AI-related memory demand is far broader than many Wall Street analysts anticipated. Modern AI infrastructure not only consumes more HBM but also requires high-bandwidth DRAM, greater storage capacity, and high-speed SSD infrastructure to meet growing needs for retrieval and agentic AI workloads. Emerging AI applications, including robotics, multi-agent systems, and multimodal inference models, are continuously creating new demand vectors for memory, suggesting that AI storage intensity may keep growing exponentially even after initial AI deployment.

South Korea's KOSPI index, heavily weighted with Samsung and SK Hynix, has hit record highs with an 85% year-to-date surge despite geopolitical tensions, while the Taiwan market, led by TSMC, the "king of chip foundries" and a major AI beneficiary, has also reached new highs. Alongside a record 18-day winning streak for the Philadelphia Semiconductor Index and the S&P 500's six-week rally to repeated records, investors are increasingly convinced that the "AI computing investment theme" can drown out all market noise, especially around Middle East geopolitics. As Jeremy Werner, Senior Vice President and General Manager of Micron's Data Center Business Unit, noted in a recent interview, from the engineering logic of data flow in AI data centers, the underlying driver of this trend is not simply "AI needs more compute chips." Rather, the era of AI inference, led by agents like Claude Cowork and OpenClaw, is pushing memory/storage from a supporting component to a system bottleneck.
AI training relies more on massive parallel computing, while inference, especially with long contexts, multi-turn conversations, and agentic AI workflows, requires continuously saving the KV cache, context states, and intermediate results. When memory/storage is insufficient, models must recompute historical states, leading to lower GPU utilization and higher token generation costs. Therefore HBM, DDR5, LPDDR, enterprise SSDs, and even HDDs/data lakes are forming an "AI memory chain" from GPU-proximate to remote storage, determining an AI system's throughput, latency, concurrency, and per-token economics. This explains the synchronized surge in memory and data storage stocks like Micron, Samsung, SK Hynix, SanDisk, and Western Digital: demand is not concentrated solely on HBM but is spilling over across the entire chain (DRAM, NAND, SSDs, and HDDs) along the AI server architecture.

More critically, AI CPUs are opening a second demand curve. The market previously equated AI computing power almost exclusively with GPUs and HBM. But as inference workloads grow more complex, CPUs are evolving from "GPU supporting actors" into "AI coordinators" that schedule multiple agents, manage context, and orchestrate workflows, which will significantly boost demand for server DDR5 and data-center-grade SSD configurations. Meanwhile, HBM capacity is heavily locked up by AI GPUs, squeezing the capacity available for general-purpose DRAM and causing price divergence between DDR5 and DDR4. The memory shortage is spreading from high-end HBM to the broader DRAM/NAND supply chain. TrendForce likewise cited Micron's CEO's latest view that both traditional server and AI server demand are strong but constrained by tight DRAM and NAND supply, and Samsung and SK Hynix recently warned that AI-driven memory shortages could persist until 2028 or beyond.
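The KV-cache arithmetic behind this bottleneck is easy to sketch. The model shape below (80 layers, 8 KV heads, head dimension 128, FP16 values) is an illustrative assumption, roughly a 70B-class open model, not a figure from the article:

```python
def kv_cache_bytes_per_token(layers: int, kv_heads: int, head_dim: int,
                             bytes_per_elem: int = 2) -> int:
    """Bytes of KV cache one token occupies: keys + values, across all layers."""
    return 2 * layers * kv_heads * head_dim * bytes_per_elem

# Illustrative 70B-class model shape (assumed, not from the article).
per_token = kv_cache_bytes_per_token(layers=80, kv_heads=8, head_dim=128)
print(per_token)  # 327,680 bytes, about 320 KiB per token

# A single 128K-token context then pins about 40 GiB of memory; without
# that memory, the model must recompute historical states on every step.
context_gib = per_token * 128 * 1024 / 2**30
print(context_gib)  # 40.0
```

Multiply that by thousands of concurrent sessions per cluster and the spillover from HBM into DRAM and SSD tiers follows directly.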
Micron's Valuation Is Significantly Cheaper Than Most AI Ecosystem Leaders

On DRAM/NAND price increases, Wall Street giant Goldman Sachs' latest assessment is that the 2026 memory price hikes will far exceed its previously optimistic forecasts. Goldman recently raised its DRAM price increase forecast from approximately 150% to 250%-280% and its NAND forecast from about 100% to 200%-250%. In other words, Goldman believes this is not an ordinary inventory recovery cycle but a "super supply shortage cycle" driven by unprecedented AI computing demand, HBM's complex manufacturing crowding out capacity, and insufficient supply elasticity for general-purpose DRAM/NAND. GPUs generate intelligence, HBM/DRAM feeds them data at high speed, enterprise NAND/eSSD handles hot data and caching, and HDDs store massive amounts of cold/warm data long-term. Goldman therefore argues that the AI computing arms race led by cloud giants is transforming memory chips from cyclical commodities into scarce strategic assets, and that the 2026 DRAM/NAND price increases are not the end but potentially the initial phase of a super cycle.

Micron's actual financial performance has changed significantly in recent years. Wall Street traditionally viewed Micron as a cyclical tech company producing commoditized memory, with substantial financial volatility, but this has changed rapidly. In the most recent quarter, operating cash flow approached $12 billion, adjusted free cash flow reached $7 billion, and the company decided to raise its dividend by 30%, reflecting management's high confidence in the sustainability of current profits. Operationally, the company successfully launched 1-gamma DRAM, G9 NAND, and HBM3E products and expressed confidence in the HBM4 ramp. Furthermore, as the sole U.S. manufacturer of both DRAM and NAND, Micron's technology leadership and capacity hold significant importance for North American investors. Considering government support under the U.S. CHIPS Act and geopolitical risks, Micron's strategic importance in the tech rivalry has increased further. In other words, its business model has changed: in an era where global semiconductor autonomy is increasingly vital, Micron gains a competitive edge through its capacity advantage.

From a valuation perspective, Micron's stock closed around $746 per share last Friday, a staggering 180% year-to-date gain. Yet its valuation stands at a forward P/E of just 12.8x based on the fiscal 2026 consensus EPS of $58.11 and a mere 7.3x based on the fiscal 2027 consensus EPS of $101.78. Even if earnings approach the lower end of fiscal 2027 forecasts (around $70.77), the P/E would still be only about 10.5x, which is quite reasonable for a chip giant at the core of AI computing infrastructure. The forward EV/EBITDA multiple is only about 8x, significantly lower than many hot AI-workflow software names such as Palantir, which trade at far higher multiples despite much slower free cash flow acceleration than Micron.

Super Cycle Reshapes DRAM/NAND; Is Micron Headed Toward $1,000?

Analyst Ben Reitzes and his team at Melius recently published a report stating that the AI boom will drive robust memory demand growth through the end of this decade (2030). According to Counterpoint Research data, the memory market has entered a "super bull market" or "super cycle" phase, with current supply-demand dynamics and prices far surpassing the historical peaks of the 2018 cloud computing boom. Micron's stock performance has been exceptionally strong, rising 6.5% on Monday to close at a record high of $795.33, with its market capitalization approaching $900 billion. Deutsche Bank significantly raised its price target to $1,000, implying about 26% upside from that close, and maintained a "Buy" rating.
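As a quick sanity check on the multiples quoted above (the price and EPS figures are the article's; the arithmetic is simply price divided by expected EPS):

```python
price = 746.00        # last Friday's close, per the article

eps = {
    "FY2026 consensus": 58.11,
    "FY2027 consensus": 101.78,
    "FY2027 low end":   70.77,
}

# Forward P/E = share price / expected earnings per share
for label, e in eps.items():
    print(f"{label}: {price / e:.1f}x")
# FY2026 consensus: 12.8x, FY2027 consensus: 7.3x, FY2027 low end: 10.5x

# Deutsche Bank's $1,000 target vs. Monday's $795.33 record close
upside = (1000 / 795.33 - 1) * 100
print(f"implied upside: {upside:.0f}%")   # about 26%
```

All four figures reproduce the article's numbers, so the multiples are internally consistent with the quoted price and EPS estimates.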
Factors driving the stock include AI-driven structural long-term memory shortages, potential labor unrest at competitor Samsung that could erode its DRAM/NAND market share, and Micron's recent launch of high-capacity enterprise SSDs (such as the 245TB Micron 6600 ION), which significantly enhance rack-level storage density and data center efficiency. On cash flow metrics, Micron has moved away from the typical profile of traditional cyclical memory giants: latest quarterly operating cash flow near $12 billion, adjusted free cash flow of $7 billion, and plans for a 30% dividend increase. As mentioned, the company has also achieved volume production of 1-gamma DRAM, G9 NAND, and HBM3E and remains confident in the HBM4 ramp.

The latest data from South Korean customs shows DRAM and NAND prices continuing to surge, with monthly increases as high as 63%, HBM prices up 165.5% year-over-year, and flash memory prices up over 350% year-over-year. Supply-demand imbalance and AI-driven ultra-high-capacity demand are creating a structural upturn for the memory market and significantly strengthening Micron's position as a core supplier. Meanwhile, global IT spending forecasts indicate that 2026 spending on IT hardware, software, and services will grow 13.5% year-over-year to $6.32 trillion. Data center systems' share of total IT spending has risen from 4.5% in 2012 to 12.5%, showing that AI-driven storage and computing investment is becoming a major engine of economic growth.
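The IT-spending figures above imply two simple derived numbers; the derivation below is ours, not the article's:

```python
spend_2026 = 6.32          # 2026 IT spending forecast, trillions of USD (from the article)
growth_yoy = 0.135         # 13.5% year-over-year growth (from the article)

# Implied 2025 base: the 2026 forecast divided by (1 + growth rate)
base_2025 = spend_2026 / (1 + growth_yoy)
print(f"implied 2025 IT spend: ${base_2025:.2f}T")   # about $5.57T

# Data center systems at 12.5% of total 2026 spend
dc_share = 0.125
print(f"data center systems, 2026: ${spend_2026 * dc_share * 1000:.0f}B")  # about $790B
```

In other words, the forecast embeds a data-center hardware market of roughly $790 billion in 2026, nearly triple the 4.5% share such systems held of total IT spending in 2012.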

Disclaimer: Investing involves risk, and this article is not investment advice. The content above should not be regarded as an offer, recommendation, or invitation to buy or sell any financial product, nor should any related discussions, comments, or posts by the author or other users be regarded as such. This article is for general reference only and does not take into account your personal investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy or completeness of the information; investors should conduct their own research and seek professional advice before investing.
