Kioxia Soars 1200%: Storage Veteran Takes Helm to Ride AI Supercycle, Elevating eSSD to Core AI Infrastructure

Stock News
Jan 29

Following a staggering 1200% surge in its stock price, Japan-based global NAND flash leader Kioxia Holdings Corp. has appointed a seasoned storage industry veteran, Executive Vice President Hiroo Ota, as its new Chief Executive Officer and President. This strategic move aims to accelerate the company's market share expansion within the booming memory chip sector, precisely as NAND flash transitions from a cyclical commodity into a cornerstone component of AI computing infrastructure. Against the backdrop of seemingly "endless" storage demand driven by the torrent of AI inference computing, the enterprise-grade data center SSD (eSSD) segment, where Kioxia is a dominant force, has exploded into one of the hottest investment themes within the global AI computing narrative.

It is understood that the 63-year-old Ota will succeed the 70-year-old Nobuo Hayasaka, who will assume the role of senior executive advisor at Kioxia. Kioxia, a core storage chip supplier for Apple's iPhone, said in a Thursday announcement that the leadership transition will become official following shareholder approval at the annual general meeting scheduled for June. Throughout 2025 and into the start of 2026, memory chip stocks, particularly those of makers of high-end storage products, have been among the hottest investment themes in global equity markets. For instance, the stock of SanDisk (SNDK.US), a leader in enterprise-grade data center SSD components, has surged over 122% year-to-date in 2026, building on a monumental 580% gain across the entirety of 2025.

Despite the parabolic super-bull run in 2025 and its continuation into early 2026, global investors show little concern over the suddenly elevated valuations of these memory technology companies. This confidence stems from a belief that the unprecedented wave of AI data center construction is fundamentally altering the "strongly cyclical nature" of the memory chip industry. Whether it's Google's massive TPU AI computing clusters or vast arrays of Nvidia AI GPU clusters, all rely on fully integrated HBM memory systems paired with AI chips. Furthermore, the accelerated construction and expansion of AI data centers by tech giants necessitate massive purchases of server-grade DDR5 memory and high-performance enterprise-grade SSDs/HDDs. Memory leaders like Samsung Electronics, SK Hynix, Micron Technology, and Kioxia are strategically positioned across these three critical storage segments: HBM, server DRAM (including DDR5/LPDDR5X), and high-end data center SSDs/HDDs. These giants are the most direct beneficiaries within the "AI memory + storage stack," essentially reaping the "super dividends" of the AI infrastructure boom.

According to Wall Street giants like JPMorgan, the biggest winners of this unprecedented "memory super-cycle" are likely to be in the eSSD segment, dominated by Samsung, SK Hynix, Kioxia, and SanDisk. JPMorgan argues that the AI inference wave is liberating NAND flash from its fate as a "strongly cyclical commodity," transforming it into a high-growth AI infrastructure asset. The bank's analysis suggests that as AI workloads shift from training to inference, and with HDDs facing supply bottlenecks in near-line storage, NVMe eSSDs focused on the enterprise storage hot tier are experiencing unprecedented structural growth.

Kioxia's stock price has skyrocketed over 1200% since its IPO on the Japanese stock market in late 2024. Following the announcement of Ota's appointment, Kioxia's shares reversed earlier losses and climbed approximately 2% in Tokyo trading, underscoring the market's strong confidence in this storage industry veteran. Spun off from Toshiba Corp., where NAND flash technology was invented, Kioxia has emerged as a core beneficiary of the AI infrastructure building frenzy, which is driving up demand and prices for virtually all memory chips across the DRAM and NAND sectors. Its high-performance storage solutions, including enterprise-grade HDDs and eSSDs, are becoming increasingly critical to the globally accelerating build-out of AI data centers, elevating their status from mere commodities to indispensable necessities for any tech company aiming to win the AI race.

The incoming CEO, Hiroo Ota, has spent the majority of his career in the storage semiconductor industry, having joined Toshiba back in 1985. He previously held core engineering roles at Toshiba, the dominant storage powerhouse of the last century, whose memory business was later spun off as Kioxia. "Being the CEO of a NAND flash company is an extremely challenging job," noted Akira Minamikawa, an analyst at Omdia. Given the industry's global reputation for unpredictable boom-and-bust cycles, he remarked that timing long-term investments has always been critical, though the "cyclical" label appears to be fading during the current super-cycle. "Leaders need to be on the front lines, frequently traveling and deeply involved in sales, staying acutely sensitive to subtle shifts in customer sentiment," Minamikawa added. The company stated in a release that its board determined the present moment was opportune for a transition, as Kioxia marks its one-year IPO anniversary and seeks to accelerate growth amid the eSSD demand surge fueled by the AI computing frenzy.

The fundamental reason Kioxia stands as a core beneficiary of this "memory super-cycle" lies in the exabyte-scale data generation, retention, and retrievability demanded by the massive AI training/inference workloads in AI data centers. Both training and inference are continuously driving demand for high-capacity, low-TCO (cost per TB + power consumption), high-throughput, and low-latency storage—requirements perfectly addressed by Kioxia's enterprise NVMe SSDs. Kioxia's three primary enterprise storage product lines—the QLC high-capacity enterprise drive (LC9), PCIe 5.0 enterprise/data center NVMe (CM/CD series), and the low-latency SCM direction (XL-FLASH)—seem almost tailor-made for AI inference workloads.
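The "low-TCO" metric mentioned above combines acquisition cost per terabyte with power draw over the drive's service life. A minimal sketch of that calculation follows; the function name, prices, wattages, and electricity rate are all hypothetical placeholders for illustration, not vendor or Kioxia data.

```python
# Illustrative TCO-per-TB model: capital cost plus electricity over the
# service life. All figures below are hypothetical, not vendor data.

def tco_per_tb(price_per_tb, watts_per_tb, years=5, usd_per_kwh=0.10):
    """Purchase price plus energy cost over `years`, per terabyte."""
    energy_cost = watts_per_tb / 1000 * 24 * 365 * years * usd_per_kwh
    return price_per_tb + energy_cost

# Hypothetical comparison: a QLC eSSD tier vs. a nearline HDD tier
qlc_essd = tco_per_tb(price_per_tb=45.0, watts_per_tb=0.3)
nearline_hdd = tco_per_tb(price_per_tb=15.0, watts_per_tb=0.5)
print(f"QLC eSSD: ${qlc_essd:.2f}/TB, nearline HDD: ${nearline_hdd:.2f}/TB")
```

Under assumptions like these, flash carries a higher capital cost per TB but a much lower energy cost, which is why throughput- and power-sensitive AI workloads shift the comparison in eSSD's favor.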

JPMorgan forecasts a significant 40% year-on-year increase in the blended average selling price for the NAND industry in 2026. More crucially, this price rise is not expected to be short-lived, with the bank anticipating prices will remain near historic highs throughout 2027. A common misconception among investors is that the AI computing boom primarily benefits DRAM (especially HBM), with NAND playing a secondary role. However, JPMorgan dedicates considerable space in a research report to correct this view, asserting that in the era of AI inference, the importance of eSSD is no less than that of HBM. Looking ahead, JPMorgan predicts that the total storage capacity demand per AI server will exceed 70TB, at least triple that of a general-purpose server (around 20TB). By 2027, the bank expects eSSDs to account for 48% of global NAND bit demand, decisively surpassing the long-standing pillars of smartphones (30%) and PCs (22%) to become the largest demand driver for NAND.
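The cited figures can be sanity-checked with a quick back-of-the-envelope calculation; every number below comes from the article itself, and the script only verifies the arithmetic.

```python
# Sanity check on JPMorgan's cited figures: 70 TB per AI server vs.
# ~20 TB per general-purpose server, and eSSD at 48% of 2027 NAND bit
# demand vs. smartphones at 30% and PCs at 22%.

ai_server_tb = 70          # projected NAND capacity per AI server
general_server_tb = 20     # typical general-purpose server

ratio = ai_server_tb / general_server_tb
print(f"An AI server carries {ratio:.1f}x the storage of a general server")

nand_bit_share_2027 = {"eSSD": 48, "smartphones": 30, "PCs": 22}
total = sum(nand_bit_share_2027.values())
print(f"Top three segments cover {total}% of projected NAND bit demand")
```

The 3.5x ratio is consistent with the report's "at least triple" claim, and the three named segments account for essentially all of the projected 2027 bit demand.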

The most significant recent catalyst for eSSD undoubtedly comes from AI chip leader Nvidia's newly launched "Inference Context Memory Storage" (ICMS) platform for long-context inference. The platform serves as a major long-term growth catalyst for hot/warm-tier enterprise SSDs/NVMe as well as cold/warm data lakes (i.e., nearline HDD/object storage). In a technical blog, Nvidia explicitly positions ICMS as a new G3.5 tier: a pod-level "Ethernet-attached flash tier" specifically designed to host latency-sensitive, reusable inference context (KV cache). Nvidia emphasizes that it provides "petabyte-scale shared capacity" per GPU pod, using higher bandwidth and greater power efficiency to support KV cache reuse and prefetching, thereby reducing GPU stalls and improving tokens per second. In storage-beneficiary terms, the platform inserts a new high-bandwidth flash tier between local SSDs and shared storage dedicated to context (KV cache), thereby driving demand for more enterprise-grade SSDs/NVMe, including SSD pools in NVMe-oF form factors.
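The tiering idea behind such a platform can be sketched in a few lines: a small "GPU-local" cache evicts least-recently-used inference contexts to a larger shared flash pool instead of discarding them, so a returning session reloads its KV cache rather than recomputing the prefill. The class and method names below are illustrative inventions, not Nvidia's ICMS API.

```python
# Minimal two-tier KV-cache sketch: hot (GPU-local) tier with LRU
# eviction into a warm (shared flash) tier. Illustrative only.

from collections import OrderedDict

class TieredKVCache:
    def __init__(self, local_capacity):
        self.local = OrderedDict()   # hot tier: HBM / local SSD
        self.flash = {}              # warm tier: shared flash pool
        self.local_capacity = local_capacity

    def put(self, session_id, kv_blob):
        self.local[session_id] = kv_blob
        self.local.move_to_end(session_id)
        while len(self.local) > self.local_capacity:
            evicted_id, evicted_blob = self.local.popitem(last=False)
            self.flash[evicted_id] = evicted_blob   # offload, don't drop

    def get(self, session_id):
        if session_id in self.local:                # hot hit: no stall
            self.local.move_to_end(session_id)
            return self.local[session_id], "local"
        if session_id in self.flash:                # warm hit: reload,
            self.put(session_id, self.flash.pop(session_id))  # skip prefill
            return self.local[session_id], "flash"
        return None, "miss"                         # cold: full recompute
```

The "miss" branch is the expensive case the flash tier exists to avoid: without it, every evicted context forces a full prefill recomputation on the GPU.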

Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not take into account your own investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.
