A breakthrough algorithm from Alphabet that enhances AI inference efficiency is driving a structural divergence within the memory chip sector: flash memory stocks remain under pressure, while names tied to high-bandwidth memory (HBM) show relative stability.
The U.S. memory chip sector faced pressure for two consecutive trading days, with Western Digital (which owns the SanDisk brand) plunging 11% at Thursday's close. On Friday, the sell-off spread to Asia-Pacific markets, with South Korean memory stocks extending declines. SK Hynix fell over 5% intraday, while Samsung Electronics dropped more than 4%.
However, thanks to HBM's critical role in AI training, Samsung Electronics and SK Hynix, which supply high-bandwidth memory for NVIDIA's AI accelerators, quickly stabilized. Samsung nearly recovered all of its losses, while SK Hynix narrowed its decline to 1%, showing greater resilience than flash memory manufacturers. In contrast, stocks such as Kioxia, which had surged more than 600% in previous months, continued to trend lower.
Morgan Stanley analyst Tiffany Yeh noted in a research report that Alphabet's "TurboQuant" technology significantly improves AI inference efficiency by compressing memory usage and data movement, but "does not diminish demand for core memory chips like HBM." The market is gradually realizing that this technology poses a much greater threat to flash memory companies than to the HBM sector.
Flash memory stocks bore the initial impact, giving back substantial previous gains. Driven by artificial intelligence adoption expectations, flash memory and storage product manufacturers had attracted massive investor inflows in recent months. Since late August, Western Digital had surged over 1000%, while Kioxia rose more than 600%, significantly outperforming traditional memory giants like Samsung Electronics, SK Hynix, and Micron Technology.
However, market sentiment reversed this week. Investors began selling these stocks after recognizing the implications of Alphabet's technological breakthrough. The company announced that its TurboQuant algorithm can compress memory requirements for certain large language model operations by at least a factor of six, significantly reducing overall AI operating costs. Markets worry this could reduce storage procurement demand from hyperscale data center operators like Meta, in turn dragging down memory chip prices in smartphones and consumer electronics.
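To put that sixfold figure in perspective, a rough back-of-the-envelope calculation shows what such compression means for memory footprints. TurboQuant's internals have not been disclosed, so the sketch below is purely illustrative: the 70-billion-parameter model size and the 16-bit baseline precision are assumptions chosen only to make the arithmetic concrete.

```python
def weight_memory_gb(n_params: float, bits_per_weight: float) -> float:
    """Memory (in GB) needed to hold n_params model weights at a given precision."""
    return n_params * bits_per_weight / 8 / 1e9

# Hypothetical 70B-parameter model (illustrative assumption, not from the article)
n_params = 70e9

baseline = weight_memory_gb(n_params, 16)       # 16-bit weights as a common baseline
compressed = weight_memory_gb(n_params, 16 / 6) # the claimed ~6x compression

print(f"16-bit weights:   {baseline:.0f} GB")   # 140 GB
print(f"~6x compressed:   {compressed:.1f} GB") # ~23.3 GB
```

If weights that once demanded well over a hundred gigabytes fit in a few tens of gigabytes, hyperscalers can serve the same models with less memory capacity per deployment, which is the mechanism behind the procurement concerns described above.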
Bloomberg Intelligence analyst Jake Silverman noted in a report: "Since model weights need to be stored in GPU memory, HBM demand and DRAM produced by companies like Micron will likely remain unaffected. In comparison, NAND flash memory demand faces more substantial long-term impact."
In contrast to the continued decline in flash memory stocks, HBM-related players demonstrated stronger resilience during this sell-off. Samsung Electronics recovered all of its losses by Friday, while SK Hynix's stock price essentially returned to pre-sell-off levels.
Analysts believe clear logic supports this divergence. During AI large language model training phases, GPU demand for HBM remains highly concentrated, while TurboQuant optimizes memory efficiency during inference without affecting core HBM demand for training. Samsung and SK Hynix had previously become market favorites during the initial AI investment wave due to their HBM products, and this advantageous position hasn't been shaken by the recent algorithm breakthrough.
It's worth noting that this memory chip sell-off occurred against a macroeconomic backdrop in which overall technology stock valuations face scrutiny. Inflation concerns stemming from Middle East tensions have made markets cautious about high-valuation stocks, leaving investors highly sensitive to news flow and profit-taking liable to be triggered at any time.
SGMC Capital Chief Investment Officer Ed Gomes stated that hardware demand driving AI technology implementation represents a long-term structural theme that "will continue evolving over years or even decades, not days or weeks." He considers the TurboQuant-related sell-off as "short-term noise creating good buying opportunities for quality stocks."
However, analysts also pointed out that as AI efficiency algorithms continue evolving, divergence within the memory chip sector may further intensify. Whether flash memory companies can regain their previous high-growth expectations remains to be seen.
Disclaimer: Investing carries risk. This is not financial advice. The above content should not be regarded as an offer, recommendation, or solicitation to acquire or dispose of any financial products, and any associated discussions, comments, or posts by the author or other users should not be considered as such either. It is for general information purposes only and does not take into account your investment objectives, financial situation, or needs. TTM assumes no responsibility or warranty for the accuracy and completeness of the information; investors should do their own research and may seek professional advice before investing.