Micron Is Spending $200 Billion to Break the AI Memory Bottleneck

Dow Jones

Each afternoon at around 4:30, the earth here shakes from a series of controlled explosions, as engineers blast through basalt bedrock to flatten out the ground underneath a gigantic new semiconductor factory.

Micron Technology is the largest American maker of memory chips -- the tiny slices of silicon that store and transfer data and help power everything from smartphones and car computers to laptops and data centers. Micron is rushing to add manufacturing capacity to avert the biggest supply crunch the memory industry has seen in more than 40 years.

In Boise, where the company is based, Micron is spending $50 billion to more than double the size of its 450-acre campus, including the construction of two new chip factories, or fabs. The first fab's inaugural silicon wafers are expected to roll off the factory line in mid-2027, producing DRAM, the type of memory used to build the high-bandwidth memory chips, or HBM, that are increasingly essential to advanced artificial-intelligence computing. Both plants should be in production by the end of 2028.

Each fab will be 600,000 square feet -- the size of more than 10 football fields -- making them some of the biggest "clean rooms" ever built in America. To prepare the site, engineers have already set off more than 7 million pounds of dynamite. An army of construction workers, building contractors and architects has set up a small city's worth of trailers so they can work around the clock.

Each Boise fab is expected to use 70,000 tons of steel (almost as much as was used to build the Golden Gate Bridge) and 300,000 cubic yards of concrete (enough for four Empire State Buildings).

That's not all. Near Syracuse, Micron just broke ground on a $100 billion fab complex that represents the state of New York's largest-ever private investment. Late last year, Micron announced a $9.6 billion fab investment in Hiroshima, Japan, while competitor SK Hynix announced in January that it would build a $13 billion fab in South Korea, in addition to a $4 billion manufacturing complex it is building in Indiana.

Behind the frenetic manufacturing arms race is the AI boom. As large language models have become increasingly complex and firms such as OpenAI, Oracle, xAI and Anthropic have announced lofty plans to build trillions of dollars worth of data centers, demand has far outpaced capacity in the memory-chip market.

That is because the processors designed by companies including Nvidia, Google, Broadcom and Advanced Micro Devices require more and faster memory chips for both model training and inference, or the process of responding to queries.

"I've been here for 28 years, and I've never seen anything so disruptive as AI," said Scott Gatzemeier, the Micron vice president who is heading the company's $200 billion U.S. expansion. "As we started to transfer from training to inference, the amount of data required just exploded, and we just didn't have enough clean-room capacity to satisfy demand. We realized we had a huge problem."

The shortages have resulted in a gold rush for memory-chip manufacturers like Micron and its two biggest competitors, SK Hynix and Samsung. Since April last year, Micron's share price has risen more than sixfold, to around $414, making the company worth nearly half a trillion dollars.

As the company has shifted away from simpler memory chips that go in mobile and other devices and toward more-profitable products such as data-center HBM chips, its gross margins have shot up as well, from 18.5% in early 2024 to 56% in its most recent quarterly report. Micron said it expects gross margins in the current quarter to hit 68% -- approaching the more than 73% that Nvidia earns on sales of its flagship products, known as graphics processing units, or GPUs.

The steep profits are a new phenomenon. For decades, memory chips have been considered a commodity product, cheaper and easier to make than the advanced AI chips designed by other chip firms.

"Our business is on an extraordinary trajectory," said Mark Murphy, Micron's chief financial officer, at an investor conference on Wednesday.

Murphy said that Micron is currently able to meet about one-half to two-thirds of demand for some key customers. Increasingly, buyers are approaching the company seeking to lock in multiyear purchasing contracts to ensure supply and attempt to avoid dramatic price increases.

"On the supply side, we are doing everything we can to add capacity," he said. "But there is no easy or fast way to get that done."

Historically, companies like Micron have been highly vulnerable to boom-and-bust cycles. Supply-chain shocks and economic downturns could send every memory-maker into a tailspin.

After sales of devices like PCs, tablets and smartphones surged during the Covid-19 pandemic, rising interest rates and inflation caused a pullback in purchases by businesses and consumers, leaving tech companies with warehouses full of unused memory chips. Shares of Micron and its top competitors fell sharply in 2022, wiping out tens of billions in value.

In response, memory-makers cut production sharply to stabilize prices. Starting in 2024, however, the rapid rise of AI caused a surge in demand for memory. Taiwan's Commercial Times newspaper reported in November that contract prices for DRAM chips had risen by more than 170% over the previous year alone.

Circular Technology, a reseller of data-center hardware based in Southborough, Mass., says that prices for DDR5 chips -- another type of memory, typically attached to the CPUs that power AI data centers -- have risen nearly 500% since September, a sign that the shortages have spread through the memory market well beyond HBM chips.

"We're nowhere near the end of the shortage," said Brad Gastwirth, Circular's head of global research. "I think it lasts through the end of 2026 and at least the first half of 2027."

Between August and October, as AI model developers, cloud-services companies and so-called hyperscalers announced project after enormous project, Micron started noticing that HBM demand was going through the roof, said Sumit Sadana, the company's chief business officer.

The company was already hard at work building ID1 (short for "Idaho 1"), its first of two enormous new fabs in Boise. Because of the stepped-up pace of data-center construction, Micron decided to accelerate work on ID2, the second fab.

"Memory has gone from being a system component to being a strategic asset," Sadana said. "The promise of AI is all ahead of us."

Despite the surging demand, Micron, which holds a much smaller market share than SK Hynix, remains highly sensitive to perceived shifts in the competitive landscape.

In early February, Micron's share price briefly took a hit after SemiAnalysis, an influential chips-industry research shop, reported that Micron's latest AI memory chips, known as HBM4, had failed to win a spot as a supplier for Nvidia's newest generation of AI servers, which launch this year and are known as Vera Rubin.

The reports suggested that Nvidia was unhappy with the speed at which Micron's chips were able to transmit data and had decided to go with other suppliers instead. Nvidia declined to comment.

Sadana said that the reports were inaccurate and that Micron is already making customer shipments of HBM4 and expects to ship more next quarter. Supplies of HBM4 and HBM3e, the previous generation of high-bandwidth memory chips, are sold out through the end of this year, he added.
