As demand for AI accelerators soars, Micron says high-bandwidth memory constraints are intensifying across the industry, disrupting smartphone and PC supply chains and accelerating multibillion-dollar manufacturing expansions in the US and Asia.
The global shortage of memory chips is worsening faster than expected, driven largely by explosive demand from artificial intelligence infrastructure, according to Micron Technology. The US-based chipmaker says the supply crunch will persist beyond 2026 as high-end semiconductors are increasingly diverted to power AI workloads.
“The shortage we are seeing is really unprecedented,” Micron Executive Vice President of Operations Manish Bhatia said following the company’s recent groundbreaking ceremony for a $100 billion manufacturing complex near Syracuse, New York. The company had earlier warned that memory supply would remain tight for years to come.
AI demand crowds out traditional markets
At the center of the shortage is high-bandwidth memory (HBM), a critical component used in AI accelerators developed by companies such as Nvidia. According to Micron, the rapid expansion of AI data centers is consuming a disproportionate share of global memory capacity, leaving limited supply for conventional products such as smartphones and personal computers.
Bhatia noted that manufacturers of PCs and smartphones are already attempting to secure long-term memory supplies beyond 2026. The impact is beginning to ripple across consumer electronics. Media reports in China indicate that major smartphone brands are cutting shipment targets for 2026 as rising memory costs squeeze margins, while industry trackers have warned of a potential decline in global smartphone shipments next year.
PC makers have also cautioned that prolonged memory shortages could affect production plans, adding further strain to an already fragile hardware supply chain.
Capacity sold out as chipmakers accelerate expansion
The three dominant memory manufacturers—Micron, Samsung Electronics and SK Hynix—have all benefited from the AI-driven surge, with strong demand pushing capacity utilization to its limits. SK Hynix has disclosed that its entire chip output for 2026 is already sold out, while Micron has said its AI-focused memory products are fully booked for the current year.
To prioritize large enterprise and AI customers, Micron has scaled back parts of its consumer memory business and is fast-tracking capacity expansion. The company recently announced a $1.8 billion investment in an existing facility in Taiwan, allowing it to bring new DRAM production online faster, with meaningful output expected from the second half of 2027.
Meanwhile, Micron’s long-term US expansion includes four massive DRAM fabrication plants in New York and additional fabs in Idaho. These projects support the company’s goal of manufacturing 40% of its DRAM in the US, backed by federal incentives under the Chips Act and expanded tax credits.