Micron Technology has raised its fourth-quarter revenue and profit forecast, driven by surging demand for its high-bandwidth memory (HBM) chips used in artificial intelligence (AI) infrastructure.
The company now expects revenue of $11.2 billion ± $100 million, compared to its earlier projection of $10.7 billion ± $300 million. Adjusted earnings per share are forecast at $2.85 ± $0.07, up from the prior $2.50 ± $0.15. Micron also lifted its adjusted gross margin forecast to 44.5 percent ± 0.5 percent, from the earlier 42 percent ± 1 percent, citing improved pricing — particularly for DRAM products.
Demand for AI-focused memory chips has surged as tech giants expand investments in AI data centers. Analysts believe Micron’s margins could see further gains as it shifts toward higher-margin chips and benefits from rising consumer memory prices.
Analysts at Bank of America Global Research project that Micron could capture 20–25 percent of the global HBM market by the end of this year. SK Hynix currently leads with close to a 50 percent market share.
Micron itself expects its HBM market share to align with its broader DRAM market share by the second half of 2025.
The AI memory market is poised for rapid growth, with Nvidia supplier SK Hynix projecting annual expansion of about 30 percent through 2030, according to an earlier Reuters report. While U.S. tariffs of up to 100 percent on some chip imports may pose challenges, they will not affect companies manufacturing domestically or committing to U.S.-based production.
In June, Micron pledged an additional $30 billion investment in U.S. manufacturing, bringing its total commitment to $200 billion, a move that aligns with the Trump administration's push to strengthen domestic chip production.
AI chip market
A recent TrendForce report said that rising demand for AI servers is prompting major US cloud service providers (CSPs) to accelerate in-house ASIC development, with new versions arriving every one to two years.
In China, US export controls introduced in April 2025 are expected to cut the share of imported chips from Nvidia and AMD from 63 percent in 2024 to about 42 percent in 2025, while domestic suppliers like Huawei are projected to reach 40 percent market share, aided by strong government backing.
US CSPs are using ASICs to reduce reliance on third-party suppliers, control costs, and improve performance.
Google is leading with its TPU v6 Trillium and a new dual-sourcing strategy with MediaTek; AWS is advancing its Trainium v2 and v3 chips; Meta is developing MTIA v2 with Broadcom; and Microsoft is progressing on its Maia series with Marvell and GUC.
In China, Huawei is strengthening its Ascend chip lineup for domestic AI needs, and Cambricon is scaling its Siyuan chips after trials with major CSPs. Chinese CSPs like Alibaba, Baidu, and Tencent are also pushing their own AI chip designs.
TelecomLead.com News Desk