Nvidia’s AI Semiconductors to Feature Micron’s Latest Memory Chips


Micron Technology has begun mass production of its advanced memory chips, designed to power Nvidia’s latest artificial intelligence (AI) processors. The news, which sent Micron’s shares up more than 4% in early trading on Monday, marks a significant stride for the semiconductor industry. The chips, known as High Bandwidth Memory 3E (HBM3E), stand out for their energy efficiency, consuming 30% less power than comparable products on the market. That advantage positions Micron to capitalize on the burgeoning demand for semiconductors that power generative AI applications.

Nvidia plans to incorporate Micron’s HBM3E chips into its forthcoming H200 graphics processing units (GPUs). These next-generation GPUs, expected to begin shipping in the second quarter, are poised to surpass the performance of the existing H100 chips, which have driven a substantial revenue boost for Nvidia thanks to their pivotal role in AI applications.

The spotlight on high-bandwidth memory (HBM) chips, especially with Nvidia’s endorsement, has sparked optimism among investors about Micron’s prospects, as the HBM business could help the company offset the sluggish recovery in its other market segments. SK Hynix, Nvidia’s current supplier, leads the HBM market, but Micron’s entry with its HBM3E chips is set to intensify competition.


The profitability of HBM chips for Micron is partly attributed to the intricate manufacturing process they require. Micron has projected that its HBM product line will generate “several hundred million” dollars in revenue for the fiscal year 2024, with expectations for continued growth into 2025. This venture into HBM3E production not only underscores Micron’s technological prowess but also aligns with the industry’s shift towards more energy-efficient and high-performance computing solutions, particularly in the rapidly evolving field of artificial intelligence.

Staff Report
