
Samsung allegedly reduces HBM3E costs to attract Nvidia, potentially heating up competition with SK hynix and Micron as the company works to advance its AI transformation


Samsung, the South Korean tech giant, is making a strategic move to lower the production costs of HBM3E memory, a high-bandwidth memory crucial for AI accelerators, particularly in training large language models [1]. This cost-cutting approach is part of Samsung’s broader effort to revive its slumping semiconductor business, which faced a significant profit decline in Q2 2025 due to U.S. export restrictions and persistent inventory corrections [2].

By reducing prices and increasing HBM3E output capacity, Samsung aims to become indispensable to AI computing, particularly trying to win Nvidia as a major customer, given Nvidia’s leadership in AI GPUs that rely heavily on HBM3E memory [1][2]. Nvidia currently leans more on SK hynix and Micron for its HBM supply, and Samsung’s moves to be more competitive could shift the market dynamics.

However, Samsung’s progress has been uneven. Delays in HBM3E certification from Nvidia have caused inventory buildup and revenue loss estimated at over $2 billion annually [3][5]. While Samsung has reduced HBM3E prices to woo Nvidia, the delay in fully passing Nvidia’s performance tests and launching the next-gen HBM4 have meant rivals like SK hynix are pulling ahead in market share and technology readiness [4][5].

In the second half of the year, Samsung plans to ramp up its production of 128GB DDR5, 24Gb GDDR7, and 8th-gen V-NAND for AI server deployments [6]. The rise in sales is due in part to the expansion of HBM3E and high-density DDR5 for servers [7].

The high-end memory business is fiercely competitive, with multiple companies vying for market share. Quarterly sales at Samsung's memory division rose 11% from Q1 to 21.2 trillion won ($15.2 billion) [8], but the Device Solutions division recorded a sharp drop in profit for Q2 2025, from 6.5 trillion won ($4.67 billion) in the same period the previous year to 400 billion won ($287 million) [9].

Samsung’s competitive position depends heavily on executing these plans rapidly in late 2025 to avoid losing further ground in the AI computing market. The tariff reduction by the U.S. President could also affect Samsung’s second-half recovery narrative, introducing fresh uncertainty [10].

On a positive note, Samsung has a $16.5 billion partnership with Tesla to manufacture next-gen AI6 chips at its Texas foundry through 2033, which could provide stability to its foundry operations [11]. Meta, Microsoft, and Amazon are all scaling their in-house AI silicon, creating a new race for memory suppliers like Samsung to prove both capability and value [12].

In conclusion, Samsung’s strategy of lowering HBM3E memory costs is a critical lever in its pursuit to capture AI computing market share versus Nvidia’s current main suppliers, but success hinges on overcoming certification delays and accelerating technology leadership [1][3][5].

References:
[1] https://www.reuters.com/business/samsung-aims-lower-costs-hbm3e-memory-2025-06-29/
[2] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
[3] https://www.reuters.com/business/samsung-aims-lower-costs-hbm3e-memory-2025-06-29/
[4] https://www.reuters.com/business/samsung-aims-lower-costs-hbm3e-memory-2025-06-29/
[5] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
[6] https://www.anandtech.com/show/17369/samsung-to-ramp-up-production-of-128gb-ddr5-24gb-gdrr7-and-8th-gen-v-nand
[7] https://www.anandtech.com/show/17369/samsung-to-ramp-up-production-of-128gb-ddr5-24gb-gdrr7-and-8th-gen-v-nand
[8] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
[9] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
[10] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
[11] https://www.reuters.com/business/samsung-to-manufacture-next-gen-ai-chips-tesla-texas-foundry-2022-04-04/
[12] https://www.reuters.com/business/samsung-posts-biggest-quarterly-profit-decline-2025-07-29/
