By Yoolim Lee and Ian King
Samsung Electronics Co., after a series of setbacks in developing the type of memory chips crucial for the artificial intelligence market, is beginning to make progress in narrowing the gap with rival SK Hynix Inc.
Samsung has made important headway in its comeback, including winning the long-awaited approval from AI giant Nvidia Corp. for a version of its high-bandwidth memory chips called HBM3, according to people familiar with the matter. It also anticipates approval for the next generation, HBM3E, in two to four months, said the people, who asked not to be identified discussing internal developments.
The advances come after months of stumbles, including development missteps that allowed the smaller SK Hynix to jump out to a huge lead in the fast-growing sector. It’s unusual — and humbling — for South Korea’s largest company to be playing this kind of catchup. Historically, Samsung has led the way in the memory chip market, capitalizing on its scale and engineering expertise. As the company struggled in the HBM field, it took the highly unusual step of replacing the head of its semiconductor division in May.
“We’ve never seen Samsung in this position,” said Jim McGregor, an analyst at Tirias Research. “The industry and Nvidia more than anyone need Samsung, but they need Samsung to be firing on all cylinders.”
The company declined to comment on any specific partners but said, in general, it’s collaborating closely with customers and testing is proceeding smoothly.
Samsung’s latest accomplishments are likely to position the company to capitalize on the booming demand for AI products. The HBM market is projected to rise from $4 billion last year to $71 billion in 2027, according to Morgan Stanley. The faster Samsung gets the blessing of Nvidia, the leader in making AI accelerators, the more revenue it can reap from that increase.
“Investors’ perception on Samsung could change soon,” Morgan Stanley analysts Shawn Kim and Duan Liu wrote in a research report this month. “Things are improving quickly.”
The pair anointed Samsung their top stock pick in the report because they figure the company could grab incremental HBM market share of at least 10 per cent in 2025, adding about $4 billion in revenue. While it would still trail SK Hynix in the field, that progress could shift investor perceptions and lift shares.
Samsung is likely to face questions about its HBM strategy when it reports final second-quarter earnings on Wednesday. It’s not clear how much detail the company will provide.
While Samsung appears on track to secure Nvidia’s seal of approval by November, the company is still struggling to resolve certain issues, with unpredictable outcomes given the complexity of AI chips. There’s a chance its timeline will slip into 2025, the people said.
Samsung’s missteps came during an unusual period for the company. Executive Chairman Jay Y. Lee spent years battling prosecutors over bribery and corruption allegations, and, in the meantime, senior leaders didn’t see HBM as a priority. Indeed, the market was a rounding error until OpenAI unveiled ChatGPT in late 2022 and set off a frenzy of demand for the Nvidia chips used to train AI models.
While SK Hynix was ready for the surge, Samsung struggled with the complex engineering problems of the new chips. HBM is made up of a collection of DRAM chips stacked on top of each other, eight high in the most recent generation. Each layer generates substantial heat, and the stacks are packaged alongside Nvidia's graphics processing units, or GPUs, which can reach 100 degrees C on their own. The entire stack risks melting down without the proper heat dissipation and cooling materials.
“As you increase those layers, the challenge of developing a reasonable yield becomes more difficult,” said Jake Silverman, an analyst with Bloomberg Intelligence. “The issue is thermal: it runs hot because it’s stacked DRAM. It’s really close to the GPU, which runs even hotter.”
Samsung had trouble solving this so-called thermal coupling, according to one of the people, who asked not to be named discussing confidential work. In May, the company took dramatic action: It announced Kyung Kye-hyun, the head of the semiconductor division, would step aside and Jun Young-hyun would take his place.
Jun, who joined Samsung in 2000 and helped develop its DRAM and flash memory chips, quickly stepped up the pressure to find solutions. The 63-year-old convened a series of meetings to probe technical details and find the root cause of the problem. In one meeting that lasted hours without break, he lamented that HBM could be part of a broader problem, according to a person familiar with the matter.
Samsung risked falling behind not just in memory chip technology, but also in its urgency to innovate. To boost collaboration, Jun reorganized the team dedicated to HBM and appointed a new head.
Samsung uses a heat management strategy called thermal compression non-conductive film, or TC-NCF, to insulate each layer of DRAM. SK Hynix, on the other hand, pioneered an alternative known as mass reflow molded underfill, or MR-MUF, to improve heat dissipation and production yields.
Yet Samsung opted to stick with TC-NCF and improve it, rather than considering other approaches. A company spokesperson said TC-NCF is “well-proven technology” that it will use in future products.
Ultimately, the company modified the HBM design to address heating and power consumption issues, the people said. That led to the approval of HBM3 for Nvidia.
Samsung said that Jun, since taking over, has prioritized the company’s culture of collective discussion and persistence in solving problems. It added there has been “no issue related to heating and power consumption in our HBM products” and it has made “no design changes” for specific customers.
Samsung’s saving grace may be that much of AI’s growth lies ahead. Tech firms like Microsoft Corp., Google-parent Alphabet Inc., Amazon.com Inc., Apple Inc. and Meta Platforms Inc. are all investing huge sums to develop their capabilities.
Samsung has been producing HBM3 chips since the second half of last year, according to details from its quarterly reports. Companies like Google, which design their own chips, are expected to continue using HBM3 for much of this year. Samsung has started supplying HBM3 to Nvidia for its H20 chip, a product customized for China to meet US export controls.
As for HBM3E, the technology hit the market for the first time this year as Nvidia paired the SK Hynix chip with its own H200. Nvidia will keep using HBM3E with virtually all of its products through 2025 and chip rivals will stick with it even in 2026, Sanford C. Bernstein analysts said in a July report.
“Samsung is late, but the window of HBM3E will remain open for Samsung to catch up,” the analysts, led by Mark Li, wrote.
In a sign of its tardiness, Micron Technology Inc. announced earlier this year that Nvidia had approved its HBM3E chips for use with the company’s AI gear. Micron, which has historically trailed its Korean rivals in scale, is now claiming leadership in some areas of memory manufacturing and product introduction, a further sign of the erosion of Samsung’s dominance.
One significant advantage Samsung holds, however, is its financial resources and production capacity. Once it meets Nvidia's criteria for approval, it can ramp up output rapidly, addressing shortages that have held back Nvidia and other AI players.
“Micron and Hynix don’t have the capacity yet to support the entire market,” said Bloomberg Intelligence’s Silverman. Nvidia Chief Executive Officer Jensen Huang “wants to encourage them” because he needs more supply, he added.
SK Hynix isn’t letting up. It’s in the rare position of stealing the spotlight from its higher-profile rival — its shares have surged more than 150 per cent since the start of 2023, more than triple Samsung’s performance.
SK Hynix said last week it's accelerating production of HBM3E products to capture growth of more than 300 per cent. The company also said it plans to mass produce next-generation, 12-layer HBM3E chips this quarter and start supplying a customer in the fourth quarter, a likely indication that certification from Nvidia is imminent.
Under Jun’s leadership, Samsung is making progress. It developed its own 12-layer HBM3E technology and is working to get Nvidia’s approval for that generation of chips, as well as eight-layer HBM3E, an indication of how promising the company considers the market.
“This is a $71 billion revenue opportunity by 2027e (on our estimates), and growing, that did not exist two years ago,” the Morgan Stanley analysts wrote. “The key debate for Samsung is whether it can execute as a strong second source to Nvidia.”