
Nvidia supplier SK Hynix posts highest profit in 6 years on AI chip boom


A man walks past the logo of SK Hynix in the lobby of the company’s Bundang office in Seongnam on Jan. 29, 2021.

Jung Yeon-Je | AFP | Getty Images

SK Hynix, one of the world’s largest memory chipmakers, said Thursday its second-quarter profit hit a six-year high as it maintained its leadership in advanced memory chips crucial for artificial intelligence computing.

Here are SK Hynix’s Q2 results compared with LSEG SmartEstimates, which are weighted toward forecasts from analysts who are more consistently accurate:

  • Revenue: 16.42 trillion Korean won (about $11.86 billion), vs. 16.4 trillion Korean won
  • Operating profit: 5.47 trillion Korean won, vs. 5.4 trillion Korean won

Operating profit for the June quarter reached its highest level since the second quarter of 2018, recovering from a loss of 2.88 trillion won in the same period last year.

Revenue for the April-June quarter rose 124.7% from the 7.3 trillion won posted a year earlier, the highest quarterly revenue in the company’s history, according to LSEG data going back to 2009.

SK Hynix said on Thursday that the continued rise in overall memory product prices — driven by strong demand for AI memory including high-bandwidth memory — led to a 32% increase in revenue compared to the previous quarter.

The South Korean giant supplies high-bandwidth memory chips for AI chipsets to companies like Nvidia.

Shares of SK Hynix fell as much as 7.81% on Thursday morning.

The decline came as South Korea’s Kospi index lost as much as 1.91% after U.S. tech stocks sold off overnight, following disappointing Alphabet and Tesla earnings. The reports gave investors their first look at how large-cap companies performed in the second quarter.

“In the second half of the year, strong demand from AI servers is expected to continue as well as a gradual recovery in mainstream markets with the launch of AI-enabled PCs and mobile devices,” the company said on its earnings call on Thursday.

Taking advantage of strong AI demand, SK Hynix plans to “continue to lead the HBM market by mass producing 12-layer HBM3E products.”

The company will begin mass production of 12-layer HBM3E this quarter, after providing samples to major customers, and expects to deliver the chips to customers in the fourth quarter.

Limited supply

Memory leaders such as SK Hynix have been aggressively expanding HBM capacity to meet growing demand for AI processors.

HBM requires more wafer capacity than conventional dynamic random-access memory, a type of computer memory used to store data, which SK Hynix said is also in tight supply.

“Investment demand is also increasing to meet demand for conventional DRAM as well as HBM, which requires larger wafer capacity than conventional DRAM. Therefore, this year’s capex is expected to be higher than what we expected at the beginning of the year,” SK Hynix said.

“Although excess capacity is expected to increase next year due to increased industrial investment, a significant portion of it will be used to increase HBM production. Therefore, the current tight supply of conventional DRAM is likely to continue.”

In a June 12 note, SK Kim of Daiwa Capital Markets said he expects “HBM and memory supply to remain tight until 2025 due to bottlenecks in HBM production.”

“Accordingly, we expect the favorable pricing environment to continue and SK Hynix to post strong earnings in 2024-25, benefiting from the company’s competitiveness in HBM for AI graphics processors and high-density enterprise SSDs (eSSDs) for AI servers, leading to a re-pricing of the stock,” Kim said.

The supply of high-bandwidth memory chips has been squeezed by the explosive adoption of AI driven by large language models such as ChatGPT.


The AI boom is expected to keep the supply of high-end memory chips tight this year, analysts have warned. In May, SK Hynix and Micron said they had run out of high-bandwidth memory chips for 2024, while inventories for 2025 were also running low.

Large language models require a lot of high-performance memory chips, as such chips allow the models to remember details from previous conversations and user preferences in order to generate humanlike responses.

SK Hynix has largely led the high-bandwidth memory chip market, having been the sole supplier of HBM3 chips to Nvidia, ahead of rival Samsung, which is said to have passed tests for its HBM3 chips to be used in Nvidia processors for the Chinese market.

The company said it expects to ship the next generation of 12-layer HBM4 in the second half of 2025.
