Nvidia supplier SK Hynix posts highest profit in 6 years on AI chip boom


A man walks past a logo of SK Hynix at the lobby of the company’s Bundang office in Seongnam on January 29, 2021.

Jung Yeon-Je | AFP | Getty Images

SK Hynix, one of the world’s largest memory chipmakers, said Thursday that second-quarter profit hit its highest level in six years as it maintained its leadership in the advanced memory chips critical for artificial intelligence computing.

Here are SK Hynix’s second-quarter results compared with LSEG SmartEstimate, which is weighted toward forecasts from analysts who are more consistently accurate:

  • Revenue: 16.42 trillion Korean won (about $11.86 billion), vs. 16.4 trillion Korean won
  • Operating profit: 5.47 trillion Korean won, vs. 5.4 trillion Korean won

Operating profit in the June quarter hit its highest level since the second quarter of 2018, rebounding from a loss of 2.88 trillion won in the same period a year ago.

Revenue from April to June rose 124.7% from the 7.3 trillion won logged a year ago, the highest quarterly revenue in the firm’s history, according to LSEG data available since 2009.

SK Hynix on Thursday said that a continuous rise in overall prices of its memory products — thanks to strong demand for AI memory including high-bandwidth memory — led to a 32% increase in revenue compared with the previous quarter.

The South Korean giant supplies high-bandwidth memory chips for AI processors made by companies such as Nvidia.

“We expect tight HBM and memory supply to persist until 2025 on a bottleneck in HBM production,” said SK Kim of Daiwa Capital Markets in a June 12 note.

“Accordingly, we expect a favourable price environment to continue and SK Hynix to record robust earnings in 2024-25, benefitting from its competitiveness in HBM for AI graphics processing unit and high-density enterprise SSD (eSSD) for AI-servers, leading to a rerating of the stock,” Kim said.

High-bandwidth memory chip supplies have been stretched thanks to explosive AI adoption fueled by large language models such as ChatGPT.

The AI boom is expected to keep supply of high-end memory chips tight this year, analysts have warned. SK Hynix and Micron said in May that their high-bandwidth memory chips for 2024 were sold out, and that stock for 2025 was also nearly gone.

Large language models require large amounts of high-performance memory, which allows them to retain details from past conversations and user preferences in order to generate humanlike responses.

SK Hynix has mostly led the high-bandwidth memory chip market, having been the sole supplier of HBM3 chips to Nvidia before rival Samsung reportedly cleared the tests for the use of its HBM3 chips in Nvidia processors for the Chinese market.

