SK Hynix sees potential to double its market value amid AI chip demand

Thu 11 Jan 2024

Image Credit: Reuters

South Korean chipmaker SK Hynix sees potential for its market value to double within three years to £119.2 billion ($152 billion) on the strength of its AI memory chips.

Reuters reported that SK Hynix CEO Kwak Noh-Jung said the significance of memory will increase with the growing prevalence of Generative AI. He added that as AI systems evolve, customer demands for memory will diversify.

“If we prepare the products we are currently producing well, pay attention to maximising investment efficiency, and maintaining financial soundness, I think we can attempt to double the current market capitalisation of 100 trillion won to 200 trillion won within three years,” said Kwak.

Kwak added that SK Hynix has consolidated its internal HBM capabilities to stay ahead. This follows rivals Samsung Electronics and Micron developing their own versions of the next-generation chip, HBM3E.

HBM chips are a type of computer memory designed to offer high bandwidth and low power consumption.

SK Hynix produced the first HBM memory chip in 2013. Its HBM3 chips facilitate high-speed computation by efficiently handling and feeding more data into Generative AI chips, and have been adopted by NVIDIA.

Regarding the potential end of SK Hynix’s production cuts, Kwak said adjustments are anticipated in the first quarter for DRAM chips used in tech devices. For NAND flash chips used in data storage, however, the chipmaker intends to assess and respond to market conditions after mid-year.

Earlier this month, SK Hynix said it aims to raise £791.7 million ($1 billion) in a bond deal. The issue amount has not been finalised and will be decided later, depending on market conditions.

From Q1 through Q3, the semiconductor firm reported combined operating losses of £4.87 billion ($6.19 billion), attributed to the slowdown in demand for commodity chips used in computers and smartphones. SK Hynix is expected to redirect its resources to maintain leadership in high bandwidth memory (HBM) chips used in generative AI.

In November, NVIDIA unveiled the NVIDIA HGX H200, a GPU based on its Hopper architecture. The H200 is notable for being the first GPU to incorporate HBM3e, offering faster and larger memory capabilities.
