AI chatbot services will boost chip demand, says SK hynix chief

SK hynix Vice Chairman Park Jung-ho speaks at a symposium hosted by Doheon Academy of Hallym University, in central Seoul, Wednesday. (SK hynix)
Proliferating chatbots powered by artificial intelligence will become "killer applications" that boost demand for advanced memory chips, SK hynix Vice Chairman Park Jung-ho said at a symposium Wednesday.

Park, who also serves as the chief executive officer of the world's second-largest memory chip maker, said memory chips will always be at the center of innovation as the era of artificial intelligence unfolds.

"As the era of AI unfolds, unsolved challenges of the past will be solved, and innovative products and services in the fields of autonomous vehicles, robots and biotechnology will emerge to change our lives completely," Park said while delivering the keynote speech at the "AI Digital and Semiconductor Education" symposium hosted by Doheon Academy of Hallym University.

Speaking about the introduction of ChatGPT developed by OpenAI, and how many big tech companies are "jumping in" to develop their own AI chatbot services, Park said the field is likely to become the new "killer application" and create greater demand for memory chips.

"As these AI services spring up and related technologies evolve, the need for data creation, data storage and processing capabilities will increase exponentially," Park said.

"Following the trend, HBM, the fastest DRAM, first developed by SK hynix, plays a major role in supporting technological development in the field of AI."

In the speech, Park highlighted the rising importance of pooled memory, and also suggested the idea of establishing "minimal fabs" to reinvigorate the chips industry. Minimal fabs are small-scale microtech and chip manufacturing systems that do not require a clean room.

SK hynix began mass-producing HBM3, its latest high bandwidth memory product, in June 2022. It has been used to operate supercomputers and advanced AI technology, and is known to be the fastest DRAM with the largest capacity. The company supplies its HBM3 products to Nvidia.



By Jo He-rim (herim@heraldcorp.com)