Samsung, SK Ramp Up AI Semiconductor Development in Response to ChatGPT
Samsung Electronics and SK hynix are speeding up the development of next-generation semiconductor technology as the artificial intelligence (AI) era accelerates with the emergence of ChatGPT.
According to the industry on May 14, Samsung Electronics recently developed the industry's first 128GB DRAM supporting Compute Express Link (CXL) 2.0.
CXL is a next-generation interface for more efficiently utilizing the accelerators, DRAM, and storage devices used alongside central processing units (CPUs) in high-performance server systems. It consolidates several interfaces into one, enabling direct communication and memory sharing among devices.
The new product is the first in the industry to support memory pooling, a technology that lets server platforms group multiple CXL memory devices into a pool from which multiple hosts can draw memory as needed. This allows the full capacity of CXL memory to be used with no idle regions, making memory usage more efficient.
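The pooling idea described above can be sketched in a few lines of Python. This is purely an illustration of the concept, not Samsung's implementation or any real CXL API; all class and host names are hypothetical.

```python
# Illustrative sketch of CXL-style memory pooling: the capacity of several
# memory devices is combined into one pool, and multiple hosts borrow and
# return capacity on demand, so no region sits idle. Hypothetical names.

class CxlMemoryPool:
    def __init__(self, device_capacities_gb):
        # Combine all device capacities into a single shared free total.
        self.free_gb = sum(device_capacities_gb)
        self.allocations = {}  # host name -> GB currently allocated

    def allocate(self, host, size_gb):
        # A host draws capacity from the shared pool as needed.
        if size_gb > self.free_gb:
            raise MemoryError(f"pool has only {self.free_gb} GB free")
        self.free_gb -= size_gb
        self.allocations[host] = self.allocations.get(host, 0) + size_gb

    def release(self, host, size_gb):
        # Returned capacity is immediately available to any other host.
        self.allocations[host] -= size_gb
        self.free_gb += size_gb

# Three pooled 128GB CXL DRAM modules shared by two hosts.
pool = CxlMemoryPool([128, 128, 128])
pool.allocate("host-a", 200)  # host A can exceed any single module's size
pool.allocate("host-b", 100)
pool.release("host-a", 50)    # freed capacity returns to the shared pool
```

The key point the sketch shows is that a host's allocation is bounded by the pool's total free capacity, not by the size of any one memory module.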
Despite the explosive growth of data from metaverse, AI, and big data services in recent years, existing DDR interfaces limit the amount of DRAM that can be installed in a system. This has created an ongoing need for next-generation memory solutions, such as CXL DRAM, that can overcome that ceiling.
Samsung Electronics developed the world's first CXL DRAM based on CXL 1.1 in May last year, and within a year succeeded in developing a 128GB DRAM supporting CXL 2.0.
SK hynix also developed its first CXL memory sample in August last year, and in October of the same year unveiled its Computational Memory Solution platform, which integrates computational functions such as machine learning into that sample.