D2C Water-Cooling System Implementation and Immersion Cooling Model on Display
Featuring NVIDIA B200 and AI Server Racks
Autonomous Robot Monitors Data Center Temperature and Humidity
AI-Dedicated Network and Power Infrastructure Also Established
"At the KT Cloud AI Innovation Center, we can directly demonstrate and apply liquid cooling technology capable of rapidly cooling high-heat AI chips. This site is essentially a real-world demonstration environment, featuring servers and equipment that operate data center infrastructure and showcase our technological capabilities."
On the morning of December 11, at the KT Cloud AI Innovation Center in Yangcheon-gu, Seoul, stepping into the "infrastructure lab," which is outfitted with the same facilities as an actual data center, revealed an environment mirroring a real AI data center. The lab featured NVIDIA's latest graphics processing unit (GPU), the B200, AI server racks, direct-to-chip (D2C) liquid cooling devices, and AI network systems.
Inside the infrastructure lab, servers were operating with a noticeable level of noise. The server racks were equipped with NVIDIA B200 servers, and the infrastructure was arranged to replicate the structure and operation of a real data center. On one side of the center, a water-cooled load unit was running to dissipate heat generated by the GPUs. This unit absorbs the heat emitted by the GPUs, transfers it to the coolant, and then sends the coolant to a cooling tower or similar system to release the heat.
Nearby, an autonomous robot equipped with various sensors was stationed on a charging dock. This robot moves at scheduled intervals to measure temperature and humidity throughout the server room, automatically monitoring for any abnormal readings. During the demonstration, the autonomous robot used sensors attached to its robotic arm to measure temperature and humidity in every corner of the AI server racks.
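For illustration, the patrol logic of such a robot can be sketched in a few lines of Python. The thresholds, rack names, and simulated sensor below are assumptions for demonstration only; the article does not disclose KT Cloud's actual setpoints or monitoring software.

```python
import random
import time

# Hypothetical thresholds for illustration; KT Cloud's actual setpoints are not disclosed.
TEMP_MAX_C = 27.0               # assumed cold-aisle temperature ceiling
HUMIDITY_RANGE = (20.0, 60.0)   # assumed relative humidity band, in percent

def read_sensor(rack_id):
    """Stand-in for the robot's arm-mounted sensor; returns simulated (temp_C, humidity_%)."""
    return random.uniform(20.0, 30.0), random.uniform(15.0, 65.0)

def patrol(rack_ids, rounds=1, interval_s=0):
    """Visit each rack on a schedule, log readings, and flag anything out of range."""
    for _ in range(rounds):
        for rack_id in rack_ids:
            temp_c, rh = read_sensor(rack_id)
            status = "OK"
            if temp_c > TEMP_MAX_C or not (HUMIDITY_RANGE[0] <= rh <= HUMIDITY_RANGE[1]):
                status = "ALERT"
            print(f"{status}: rack {rack_id} temp={temp_c:.1f} C rh={rh:.1f}%")
        time.sleep(interval_s)

patrol(["A1", "A2", "B1", "B2"])
```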
Hyungman Heo, Head of KT Cloud Data Center Division, emphasized, "The newly unveiled AI Innovation Center is a technology demonstration hub that combines data center infrastructure with a showroom. It is designed not as a one-off, but to enable verification in real-world environments for the advancement of data centers and the AI industry." The center highlights cooling technologies that KT Cloud is currently commercializing and validating.
Particularly noteworthy was the D2C water-cooling system, which has been adopted as a standard by global AI server manufacturers. This technology attaches a cold plate directly to the GPU chip, allowing coolant to make direct contact with it. At the center, KT Cloud demonstrated the operation of a server-type D2C water-cooling system based on eight 1 kW modules. Last month, KT Cloud became the first in Korea to commercialize this technology at its newly opened Gasan AI Data Center.
To achieve this, KT Cloud developed both server-type and rack-type load units in-house and has validated conditions such as coolant flow rate, pressure, and temperature, tailored to the B200 GPU and NVL72 specifications. NVL72 refers to NVIDIA's rack-scale configuration that connects 72 GPUs over its high-speed "NVLink" interconnect.
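As a rough back-of-the-envelope illustration of why flow rate and temperature must be validated together, the sketch below estimates the coolant flow needed to carry away the roughly 8 kW implied by the eight 1 kW modules mentioned above. The 10-degree coolant temperature rise and water-like coolant properties are assumptions for illustration, not figures from KT Cloud.

```python
# Rough sizing sketch: coolant mass flow needed to carry away a given heat load.
# The 8 kW load follows the article's "eight 1 kW modules"; the temperature rise
# and water-like coolant properties are assumptions for illustration only.

Q_W = 8 * 1000.0          # total heat load of the demo load unit, in watts
CP_J_PER_KG_K = 4186.0    # specific heat of water (assumed coolant)
DENSITY_KG_PER_L = 0.997  # density of water near room temperature
DELTA_T_K = 10.0          # assumed coolant temperature rise across the cold plates

mass_flow_kg_s = Q_W / (CP_J_PER_KG_K * DELTA_T_K)           # from Q = m_dot * c_p * dT
volume_flow_L_min = mass_flow_kg_s / DENSITY_KG_PER_L * 60.0

print(f"mass flow   ~ {mass_flow_kg_s:.3f} kg/s")
print(f"volume flow ~ {volume_flow_L_min:.1f} L/min")
```

Under these assumptions, removing 8 kW takes roughly 0.19 kg/s of water, or about 11.5 L/min, which is why flow rate, pressure, and temperature have to be verified as a set for a given GPU specification.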
The center also showcased a working model of immersion cooling technology, where AI servers are directly submerged in a special liquid to dissipate heat. Heo explained, "Through proof-of-concept testing, we confirmed up to 60% energy savings during actual load tests."
Additionally, in collaboration with AI networking specialist Arista, the center has established an AI-dedicated network based on RoCEv2 (RDMA over Converged Ethernet v2). This network supports ultra-high-speed data transmission between AI servers and features next-generation Ethernet RDMA technology, which offers greater cost efficiency and scalability compared to the traditional InfiniBand high-speed network. RDMA (Remote Direct Memory Access) is a technology that reduces latency and increases speed by enabling direct memory-to-memory data transfers between servers, bypassing the central processing unit (CPU).
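To make the encapsulation concrete, the sketch below packs the 12-byte InfiniBand Base Transport Header that RoCEv2 carries inside a UDP datagram addressed to port 4791. It illustrates the wire format only, not a working RDMA transfer, which would require RDMA-capable network cards and a verbs library; the opcode and queue-pair values are arbitrary examples.

```python
import struct

ROCEV2_UDP_DST_PORT = 4791  # IANA-registered UDP destination port for RoCEv2

def build_bth(opcode, dest_qp, psn, pkey=0xFFFF):
    """Pack a 12-byte InfiniBand Base Transport Header (BTH), as carried in a RoCEv2 UDP payload."""
    se_m_pad_tver = 0   # solicited event / migration / pad count / transport version, all zero here
    ackreq_rsvd = 0     # ack-request bit plus reserved bits
    return struct.pack(
        ">BBHB3sB3s",
        opcode,                       # e.g. 0x04 = reliable-connection SEND-only
        se_m_pad_tver,
        pkey,                         # partition key
        0,                            # reserved byte
        dest_qp.to_bytes(3, "big"),   # destination queue pair (24 bits)
        ackreq_rsvd,
        psn.to_bytes(3, "big"),       # packet sequence number (24 bits)
    )

bth = build_bth(opcode=0x04, dest_qp=0x000123, psn=0x000001)
# In RoCEv2 this header rides inside UDP (dst port 4791) over IP over ordinary Ethernet.
print(len(bth), bth.hex())
```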
KT Cloud is also considering releasing the demonstration data obtained from the AI Innovation Center. Heo stated, "There is still a lack of process data on data center liquid cooling methods in the market," adding, "We are reviewing the possibility of making the data obtained through KT Cloud's research public." The aim is to share information on liquid cooling methods and contribute to the advancement of the entire industry.
KT Cloud CEO Choi Jiwoong said, "The AI Innovation Center is not just a showroom, but a core platform for demonstrating next-generation data center technologies and implementing customer value-centric AI infrastructure. KT Cloud will set a new standard for data center technology innovation and lead the growth of the domestic AI ecosystem."
Hyungman Heo, Head of KT Cloud Data Center Division, delivers opening remarks at the KT Cloud AI Innovation Center in Yangcheon-gu, Seoul, on the morning of December 11. Photo provided by KT Cloud
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.



