Processing the Work of 500,000 Laptops Every Second... NHN: "Leading the Computing Market with Our AI Data Center"

Equipped with Top-Class Domestic Facilities Including NVIDIA H100
Sealed Airflow Environment a Strength in Cooling Server Heat
"Aiming to Provide an Environment Where Anyone Can Develop AI Services"

Exterior view of NHN Cloud's 'National AI Data Center' [Photo by NHN Cloud]

"Please sign the security pledge."


On the 21st, at NHN Cloud's 'National AI (Artificial Intelligence) Data Center' in Oryong-dong, Buk-gu, Gwangju, I signed two security pledge forms, attached security stickers over both the front and rear cameras of my mobile phone, and put on shoe covers before entering the two-story, 2,200㎡ building. Staff explained that security is taken this seriously because the facility houses infrastructure optimized for AI research and development.


Upon entering Computer Room 2 on the second floor, I was met by the considerable roar of the servers and understood why earplugs had been handed out in advance. The room offers infrastructure specialized for AI, with a total of 140 racks, each supplied with 15 kilowatts (kW). A power density of 15 kW per rack is enough to run nine air conditioners or 98 TVs simultaneously, a high level compared with the domestic data center average of 4.8 kW and the roughly 10 kW average for data centers scheduled to open between this year and next.
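As a rough sanity check, the appliance comparison implies per-unit power draws of about 1.7 kW per air conditioner and 150 W per TV; these figures are our own back-of-envelope reading, not numbers provided by NHN Cloud:

\[
\frac{15\,\text{kW}}{9\ \text{air conditioners}} \approx 1.67\,\text{kW per unit},
\qquad
\frac{15{,}000\,\text{W}}{98\ \text{TVs}} \approx 153\,\text{W per TV}
\]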


Computer Room 2, the core infrastructure of the National AI Data Center, is built with a ceiling height of 7.5 meters to allow the air that cools the servers to circulate smoothly. Cold air continuously flows in through the walls to cool the servers, while hot air rises and circulates to the upper part of the room. Because the cold and hot air streams are kept sealed off from each other, cooling is highly efficient. On the rooftop, five air-cooled pre-cooling chillers (air-conditioning units) use evaporative latent heat to lower the temperature of the air entering the data center.


Across the National AI Data Center and its Pangyo Data Center (NCC1), NHN Cloud owns about 1,000 NVIDIA H100 GPUs optimized for large language model (LLM) development, along with Graphcore and Sapeon accelerators, which is considered the largest such fleet in Korea. The AI GPU farm totals 99.5 petaflops (PF): 77.3 PF from NVIDIA, 11.2 PF from Graphcore, and 11 PF from Sapeon hardware.


The total computing power available for AI research and development at the National AI Data Center is 88.5 PF, with a storage capacity of approximately 107 petabytes (PB). The 88.5 PF of computing power means the center can perform, every second, as many calculations as about 500,000 general-purpose laptops combined, and the 107 PB of storage corresponds to 107,000 1-terabyte (TB) hard drives.
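Back-of-envelope, the laptop comparison implies a per-laptop throughput of roughly 177 GFLOPS, an assumption we infer from the figures rather than one NHN Cloud stated; the storage conversion uses 1 PB = 1,000 TB:

\[
\frac{88.5\,\text{PF}}{500{,}000\ \text{laptops}} = \frac{88.5\times10^{15}\,\text{FLOPS}}{5\times10^{5}} \approx 177\,\text{GFLOPS per laptop},
\qquad
107\,\text{PB} = 107{,}000\,\text{TB}
\]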


To prepare for emergencies such as power outages, the center has four uninterruptible power supplies (UPS) rated at 2,000 kW each, along with seismic design and lightning protection systems. Safety measures also include automatic fire detection and suppression facilities, a hotline to the fire station, an air-conditioning system, and dual communication lines that switch immediately to backup lines if a connection is lost.


Yoon Yong-su, NHN Cloud's technical leader, explained, "The National AI Data Center incorporates the experience and capabilities accumulated over 10 years of operating the existing Pangyo Data Center."


Kim Dong-hoon, CEO of NHN Cloud, presents the strategy at the NHN Cloud 2.0 press conference. [Photo by NHN Cloud]


NHN Cloud anticipates that computing will shift from CPU-based systems to GPU-based accelerated computing centered on generative AI, and it plans to lead that infrastructure market through the National AI Data Center. The company offers about 200 cloud services, works with 500 partner companies, and has secured 5,700 client companies.


In his strategy announcement, Kim Dong-hoon, CEO of NHN Cloud, said, "We expect LLM development services to diversify going forward, and we plan to continuously expand infrastructure-based services. Our goal is to provide an environment where anyone can easily develop AI services and to lead the GPU-based accelerated computing market."


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
