Four Times the Year-End Target
To Be Provided to Industry, Academia, and Research Starting Next Year
Kakao announced on December 29 that it has completed the early deployment of 2,040 of the 2,424 NVIDIA B200 graphics processing units (GPUs) it secured through the government’s GPU procurement project, installing them at its own data center.
In August, Kakao was selected as the final operator for the government’s “GPU Procurement Project,” a national initiative that supplies the private sector with key infrastructure toward the goal of becoming one of the world’s top three AI powers. Since then, Kakao has rapidly built out the latest GPU infrastructure and is now actively supporting domestic AI research and development. The company has secured a total of 2,424 NVIDIA B200 GPUs, which it will operate on a commissioned basis for five years to support the domestic AI R&D environment.
Kakao is building the large-scale GPU infrastructure at the “Kakao Data Center Ansan” in Ansan, Gyeonggi Province. Leveraging its own data center management capabilities and its experience building GPU clusters, the company has deployed significantly faster than initially planned. Kakao has now completed the deployment of 255 nodes and 2,040 GPUs, about 84% of its total allocation and roughly four times the original year-end target of 64 nodes.
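The reported figures line up, assuming 8 GPUs per node, which the 255-node / 2,040-GPU count implies (an inference from the article’s numbers, not an official specification); the short check below works through the arithmetic.

```python
# Quick check of the reported figures. Assumes 8 GPUs per node,
# which the 255-node / 2,040-GPU count implies (not an official spec).
gpus_per_node = 8
deployed_nodes = 255
deployed_gpus = deployed_nodes * gpus_per_node       # 2,040 GPUs
total_gpus = 2424
share = deployed_gpus / total_gpus                   # ~0.84 -> about 84%
year_end_target_nodes = 64
vs_target = deployed_nodes / year_end_target_nodes   # ~3.98 -> roughly 4x

print(deployed_gpus, f"{share:.0%}", f"{vs_target:.1f}x")
# 2040 84% 4.0x
```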
The advanced infrastructure at the Ansan data center is a key factor behind the early deployment. Kakao strengthened project management across the entire process, from GPU procurement to deployment and operational preparation, and secured core equipment early through cooperation with suppliers. The company also ran an advance technical verification (proof of concept, PoC) to identify potential risks before entering actual operation.
Kakao also proactively built out the power and cooling infrastructure needed to operate high-density GPU servers. At the Ansan data center, it implemented a hot aisle containment system, which encloses the hot exhaust air from high-performance GPU servers so it does not mix with cooled supply air, improving cooling efficiency.
Kakao is providing not only hardware infrastructure but also the software environment needed for AI model development. Through integration with the National AI Computing Resource Support Portal, users can easily access Kakao Cloud, operated by Kakao Enterprise, where Kakao provides the AI platform “Kubeflow.”
Kubeflow is a platform that supports the full machine learning workflow, from development and training to deployment and inference, in a cloud-native environment built on Kubernetes. It helps researchers automate workflows and use cloud resources efficiently.
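As an illustration of what this looks like for a researcher, the sketch below defines a minimal pipeline with the open-source Kubeflow Pipelines (kfp v2) SDK; the component logic, container image, and accelerator settings are illustrative assumptions, not details of Kakao’s actual service.

```python
# Minimal Kubeflow Pipelines sketch using the open-source kfp v2 SDK.
# The component body, base image, and accelerator settings are
# illustrative assumptions, not details of Kakao's service.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def train_model(epochs: int) -> str:
    """Placeholder training step; real code would load data and train a model."""
    return f"trained for {epochs} epochs"


@dsl.pipeline(name="demo-training-pipeline")
def demo_pipeline(epochs: int = 3):
    # Each decorated function runs as a containerized step on Kubernetes.
    task = train_model(epochs=epochs)
    # Request one GPU for the step; the resource label is cluster-dependent.
    task.set_accelerator_type("nvidia.com/gpu").set_accelerator_limit(1)


if __name__ == "__main__":
    # Compile to a pipeline spec that can be uploaded and run through the
    # Kubeflow Pipelines UI or the kfp client.
    compiler.Compiler().compile(demo_pipeline, "demo_pipeline.yaml")
```

In a typical setup, the compiled demo_pipeline.yaml would then be uploaded through the Kubeflow Pipelines interface or submitted with the kfp client to schedule the run on the cluster’s GPUs.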
Kakao is currently running network and performance tests on the 255 deployed nodes. Starting January 2 of next year, it will provide the latest computing resources to the industry, academia, and research projects selected through a beta service open call held by the Ministry of Science and ICT and the National IT Industry Promotion Agency.
Kim Sewoong, Kakao’s AI Synergy Performance Leader, said, “Stable deployment and operation of large-scale GPU infrastructure is essential to AI competitiveness. Based on Kakao’s data center and cloud capabilities, we will provide a stable and efficient AI development environment and contribute to the growth of the domestic AI ecosystem.”
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.


