A Former Nvidia Executive's Challenge: "AI Will Flow Like Electricity and Water" (Comprehensive)

Baro Space Unveiled in Pyeongtaek, Gyeonggi Province
Standardized Around Modular Cell Units for Enhanced Scalability
Targeting Universities and Research Institutions, with Plans for Overseas Expansion

Lee Yongdeok, CEO of Baro AI, who served as the head of Nvidia Korea for 13 years, has established a compact, modular artificial intelligence (AI) computing center designed for universities and research labs.

Lee Yongdeok, CEO of Baro AI, speaks about the importance of AI infrastructure at the opening ceremony of "Baro Space," held on the 29th in Pyeongtaek, Gyeonggi Province. Photo by Kim Bokyung

On the 29th, Baro AI, a company specializing in AI infrastructure, officially opened its urban-type compact data center "Baro Space" in Pyeongtaek, Gyeonggi Province, unveiling an AI computing center equipped with 400 GPUs and utilizing a liquid cooling system.


Baro AI plans to use Baro Space for its GPUaaS (GPU as a Service) business and to showcase its proprietary HACC (Hybrid Modular AI Computing Center) technology.


CEO Lee stated, "HACC is shared infrastructure for the AI era, like the electric grid," adding, "We aim to provide infrastructure that lets anyone start using AI immediately."


The newly unveiled AI computing center can be installed flexibly, close to where demand is, in small units called cells. Because each 400-GPU cell is standardized, cells can be replicated and expanded like Lego blocks at university campuses, knowledge industry centers, research institutes, and elsewhere.


Unlike hyperscale data centers, which are restricted by location and require significant costs and time to establish, this solution is expected to enable rapid installation and operation in AI research sites or urban environments.


CEO Lee explained, "It costs about 10 billion to 30 billion won to build one cell equipped with 400 GPUs, including servers, cooling, power, software, and construction. If GPU demand is lower, it is possible to start with less than 10 billion won by installing only the necessary number of servers."


HACC primarily targets universities, research institutions, companies developing AI models, local governments, and public agencies. CEO Lee said, "One cell runs on 250 kW of power, which is well within the capacity of the electrical infrastructure already available in standard buildings and knowledge industry centers."


Additionally, by applying a liquid cooling system, power consumption can be reduced by 30 to 35 percent, and GPU temperatures can be maintained at 50 to 60 degrees Celsius. Even at full load, the noise level is around 39 dB, making it quieter than a library.
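The per-cell figures above can be sanity-checked with a little arithmetic. The sketch below uses only the numbers quoted in the article (400 GPUs, 250 kW, 30 to 35 percent savings); the implied air-cooled baseline is a hypothetical inference for illustration, not a figure the company provided.

```python
# Back-of-envelope check of the figures quoted in the article.
# Inputs are from the article; the air-cooled baseline is inferred.

CELL_GPUS = 400          # GPUs per standardized cell
CELL_POWER_KW = 250.0    # stated power draw of one cell

# Average power budget per GPU, including server, network, and cooling overhead
per_gpu_w = CELL_POWER_KW * 1000 / CELL_GPUS
print(f"Power budget per GPU: {per_gpu_w:.0f} W")  # 625 W

# If liquid cooling cuts consumption by 30-35%, the equivalent
# air-cooled cell (hypothetical baseline) would draw:
for saving in (0.30, 0.35):
    baseline_kw = CELL_POWER_KW / (1 - saving)
    print(f"{saving:.0%} saving implies ~{baseline_kw:.0f} kW air-cooled")
```

At roughly 625 W per GPU all-in, the 250 kW cell is consistent with racks of current-generation accelerators plus cooling overhead, which is why a standard building's electrical service can host one.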


CEO Lee emphasized, "HACC is a data center that can be installed inside urban buildings. It is not just a simple cooling technology, but a core technology that enables AI computing even in urban and edge environments."


Notably, Baro AI provides an integrated package that includes its self-designed GPU server (Poseidon), cluster management software (Baro Flex), and data center (Baro Space), allowing for rapid and stable responses in case of failures.


CEO Lee said, "The world we envision is one where AI flows as naturally as electricity or water. Our goal is to create a society where, whenever someone wants to use AI, they can access as much computing as needed with the turn of a tap, without complicated procedures."


Baro AI plans to expand its export-oriented HACC model to regions with weak energy infrastructure, such as Southeast Asia, the Middle East, and Central and South America. The company is pursuing AI infrastructure distribution projects linked to government aid funds, aiming to build a global sovereign AI network through the export of Korea's AI technology and infrastructure.


CEO Lee assessed, "The world is focused on hyperscale data centers, but given how quickly GPU generations are replaced and the risk of low utilization, concentrating all AI workloads in one central location is inefficient." He added, "Starting small, expanding with demand, and cutting operating costs through a high-efficiency liquid cooling design is a realistic alternative."


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

