More Unveils AMD-Based Distributed Inference System at US "AI Infrastructure Summit"

Global IT Leaders Gather to Share Strategies
"Conducting Proof-of-Concept Projects with Multiple LLM Companies"

AI infrastructure solutions company More participated in the "AI Infrastructure Summit 2025," held at the Santa Clara Convention Center in the United States from September 9 to 11 (local time), where it showcased its AMD-based distributed inference system.

The AI Infrastructure Summit originated as the AI Hardware Summit and has since evolved into a specialized event focused on full-stack AI infrastructure. Major global IT companies from various fields, including semiconductors, systems, and hyperscalers, gather in one place to share the latest infrastructure technologies and strategies. This year, more than 3,500 participants and over 100 partner companies took part, engaging in discussions across four sectors: hardware and systems, enterprise AI, edge AI, and data centers.


Kangwon Cho, CEO of More, delivered a keynote presentation during the enterprise AI session, introducing More’s distributed inference system and presenting benchmark results showing that it optimizes cutting-edge deep learning models such as DeepSeek more efficiently than Nvidia-based systems. He also unveiled, for the first time, a next-generation AI semiconductor system that combines More’s software with Tenstorrent hardware, and proposed a range of alternatives with superior price competitiveness to Nvidia.


At the event, More held a joint presentation with SGLang, a leader in the deep learning inference software ecosystem, and organized a booth and networking events. Leveraging this opportunity, the company plans to further strengthen its collaboration with the global AI ecosystem, particularly in the North American market.


CEO Cho stated, “More possesses the strongest technical capabilities among AMD’s global software partners and is currently conducting proof-of-concept (PoC) projects with several leading large language model (LLM) companies. Moving forward, we will leverage close cooperation with AMD, Tenstorrent, and SGLang to become a global company that provides customers with a wide range of AI computing alternatives.”


More develops the core engine of AI infrastructure in-house and, through its subsidiary Motif Technologies, which focuses on foundation LLMs, has secured comprehensive technological capabilities that extend to the model domain.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

