Google Cloud expressed confidence in Trillium, the sixth generation of its TPU (Tensor Processing Unit) AI chip, which is specialized for AI training and development. The chip improves efficiency in ultra-large-scale computation, and South Korean IT company Kakao is reportedly among the companies using it.
Google Cloud held a training session on Trillium on the 16th. At the event, Mohan Pitchika, Group Product Manager at Google Cloud, said, "TPUs power parts of Google services such as Search and YouTube, which serve billions of users," adding, "They were also used to train and serve Gemini 2.0, which boasts high performance and capability."
He continued, "(Trillium is) the most powerful TPU released to date," adding, "The maximum performance per chip has improved by 4.7 times, and thanks to this advancement, it is now possible to solve increasingly complex AI problems and deliver groundbreaking results."
TPUs are AI chips developed in-house by Google and are known for their specialization in training and inference of deep learning models. They differ in character from NVIDIA's GPUs (graphics processing units), which handle both AI computation and graphics workloads. Compared with the previous-generation TPU, Trillium is reported to deliver more than 4 times the training performance, up to 3 times the inference throughput, and a 67% improvement in energy efficiency.
Pitchika explained the difference between TPUs and GPUs: "GPUs originated in computer graphics and were designed for that purpose," he said. "They are built for parallel processing and are well suited to handling large volumes of data. TPUs, by contrast, are specialized for the tensor computations at the heart of AI models and are well suited to tasks such as training and inference."
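To make that distinction concrete, here is a minimal JAX sketch (illustrative only; it is not from the briefing, and the function name and tensor shapes are assumptions). The same code compiles via XLA for whatever accelerator is available, and the dense matrix multiply it performs is the kind of operation a TPU's matrix units are built to accelerate.

```python
# Illustrative sketch, not from the article: the same JAX program targets
# whatever accelerator backs the runtime (TPU, GPU, or CPU).
import jax
import jax.numpy as jnp

# On a Cloud TPU VM this lists TpuDevice entries; elsewhere, GPU or CPU devices.
print(jax.devices())

@jax.jit  # XLA compiles the function for the available accelerator
def dense_layer(x, w, b):
    # Matrix multiply plus bias and ReLU: the dense tensor math that
    # TPU matrix units (MXUs) are designed to accelerate.
    return jax.nn.relu(x @ w + b)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
x = jax.random.normal(k1, (8, 512))    # batch of 8 input vectors (assumed shapes)
w = jax.random.normal(k2, (512, 256))  # weight matrix
b = jnp.zeros((256,))                  # bias vector

print(dense_layer(x, w, b).shape)  # -> (8, 256)
```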
He also elaborated on AI Hypercomputer. "AI Hypercomputer is a powerful system designed to integrate all the core components of AI infrastructure," Pitchika emphasized, adding, "By optimizing AI workflows, it helps achieve remarkable results far faster and more efficiently than before."
Google Cloud also presented concrete examples of TPU adoption. Kakao adopted TPUs for large language model (LLM) training to develop its integrated AI brand 'Kanana.' Google Cloud said, "Kanana was initially developed on GPUs, and the challenge was to rapidly improve model performance with fewer resources," adding, "Using open models and Trillium, Kakao was able to quickly secure a model capable of high-level Korean language support."
Furthermore, he added, "Kakao has been working with Google since the third generation of TPU," and "Kakao developed its ultra-large language model KoGPT on TPUs, and through its image-generation AI Karlo it also explored how well TPUs handle multimodal datasets (processing images, video, audio, and other data together)."
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.


