
Google Cloud Partners with Hugging Face to Accelerate Generative AI Development

Google Cloud announced on the 26th that it has formed a new strategic partnership with Hugging Face, the world's largest machine learning platform.


Through the partnership, open-source developers will be able to use Google Cloud's AI-optimized infrastructure, including compute, TPUs (Tensor Processing Units, Google's custom AI accelerator chips), and GPUs (Graphics Processing Units), across all Hugging Face services. Hugging Face, in turn, plans to make its open-source platform available via Google Cloud.


Hugging Face aims to accelerate its vision of democratizing AI, while Google Cloud plans to strengthen its support for the open-source AI ecosystem. Google Cloud has been selected as Hugging Face's strategic cloud partner and its preferred provider for training and inference workloads.


Developers will also be able to build new generative AI applications on Google Cloud infrastructure. In addition, they can train and serve Hugging Face models with Vertex AI directly from the Hugging Face platform, developing generative AI apps on top of Google Cloud's purpose-built, end-to-end machine learning operations (MLOps) services.


Furthermore, with support for Google Kubernetes Engine (GKE) deployment, developers can train, tune, and serve workloads using infrastructure configured by Hugging Face, and scale models using Hugging Face-specific Deep Learning Containers on GKE. Vertex AI and GKE are expected to be offered as deployment options on the Hugging Face platform in the first half of this year.
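As a rough illustration of the GKE deployment path described above, serving a Hugging Face model on GKE typically means applying a Kubernetes manifest that runs a serving container on a GPU node pool. The sketch below is a minimal, hypothetical example: the container image path and model name are placeholders, not the actual coordinates of the Hugging Face Deep Learning Containers mentioned in the article.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hf-model-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: hf-model-server
  template:
    metadata:
      labels:
        app: hf-model-server
    spec:
      containers:
      - name: server
        # Placeholder image: substitute the actual Hugging Face
        # Deep Learning Container image published for GKE.
        image: REGISTRY/huggingface-inference-server:latest
        env:
        - name: MODEL_ID            # model to serve from the Hugging Face Hub
          value: google/flan-t5-small
        ports:
        - containerPort: 8080
        resources:
          limits:
            nvidia.com/gpu: 1       # schedule the pod onto a GPU node pool
```

Scaling the model then amounts to adjusting `replicas` (or attaching a HorizontalPodAutoscaler), which is the kind of workflow the GKE integration is meant to streamline.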


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

