FriendliAI Launches Service to Support Use of Generative AI Models

FriendliAI, a startup specializing in generative artificial intelligence (AI) inference serving, announced on the 4th that it has launched 'Friendli Serverless Endpoints', a service that makes it easy to use generative AI models.


Friendli Serverless Endpoints gives users easy access to the latest generative AI models. It eliminates the complex infrastructure setup and operation, as well as the GPU optimization for model serving, that previously demanded substantial manpower and time when building generative AI applications.


With the service, anyone can integrate popular generative AI models such as Llama 2, Stable Diffusion, Code Llama, and Mistral into their own services at low cost and high speed, enabling features such as text and image generation.
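For illustration, this kind of integration typically amounts to a single HTTP call to a hosted inference endpoint. The following is a minimal sketch in Python assuming an OpenAI-compatible chat completions API; the endpoint URL, model identifier, and FRIENDLI_TOKEN environment variable are assumptions made for this example, not details confirmed by the article.

import os
import requests

# Hypothetical endpoint URL and model name, for illustration only.
API_URL = "https://api.friendli.ai/serverless/v1/chat/completions"
headers = {"Authorization": "Bearer " + os.environ["FRIENDLI_TOKEN"]}

payload = {
    "model": "llama-2-13b-chat",  # illustrative model identifier
    "messages": [
        {"role": "user", "content": "Write a one-sentence product description."}
    ],
    "max_tokens": 128,
}

# Send the request and print the generated text from the first choice.
response = requests.post(API_URL, headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])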


Compared with other solutions built on open-source frameworks, Friendli Serverless Endpoints generates responses up to four times faster, providing a smooth and responsive generative AI experience. Its low cost and high speed come from its serving engine, Friendli Engine, which reduces the number of GPUs required for serving to about one-seventh of existing solutions.


Jeon Byeong-gon, CEO of FriendliAI, said, "We believe the future of generative AI lies in making the technology easily accessible and usable by everyone, and we developed this new service to provide open-source generative AI models faster and more affordably."


© The Asia Business Daily (www.asiae.co.kr). All rights reserved.

