How Much Would AI Cost? [Tech Talk]

OpenAI Establishes Initial Business Model
On the Defensive Against Google's Aggressive Pricing Strategy
Upcoming Big Tech AI 'Chicken Game'

First, a question: what should the unit of sale for an AI chatbot be? The number of words the AI generates? Or the number of questions a user asks it?


Defining the business model for AI is as challenging as creating the AI itself. Especially as competition intensifies between big tech companies and startups developing generative AI, the industry's focus has begun to shift toward 'pricing.'


OpenAI, Pioneer of AI Commercialization
Sam Altman, CEO of OpenAI

The company that led the monetization of AI chatbot services is OpenAI, the developer of ChatGPT. It already sells GPT-4-based chatbots to individual and corporate accounts through premium subscriptions.


However, this is a finished, consumer-facing product. OpenAI also runs a B2B business that lets other companies integrate its AI models through APIs (application programming interfaces). There, pricing is based on 'tokens.'


Tokens are the smallest units of text that large language models (LLMs) like ChatGPT can process; roughly speaking, they correspond to the words the AI generates. OpenAI currently sells 1,000 tokens of its 'GPT-4 Turbo' model (approximately 750 English words) for $0.03 (about 40 KRW).
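Per-token billing is simple linear arithmetic. The sketch below (Python, with helper names of our own invention) illustrates the GPT-4 Turbo rate and the words-to-tokens ratio quoted above:

```python
# Hypothetical illustration of per-token billing; the rate and the
# 750-words-per-1,000-tokens ratio are the figures cited in the article.

PRICE_PER_1K_TOKENS_USD = 0.03   # GPT-4 Turbo rate quoted above
WORDS_PER_1K_TOKENS = 750        # rough English-word equivalent

def tokens_for_words(n_words: int) -> float:
    """Estimate the token count for a given English word count."""
    return n_words * 1000 / WORDS_PER_1K_TOKENS

def cost_usd(n_tokens: float) -> float:
    """Billing is linear: price per 1,000 tokens times usage."""
    return n_tokens / 1000 * PRICE_PER_1K_TOKENS_USD

# A 1,500-word reply is about 2,000 tokens, i.e. about $0.06
print(round(cost_usd(tokens_for_words(1500)), 4))  # 0.06
```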


At around 40 KRW for 750 English words, this seems quite inexpensive. But imagine AI becoming fully commercialized and embedded in every customer-facing service. Hundreds of millions of people would query AI daily, and AI would generate an astronomical number of tokens. Considering how the AI business scales, OpenAI is hardly giving ChatGPT away.
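To make the scale argument concrete, here is a back-of-envelope estimate. The user count and per-user usage below are purely illustrative assumptions, not figures from the article; only the $0.03 rate comes from the text:

```python
# Assumed, illustrative inputs: 100M daily users, each triggering
# ~1,000 generated tokens per day, at the article's $0.03 per 1K tokens.
users = 100_000_000
tokens_per_user_per_day = 1_000
price_per_1k_usd = 0.03

daily_usd = users * tokens_per_user_per_day / 1_000 * price_per_1k_usd
print(f"${daily_usd:,.0f} per day")  # $3,000,000 per day
```

Even under these modest assumptions, the bill runs to millions of dollars a day, which is the sense in which per-token pricing is not "cheap" at scale.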


Google's Radical Pricing Policy Shakes the AI Ecosystem
Google's 'Gemini' / [Image source=Google Official Blog]

Since OpenAI began selling its models by the token, many latecomers have adopted token-based business models; Anthropic and Mistral AI are representative examples.


However, recently Google introduced a pricing policy based on 'queries' instead of tokens. A query refers to the act of requesting specific information from a database.


The price of Gemini Pro is set at $0.0005 per 1,000 characters. Note that characters differ somewhat from tokens, the smallest units of text an AI model recognizes.
[Image source=Google Blog]

Google offers its models through 'Google AI Studio' and charges only companies that exceed 60 queries per minute. In other words, if you use AI at a frequency of less than one query per second, Google's latest AI models are effectively free. Later, through its own AI development platform 'Vertex AI,' Google plans to set a price of $0.0005 (0.65 KRW) per 1,000 characters generated.
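The two-part scheme described above, a free tier capped by query rate plus per-character billing on Vertex AI, can be sketched as follows. The constants come from the article; the function names are hypothetical, not Google's actual API:

```python
# Constants from the article; helper names are our own, not Google's API.
FREE_QPM = 60                     # AI Studio free tier: up to 60 queries/min
PRICE_PER_1K_CHARS_USD = 0.0005   # Vertex AI rate per 1,000 generated chars

def billable(queries_per_minute: int) -> bool:
    """Billing applies only past the free-tier rate limit."""
    return queries_per_minute > FREE_QPM

def vertex_cost_usd(chars_generated: int) -> float:
    """Vertex AI charges linearly per 1,000 generated characters."""
    return chars_generated / 1000 * PRICE_PER_1K_CHARS_USD

print(billable(59), billable(120))   # False True
print(vertex_cost_usd(1_000_000))    # 1M characters cost about $0.50
```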


For reference, converted to a token basis, the Gemini Pro version costs about $0.002 (approximately 2.6 KRW) per 1,000 tokens, roughly 15 times cheaper than GPT-4 Turbo's $0.03.
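The ratio follows directly from the two per-1,000-token rates:

```python
# Price ratio between the two per-1K-token rates cited above.
gpt4_turbo_per_1k = 0.03   # USD per 1,000 tokens
gemini_pro_per_1k = 0.002  # USD per 1,000 tokens (converted basis)

print(round(gpt4_turbo_per_1k / gemini_pro_per_1k))  # 15
```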


Google's pricing policy is not only incomparably cheaper than that of other AI companies, including OpenAI; it can also act as a magnet, drawing individuals, research institutes, and startups that cannot afford high costs into the Google ecosystem.


The Upcoming Chicken Game
A 'chicken game' is price competition in which companies willingly endure losses to gain an advantage in market share.

Monetization is the most important step in opening the AI era, because AI cannot run for free. To serve a trained AI model to smartphones and laptops, inference infrastructure must be built: vast numbers of graphics processing units (GPUs), data centers, network facilities, and power. AI companies must be able to bear these costs for the industry to keep developing sustainably.


However, Google's recent move reminds us that not all AI companies start from the same starting line. Google is already the largest AI company in the world and has the related infrastructure and engineering workforce to realize 'economies of scale' in the AI business.


As the world's largest search portal operator, Google currently processes about 99,000 search queries every second. That amounts to 5.9 million queries per minute, 8.5 billion every 24 hours, and roughly 3 trillion annually. The expertise Google has built managing network traffic at that scale carries directly over to AI commercialization.
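The per-second figure converts to the other time scales by simple multiplication (the 365-day year is an assumption of the sketch):

```python
# Converting the per-second query figure cited above to larger time scales.
qps = 99_000                       # search queries per second

per_minute = qps * 60              # 5,940,000  (~5.9 million)
per_day = qps * 60 * 60 * 24       # 8,553,600,000  (~8.5 billion)
per_year = per_day * 365           # ~3.1 trillion

print(f"{per_minute:,} {per_day:,} {per_year:,}")
```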


Of course, OpenAI, backed by Microsoft (MS), Google's big tech rival, will not back down. In fact, since commercializing ChatGPT last March, OpenAI has repeatedly cut prices. As long as MS keeps supplying funding and infrastructure, OpenAI will have the capacity to compete with Google going forward.


This situation poses a serious challenge for everyone else, especially startups and small firms that cannot achieve economies of scale the way big tech can. Lacking infrastructure, personnel, and capital, startups cannot survive the chicken game simply by building 'good AI.' They urgently need new business models that sidestep the giants' price war and still reach consumers sustainably.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
