Sam Altman: Polite Expressions Toward AI
Lead to Massive Server Load and Electricity Consumption
"Responding with 'Thank You' Costs Tens of Millions in Electricity Bills"
It has been pointed out that saying "thank you" after using artificial intelligence (AI) services such as ChatGPT leads to massive electricity waste. Since AI performs computations based on the number of words input, adding unnecessary greetings increases the amount of data to be processed and the number of responses generated, which in turn raises electricity consumption.
Sam Altman, CEO of OpenAI, recently responded on X (formerly Twitter) to a user who asked, "Does it cost electricity when people repeatedly say 'please' and 'thank you' to ChatGPT?" Altman replied, "It has resulted in tens of millions of dollars in electricity bills." He explained that as the number of words in a request or the frequency of requests increases, the amount of data the servers must process and the number of responses generated also rise, driving up electricity consumption.
For example, after making a simple request to ChatGPT, a user might write "Thank you," and the AI might reply, "Feel free to reach out whenever you need." Each such exchange is short, but accumulated across millions of users, the total electricity usage reaches a significant level.
According to a Washington Post investigation, generating a single 100-word email requires 0.14 kWh of electricity, enough to run 14 LED bulbs for one hour. A study from the University of California, Riverside, found that generating a short reply such as "You are welcome" with a large language model (LLM) consumes about 40 to 50 ml of water. Data centers, which power AI chatbots, are estimated to account for around 2% of global electricity consumption.
In fact, a significant share of users do interact politely with AI. According to a survey conducted late last year by global media group Future PLC, 67% of respondents in the United States and 71% in the United Kingdom said they converse with chatbots in a courteous manner. The most common reason given was "because it is morally right," cited by 55% of Americans and 59% of Britons. Meanwhile, 12% of respondents in both countries chose "because I am afraid AI might rebel."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

