OpenAI CEO Sam Altman acknowledged on X that the “please” and “thank you” messages users send to ChatGPT cost the company tens of millions of dollars in electricity, given the sheer volume at which they arrive. Each ChatGPT-4 query uses about 2.9 watt-hours of electricity (roughly 10 times more than a Google search), and the platform has 800 million weekly users.
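A quick back-of-envelope sketch shows how those figures reach “tens of millions.” The per-query energy (2.9 Wh) and the 800 million weekly users come from the reporting above; the number of extra courtesy messages per user per week and the electricity price are illustrative assumptions, not reported numbers.

```python
WH_PER_QUERY = 2.9            # watt-hours per ChatGPT-4 query (reported)
WEEKLY_USERS = 800_000_000    # weekly users (reported)
COURTESY_MSGS_PER_WEEK = 2    # assumed extra "please/thank you" messages per user per week
USD_PER_KWH = 0.10            # assumed wholesale electricity price in USD

# Annualize the extra queries, convert Wh -> kWh, then price the energy.
queries_per_year = WEEKLY_USERS * COURTESY_MSGS_PER_WEEK * 52
kwh_per_year = queries_per_year * WH_PER_QUERY / 1000
cost_per_year = kwh_per_year * USD_PER_KWH

print(f"{kwh_per_year:,.0f} kWh ≈ ${cost_per_year:,.0f} per year")
# → 241,280,000 kWh ≈ $24,128,000 per year
```

Even with modest assumptions, the estimate lands in the tens of millions of dollars annually, consistent with Altman’s remark.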
Data centers for AI operations currently account for approximately 2% of global electricity consumption.
A 2024 study revealed that 67% of AI chatbot users in the U.S. are polite in their conversations, with 55% doing so because they believe it’s “the right thing,” while 12% remain polite as a “precaution” against a potential “AI uprising.” The research also indicates that polite prompts can improve the quality of responses from AI chatbots.
Experts are concerned that AI will strain the environment as that footprint grows. For instance, an AI model uses about 0.14 kilowatt-hours of energy to draft a short email, equivalent to running 14 LED light bulbs for one hour. According to new research, data centers’ energy demand is expected to quadruple by the end of this decade, to roughly as much electricity as Japan consumes today, with only about half of that energy expected to come from renewable sources.
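The LED-bulb comparison above can be checked directly. The 0.14 kWh figure is from the research cited; the 10-watt draw of a typical LED bulb is an assumption introduced here for the arithmetic.

```python
KWH_PER_EMAIL = 0.14   # energy for an AI model to draft a short email (reported)
LED_WATTS = 10         # assumed power draw of one typical LED bulb
HOURS = 1

# Convert kWh to watt-hours, then divide by one bulb's consumption over the period.
bulbs = KWH_PER_EMAIL * 1000 / (LED_WATTS * HOURS)

print(f"{bulbs:.0f} LED bulbs running for {HOURS} hour")
# → 14 LED bulbs running for 1 hour
```

The comparison holds exactly under the 10 W assumption; a more efficient 7 W bulb would make it 20 bulbs instead.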