
Being polite to ChatGPT costs millions of dollars

@fyinews team

22/04/2025

fyi:
  1. OpenAI CEO Sam Altman admitted on X that “please” and “thank you” messages on ChatGPT cost tens of millions of dollars.
  2. Each ChatGPT-4 message uses about 2.9 watt-hours of electricity (10 times more than a Google search), and the platform serves 800 million users per week.
  3. A 2024 study found that 67% of AI chatbot users in the U.S. are polite, with 12% saying they do so because they fear a potential “AI uprising.”

OpenAI CEO Sam Altman acknowledged on X that users’ “please” and “thank you” messages to ChatGPT cost tens of millions of dollars in electricity, given that millions of users send them. Each ChatGPT-4 message uses about 2.9 watt-hours of electricity (10 times more than a Google search), and the platform is used by 800 million users weekly.
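The “tens of millions” figure is plausible from these numbers alone. Here is a back-of-envelope sketch in Python; the per-message energy and weekly user count come from the article, while the polite-message count per user and the electricity price are purely illustrative assumptions:

```python
# Rough estimate of the annual electricity cost of polite messages.
WH_PER_MESSAGE = 2.9          # article: ~10x the energy of a Google search
WEEKLY_USERS = 800_000_000    # article: weekly ChatGPT users
POLITE_MSGS_PER_USER = 2      # assumption: extra "please"/"thank you" turns per week
USD_PER_KWH = 0.10            # assumption: rough wholesale electricity price

weekly_kwh = WH_PER_MESSAGE * WEEKLY_USERS * POLITE_MSGS_PER_USER / 1000
annual_cost_usd = weekly_kwh * 52 * USD_PER_KWH

print(f"{weekly_kwh:,.0f} kWh/week -> ${annual_cost_usd / 1e6:.1f}M/year")
```

Under these assumptions the politeness overhead alone lands in the tens of millions of dollars per year, consistent with Altman’s remark.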

A 2024 study revealed that 67% of AI chatbot users in the U.S. are polite in their conversations, with 55% doing so because they believe it’s “the right thing,” while 12% remain polite as a “precaution” against a potential “AI uprising.” The research also indicates that polite prompts can improve the quality of responses from AI chatbots.

Experts are concerned that AI will strain the environment, as data centers for AI operations already account for about 2% of global electricity consumption. For instance, an AI model uses 0.14 kilowatt-hours of energy to write a small email, equivalent to the energy consumed by 14 LED light bulbs for one hour. According to new research, the energy demand of data centers is expected to quadruple by the end of this decade (consuming as much energy as Japan does today), but only half of this energy is expected to come from renewable sources.
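The LED-bulb comparison checks out arithmetically if one assumes a typical ~10 W LED bulb (the wattage is not stated in the article):

```python
# Sanity-check the equivalence: 0.14 kWh for one AI-written email
# versus LED light bulbs running for one hour.
EMAIL_KWH = 0.14   # article: energy to write a short email
LED_WATTS = 10     # assumption: a typical LED bulb draws ~10 W
HOURS = 1

bulbs = EMAIL_KWH * 1000 / (LED_WATTS * HOURS)
print(f"{bulbs:.0f} LED bulbs for {HOURS} hour")  # 14 bulbs, as the article states
```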
