Being Polite to ChatGPT Costs OpenAI Tens of Millions in Energy and Water

OpenAI CEO Sam Altman has revealed that being polite to ChatGPT – by saying “please” and “thank you” – costs the company a small fortune.
In a reply to a user on X (formerly Twitter) who asked whether being polite to the AI was racking up electricity bills, Altman said: “Tens of millions of dollars well spent — you never know.”
While that figure might be an exaggeration, it’s true that every single interaction with ChatGPT requires a real-time response from extremely high-powered computing systems, and that racks up the electricity bill.
The burgeoning field of AI already accounts for around 2% of the world’s electricity use and, according to Goldman Sachs, uses around ten times more energy per query than a regular Google search.
If one in ten working Americans started using GPT-4 once a week for a year, the energy consumed would be equal to all the power used by households in Washington D.C. for 20 days.
Furthermore, Rene Haas, the CEO of Arm Holdings, recently warned that AI could account for a quarter of power consumption in the United States by 2030, up from around 4% today.
Of course, being polite costs water too. Cooling those servers takes water, and a study from the University of California, Riverside found that generating just 100 words with GPT-4 could use as much as three bottles of water.
A simple three-word response like “You are welcome” could use about 1.5 ounces.
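As a rough sanity check, the two water figures are consistent with simple per-word scaling. The sketch below assumes a standard 16.9 oz (500 ml) bottle, which the article does not specify:

```python
# Back-of-envelope check of the water figures quoted above.
# Assumption (not stated in the article): one bottle = 16.9 oz (500 ml).
OUNCES_PER_BOTTLE = 16.9

water_for_100_words_oz = 3 * OUNCES_PER_BOTTLE       # ~50.7 oz for 100 words
water_per_word_oz = water_for_100_words_oz / 100     # ~0.5 oz per word

# A three-word reply like "You are welcome":
three_word_reply_oz = 3 * water_per_word_oz
print(round(three_word_reply_oz, 1))  # ~1.5 oz, matching the quoted figure
```

Under that bottle-size assumption, the 1.5-ounce estimate for a three-word reply follows directly from the 100-word figure.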