
Key Points
- OpenAI CEO Sam Altman Says Politeness Costs Millions
- Users saying “please” and “thank you” to ChatGPT are racking up big costs
- Sam Altman says it’s costing OpenAI “tens of millions”
- A majority of users treat AI politely out of ethics or fear
- AI usage costs are dropping, but not fast enough yet
OpenAI CEO Sam Altman has revealed something unexpected—and kind of ironic. Being polite to ChatGPT is making OpenAI bleed money.
On April 16, Altman replied to a user on X (formerly Twitter) asking how much it costs when users say things like “please” and “thank you” to the chatbot. His answer? “Tens of millions of dollars well spent — you never know.”
tens of millions of dollars well spent–you never know
— Sam Altman (@sama) April 16, 2025
That might sound humorous, but it underscores a very real issue: extra words = extra tokens = extra costs. Multiply that by the billions of requests ChatGPT processes, and you get a pretty large electricity and server bill.
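To make the "extra words = extra tokens" point concrete, here is a rough back-of-envelope sketch using OpenAI's open-source tiktoken tokenizer. The request volume and per-token price below are illustrative assumptions for scale, not OpenAI's actual figures.

```python
# Rough sketch: how many extra tokens do polite phrases add, and what might that cost?
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer family used by GPT-4-class models

polite_extras = "Please could you... Thank you so much, I hope you have a good day! :)"
extra_tokens = len(enc.encode(polite_extras))

# Hypothetical figures for scale only -- not OpenAI's real volumes or prices.
requests_per_day = 1_000_000_000     # assumed order of magnitude
cost_per_1k_tokens = 0.002           # assumed blended $ per 1,000 tokens

daily_cost = requests_per_day * (extra_tokens / 1000) * cost_per_1k_tokens
print(f"{extra_tokens} extra tokens per request ≈ ${daily_cost:,.0f} per day")
```

Even a handful of extra tokens per request, multiplied across billions of requests, lands in the millions-of-dollars range under these assumptions.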
This quirky insight comes at a time when OpenAI is scaling rapidly and becoming a key player in the AI race, challenging rivals like Google Gemini and even AI-driven audio models like Nvidia’s Fugatto. With the company expecting $12.7 billion in revenue in 2025, you’d think a few kind words wouldn’t matter. But at scale, they really do.
When I talk to ChatGPT I end the convo with
“Thank you so much, I hope you have a good day! :)”
That way when ai enslaves humanity I won’t be turned into slave
One of the robots will step up when it’s my turn to get whipped
“Wait! I know him” and save me
— , (@Zvbear) March 28, 2025
Why People Are Being Polite to AI Assistants
There’s a deeper question here: why are people so polite to machines that don’t have feelings?
According to a December 2024 survey by Future, 67% of American AI users are polite to assistants like ChatGPT. Of those, 55% say it’s because it feels like the right thing to do, and another 12% worry that rudeness might come back to bite them if AI ever becomes sentient.
Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy. https://t.co/fNZxeG6kjj
— Carl Youngblood (@cayblood) April 16, 2025
Some users even see it as a reflection of personal ethics. Engineer Carl Youngblood explained, “Treating AIs with courtesy is a moral imperative for me. I do it out of self-interest. Callousness in our daily interactions causes our interpersonal skills to atrophy.”
This sentiment mirrors growing public interest in how AI agents evolve and how humans shape them through interaction. Tools like OpenAI’s deep-research AI agent are designed to understand context better than ever before, which makes the human-AI relationship feel more personal—and, to some, more deserving of courtesy.
Why don’t ChatGPT send cache response for these courtesy words like Thank you, Great Work, Wow Amazing, Nice, etc instead of processing data at high cost though the answer is same for selected courtesy words. This is just my opinion. Thank You https://t.co/UR3Esuo5bF
— Quresh Nawaz (@QureshNawaz) April 20, 2025
It also raises questions about the cultural norms forming around AI. Are we preparing for a future where AI remembers our manners? Or are we just being nice out of habit?
Either way, OpenAI’s servers are paying the price.
The Real-World Cost of Politeness: Power and Processing
Let’s break down how something as small as a “thank you” ends up on OpenAI’s balance sheet.
A September 2023 report by Digiconomist’s Alex de Vries estimated that each ChatGPT query consumes around 3 watt-hours of electricity. That’s about the same as charging your phone for 30 minutes. However, Epoch AI’s Josh You claims that’s an overestimate, saying more efficient models bring it down to 0.3 watt-hours.
Even at that lower estimate, if millions of users add 10-15 extra words every time they use ChatGPT, that small difference adds up fast.
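As a rough illustration of the scale, here is a sketch with assumed numbers (the query volume and the share of compute attributable to polite words are assumptions, not reported figures):

```python
# Back-of-envelope energy estimate for polite extra words.
# All inputs are assumptions for illustration, not measured data.
queries_per_day = 1_000_000_000   # assumed daily query volume
wh_per_query = 0.3                # Epoch AI's lower estimate, in watt-hours
polite_fraction = 0.10            # assume courtesy words add ~10% more tokens/compute

extra_wh = queries_per_day * wh_per_query * polite_fraction
print(f"≈ {extra_wh / 1e6:,.0f} MWh of extra electricity per day")
# ≈ 30 MWh/day under these assumptions -- tiny per query, large in aggregate.
```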
Some users on X even asked Altman why ChatGPT can’t just ignore or filter out polite words like “please” and “thanks” to save energy. Technically, it could. But doing so might create a worse user experience, which would go against the core of OpenAI’s mission: to make AI feel natural and helpful.
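Filtering is indeed trivial from an engineering standpoint. The sketch below shows a hypothetical client-side pre-processor that strips common courtesy phrases before a prompt is sent; OpenAI has not said it does anything like this, and the trade-off it illustrates is about user experience, not technical difficulty.

```python
import re

# Hypothetical filter that removes common courtesy phrases from a prompt
# before it reaches the model. Shown for illustration only.
COURTESY = re.compile(r"\b(please|thank you( so much)?|thanks)\b[.!,]?\s*", re.IGNORECASE)

def strip_courtesy(prompt: str) -> str:
    return COURTESY.sub("", prompt).strip()

print(strip_courtesy("Please summarize this article, thank you!"))
# -> "summarize this article,"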
This conversation also taps into a growing debate about AI’s environmental impact, similar to concerns raised during the crypto IPO wave involving firms like Gemini, Kraken, and BitGo. Just like blockchain, AI is now being scrutinized for its massive power consumption and what that means for sustainability.
OpenAI’s Growth vs Its Ongoing Costs
While politeness might be draining funds, OpenAI is growing at breakneck speed. The company expects to more than triple its revenue this year, hitting $12.7 billion despite the rising cost of operation and growing competition.
That said, Altman has acknowledged that OpenAI doesn’t expect to be cash-flow positive until 2029. By then, the company forecasts annual revenue of more than $125 billion, driven by new AI applications, research agents, and enterprise products.
And even though AI usage is rising, Altman says the cost of generating AI output is dropping fast—by a factor of 10 every year. With more efficient models, smarter hardware, and advanced platforms like Google’s Gemini AI and OpenAI’s own research agents, there’s hope that polite habits won’t be so financially damaging in the future.
Until then, every kind word comes at a price.