Why ChatGPT Costs Millions to Say "Please" and "Thank You"

A playful question on social media recently turned into a surprisingly serious conversation about the real cost of interacting with AI, and it’s a story every business relying on digital tools should hear.
It started when an X (formerly Twitter) user asked, tongue firmly in cheek:
“I wonder how much OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models?”
OpenAI CEO Sam Altman responded dryly:
“Tens of millions of dollars well spent — you never know.”
Behind the humour lies a deeper truth: every polite word you type to AI doesn’t just go into the ether — it’s processed, calculated, and rendered by high-powered servers running 24/7. And at scale, those niceties are starting to add up — financially and environmentally.
The Real Cost of Being Polite to AI
Modern AI like ChatGPT runs on huge infrastructure powered by GPUs (graphics processing units). Every prompt you send — even something as short as “please summarise this” — kicks off a computational process called inference. That’s where the AI interprets your input and generates a tailored response.
For context:
- Saying “Please help me write a CV” takes more compute than just “Write CV”, because every extra word adds tokens for the model to process (see the short sketch after this list).
- Multiply that by millions of users per day, and it’s easy to see how even minor inefficiencies balloon into tens of millions in additional compute costs.
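To make that concrete, here is a minimal sketch using OpenAI’s open-source tiktoken tokeniser to compare the two prompts above. The choice of encoding, and the simplifying assumption that inference work scales roughly with token count, are illustrative only and not a statement about OpenAI’s actual infrastructure or pricing.

```python
# A rough illustration: longer, politer prompts mean more tokens,
# and inference work scales (roughly) with token count.
# Requires: pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by several recent OpenAI models;
# treat it as an illustrative choice rather than a guarantee.
enc = tiktoken.get_encoding("cl100k_base")

polite = "Please help me write a CV, thank you!"
terse = "Write CV"

for prompt in (polite, terse):
    tokens = enc.encode(prompt)
    print(f"{prompt!r} -> {len(tokens)} tokens")

# Typical output (exact counts vary by tokeniser and model):
#   'Please help me write a CV, thank you!' -> ~10 tokens
#   'Write CV' -> ~2 tokens
```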
A 2023 estimate pegged the operating cost of OpenAI’s GPT-3 model at around $700,000 per day. Newer models like GPT-4o are far more capable — and far more expensive to run.
Politeness Has a Carbon Footprint Too
It’s not just about money — the environmental toll is equally concerning.
According to the International Energy Agency (IEA):
- Data centres and AI computing now account for almost 2% of global electricity usage, and that figure could double by 2026.
- Microsoft’s AI expansion increased its annual water usage by 1.7 billion gallons, the equivalent of around 2,500 Olympic swimming pools.
- Google’s carbon emissions have spiked 48% since 2019, largely due to AI demands.
Every "please", every query, every session contributes — incrementally — to the energy usage behind your friendly AI assistant.
Inference Costs Are Now Outpacing Training
While training an AI model like GPT-4 takes vast resources upfront, inference (daily use) is now the more expensive part.
Why? Because unlike training, which is a one-time effort, inference:
- Happens every time a user sends a prompt
- Requires real-time computation
- Involves millions of interactions every day (a rough back-of-envelope cost sketch follows below)
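To see how quickly this compounds, here is a back-of-envelope sketch. Every figure in it (prompt volume, tokens per prompt, cost per million tokens) is an assumed, illustrative number, not anything published by OpenAI.

```python
# Back-of-envelope sketch of why inference costs scale with usage.
# Every number below is an illustrative assumption, not a published figure.

daily_prompts = 100_000_000        # assumed prompts handled per day
avg_tokens_per_prompt = 500        # assumed input + output tokens per prompt
cost_per_million_tokens = 10.00    # assumed blended compute cost in USD

daily_tokens = daily_prompts * avg_tokens_per_prompt
daily_cost = daily_tokens / 1_000_000 * cost_per_million_tokens
print(f"Estimated daily inference cost: ${daily_cost:,.0f}")   # $500,000 under these assumptions

# Even a few "wasted" tokens per prompt add up:
extra_tokens = 3                   # e.g. a "please" and a "thank you"
extra_cost = daily_prompts * extra_tokens / 1_000_000 * cost_per_million_tokens
print(f"Extra daily cost of {extra_tokens} polite tokens per prompt: ${extra_cost:,.0f}")   # $3,000/day
```

Under these made-up assumptions, three extra tokens per prompt cost around $3,000 a day, or roughly a million dollars a year — which is why stray words start to matter at scale.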
What This Means for Businesses and IT Teams
For business users, the polite computing dilemma is more than trivia. It’s a glimpse into the future of enterprise AI strategy. Here's why it matters:
- Rising Costs for AI Access: OpenAI, Anthropic, Google, and others are introducing premium models to offset costs. Free tools are increasingly limited, and £16–£150/month tiers are becoming the norm.
- Sustainability Questions: As ESG becomes central to IT policy, businesses must start factoring in AI carbon costs, especially those reporting on environmental impact.
- Pressure to Optimise Usage: Whether it's cutting prompt lengths or reducing dependency on large models, businesses will need to use AI more efficiently, just like any other resource.
So… Should We Stop Saying "Please"?
Not necessarily. As Microsoft’s Kurt Beavers noted, polite prompts can encourage more courteous, context-aware AI replies. In customer-facing roles, that tone matters.
But from a tech operations standpoint, it’s time we became aware of the cumulative cost of small habits and how they ripple through systems, budgets, and the planet.
How Can We Make AI More Efficient?
Here are four emerging solutions:
- ⚙️ Model Optimisation: Developers are refining models with techniques like quantisation to reduce compute demands (a simple illustration follows after this list).
- 🌱 Green Energy: Tech giants are pivoting to renewables, but there's a long road ahead.
- 💡 User Education: Teaching users to write concise, effective prompts will help everyone.
- 🧠 Smarter AI Architectures: Smaller, task-specific models may become more viable than massive general-purpose AIs.
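As an example of the first point, here is a simplified NumPy sketch of symmetric int8 quantisation: storing weights as 8-bit integers instead of 32-bit floats cuts memory roughly fourfold, at the cost of a small approximation error. Production quantisation schemes are more sophisticated; this is purely conceptual.

```python
# Conceptual sketch of post-training quantisation: storing weights as 8-bit
# integers instead of 32-bit floats cuts memory (and memory bandwidth) by ~4x.
# This is a simplified symmetric-scaling example, not a production recipe.
import numpy as np

rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal(1_000_000).astype(np.float32)

# Map the float range onto int8 using a single scale factor.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# Dequantise to approximate the original values at inference time.
approx = weights_int8.astype(np.float32) * scale

print(f"Memory: {weights_fp32.nbytes / 1e6:.1f} MB -> {weights_int8.nbytes / 1e6:.1f} MB")
print(f"Mean absolute error introduced: {np.abs(weights_fp32 - approx).mean():.4f}")
```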
Final Thought: Efficiency Is the New Intelligence
As AI becomes a daily business tool, efficiency will be just as important as capability. Whether you're running customer queries, building reports, or generating content, how you interact with AI can affect your costs, carbon footprint, and competitive edge.
At Network Ltd, we're helping UK businesses adopt AI sustainably and strategically with the right tools, practices, and platforms for today’s IT challenges.
👋 Want help integrating AI into your business—without spiralling costs or complexity?
👉 Talk to us about smarter AI strategies for your team.