Friday, April 25, 2025

Your 'Please' and 'Thank You' Cost OpenAI Millions, Sam Altman Reveals

In the rapidly evolving world of artificial intelligence, even seemingly small gestures of human courtesy towards chatbots like ChatGPT come with a price tag. OpenAI CEO Sam Altman recently revealed that users saying "please" and "thank you" to the company's AI models is costing "tens of millions of dollars". While the notion of politeness having a significant financial impact on a tech giant might seem surprising, experts explain that this cost is a consequence of how these powerful AI systems operate on an immense scale.


Sam Altman, CEO of OpenAI

How AI Processes Language (And Politeness)

Understanding the cost involves looking into the technical underpinnings of AI chatbots. Large language models (LLMs) like ChatGPT process text by breaking it down into smaller units called tokens. These tokens can be words, parts of words, or even punctuation marks. When a user inputs a prompt, the AI processes each token, requiring computational resources like processing power and memory housed in massive data centers. Generally, more tokens in a prompt require more computational resources.
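
To make the idea of tokens concrete, the sketch below uses tiktoken, OpenAI's open-source tokenizer library, to count how many extra tokens a polite phrasing adds. The example prompts are invented for illustration, and exact counts vary by model.

# pip install tiktoken  (OpenAI's open-source tokenizer library)
import tiktoken

# Load the tokenizer used by GPT-3.5 Turbo; other models use different encodings.
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

plain = "Summarize this meeting in three bullet points."
polite = "Please summarize this meeting in three bullet points. Thank you!"

plain_count = len(enc.encode(plain))
polite_count = len(enc.encode(polite))

print(plain_count, polite_count)
print("extra tokens spent on courtesy:", polite_count - plain_count)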

Polite phrases such as "please" and "thank you" typically add a small number of tokens to a user's input, usually between two and four tokens in total. According to OpenAI's API pricing, which charges based on token usage, the cost per million tokens varies by model. For example, the GPT-3.5 Turbo model has an input cost of $0.50 per million tokens. Based on this, adding three tokens for politeness to a single prompt costs an exceedingly small amount – roughly $0.0000015.
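
Written out as a quick sanity check, and assuming the three extra tokens and the $0.50-per-million-token input price quoted above, the per-prompt arithmetic looks like this:

# Cost of a few extra politeness tokens on a single prompt,
# at GPT-3.5 Turbo's quoted input price of $0.50 per million tokens.
price_per_million_input_tokens = 0.50   # US dollars (assumption from the article)
extra_tokens = 3                        # e.g. "please" plus "thank you"

cost_per_prompt = extra_tokens * price_per_million_input_tokens / 1_000_000
print(f"${cost_per_prompt:.7f}")        # ~$0.0000015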


Scale: The Reason for the Millions

So, how does a cost of a fraction of a cent per interaction balloon into "tens of millions of dollars"? The answer lies in the sheer volume of daily usage. ChatGPT handles over one billion queries daily. When a minuscule cost per interaction is multiplied by billions of interactions, it accumulates into a substantial aggregate figure.
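
A rough back-of-envelope sketch shows how the aggregation works; treating every one of those billion daily queries as carrying a few polite tokens is purely an illustrative assumption.

# Scaling the tiny per-prompt cost across ChatGPT's daily traffic.
# All inputs are illustrative assumptions drawn from the figures above.
cost_per_polite_prompt = 0.0000015       # dollars, from the token math above
queries_per_day = 1_000_000_000          # "over one billion queries daily"

daily_cost = cost_per_polite_prompt * queries_per_day   # about $1,500 per day
annual_cost = daily_cost * 365                          # about $550,000 per year

print(f"daily:  ${daily_cost:,.0f}")
print(f"annual: ${annual_cost:,.0f}")

Token pricing alone, in other words, gets you into the hundreds of thousands of dollars a year at most; the jump to "tens of millions" only makes sense once the surrounding infrastructure, energy, and cooling costs are factored in.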

Operating LLMs like ChatGPT requires a vast infrastructure of data centers, high-performance servers, and specialized processing chips (GPUs). These facilities consume substantial amounts of energy. Estimates suggest the daily energy cost to run ChatGPT could be around $700,000. A single query to GPT-4 is estimated to consume about 2.9 watt-hours of electricity, significantly more than a standard search. Scaling this across billions of daily queries results in millions of kilowatt-hours consumed daily. Beyond electricity, these data centers also require significant water for cooling systems. Generating a short 100-word email with GPT-4, for instance, can use as much as 519 milliliters of water for cooling the servers involved. The cost of processing polite language is embedded within this broader framework of infrastructure, energy, and water expenses.
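
The electricity side can be sanity-checked the same way, using the 2.9-watt-hour-per-query estimate and the one-billion-query daily volume cited above:

# Rough daily electricity use implied by the per-query estimate above.
watt_hours_per_query = 2.9           # estimated GPT-4 query (assumption from the article)
queries_per_day = 1_000_000_000      # "over one billion queries daily"

daily_kwh = watt_hours_per_query * queries_per_day / 1_000   # Wh -> kWh
print(f"{daily_kwh:,.0f} kWh per day")                       # ~2.9 million kWh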

Some expert analyses, based purely on token costs for models like GPT-3.5 Turbo, put the annual cost of politeness far lower, at around $146,000. Altman's "tens of millions" likely reflects the broader increase in computational load, and the associated energy consumption, across OpenAI's vast infrastructure. Even so, the figure is a non-negligible slice of ChatGPT's overall operating costs, which are estimated to run into the hundreds of millions of dollars annually.


Why Users Are Polite, And Why It Might Matter

Despite the cost, a significant portion of users are polite to AI. A late 2024 survey found that 67 percent of US respondents reported being nice to their chatbots. Among those, 55 percent said they did it "because it's the right thing to do," while 12 percent cited appeasing the algorithm, perhaps out of fear of a future AI uprising. About two-thirds of people who are impolite said it was for brevity.

Moreover, experts suggest that being polite to AI might offer benefits beyond simple etiquette. Microsoft's design manager Kurtis Beavers noted that using proper etiquette helps generate "respectful, collaborative outputs," explaining that polite language "sets a tone for the response". A Microsoft WorkLab memo added that generative AI mirrors the levels of professionalism, clarity, and detail in user prompts. Beavers also suggested that being polite helps ensure you get the same graciousness in return and can improve the AI's responsiveness and performance.

Research hints that polite and well-phrased prompts could lead to higher quality and less biased AI outputs, with one study finding a 9% improvement in AI accuracy. Additionally, today's interactions with AI feed into the training data for future models: polite exchanges might help train AI to default towards helpfulness, whereas curt interactions could reinforce purely transactional behavior, potentially shaping the ethical frameworks of future systems. Altman's comment that the tens of millions spent on politeness were "well spent", and his cryptic "you never know" remark, could indicate that OpenAI sees long-term strategic value in fostering these more natural interactions.


Contextualizing the Cost and Environmental Impact

Ultimately, while politeness does add to the computational load and contributes to energy consumption, the cost per individual user remains minimal. The significant expense arises from the aggregate effect across billions of interactions.

However, the discussion underscores a broader issue: the substantial environmental footprint of AI. Data centers, including those powering AI, already account for around 2 percent of the world's electricity use, a figure projected to increase dramatically. Those "pleases" and "thank yous," while individually tiny, add to this growing demand. This reality has led some to suggest that for tasks like writing a simple email, the most environmentally conscious choice might be to bypass the chatbot entirely and write it yourself.

As AI becomes more integrated into daily life, the balance between optimizing computational efficiency and fostering positive, human-like interactions remains a key consideration. The debate over the cost of politeness highlights this intersection of technical performance, economic reality, environmental impact, and evolving human-AI relationships.
