Recently, a provocative idea has gone viral online: users should stop saying "thank you" to AI because it wastes computing power and electricity. The notion traces back to a 2025 public remark by Sam Altman, CEO of OpenAI, who revealed that polite phrases such as "please" and "thank you" typed by users were adding tens of millions of dollars to the company's annual electricity costs.
From a technical perspective, an AI model processes such phrases the same way it processes any input: it breaks them into "tokens" and runs pattern matching and probability calculations over them, consuming roughly 0.0003 kilowatt-hours of electricity per interaction. With ChatGPT reporting 123 million daily active users, if each person says "thank you" once a day, annual electricity consumption could reach 13.46 million kilowatt-hours, roughly the yearly usage of 7,000 households. Server cooling, which keeps hardware at optimal operating temperatures, consumes water on top of that: generating a simple response like "you're welcome" requires about 44 milliliters of cooling water, and at scale such interactions could strain regional water allocation.
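The headline figure can be reproduced with back-of-the-envelope arithmetic using only the numbers quoted above:

```python
# Back-of-the-envelope check of the article's figures.
daily_users = 123_000_000       # ChatGPT daily active users (from the article)
kwh_per_interaction = 0.0003    # energy per polite exchange (from the article)

daily_kwh = daily_users * kwh_per_interaction   # 36,900 kWh per day
annual_kwh = daily_kwh * 365                    # one "thank you" per user per day

print(f"{annual_kwh:,.0f} kWh per year")        # matches the ~13.46 million kWh claim
```

Dividing that total by 7,000 households implies an assumed consumption of roughly 1,900 kWh per household per year, which is the only figure the article leaves implicit.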
However, technical advances have mitigated these concerns. Mainstream AI models, such as Baidu's ERNIE Bot and Alibaba's Tongyi Qianwen, now include social-vocabulary filtering: the system recognizes purely polite or content-free instructions and returns a locally preset response, bypassing full large-model inference. This significantly reduces redundant computing-power consumption.
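In spirit, such a filter is a cheap lookup placed in front of the expensive inference call. The sketch below is purely illustrative; the phrase list, function names, and dispatch logic are assumptions, not the actual implementation of any named product:

```python
# Hypothetical sketch of a "social vocabulary filter" in front of a model.
PRESET_REPLIES = {
    "thank you": "You're welcome!",
    "thanks": "Happy to help!",
    "please": "Of course, go ahead and ask.",
}

def run_model(text: str) -> str:
    # Placeholder for the expensive large-model inference path.
    return f"[model response to: {text}]"

def handle_message(text: str) -> str:
    """Return a canned reply for pure pleasantries; otherwise run the model."""
    normalized = text.strip().lower().rstrip("!.")
    reply = PRESET_REPLIES.get(normalized)
    if reply is not None:
        return reply          # local lookup: no GPU inference, negligible cost
    return run_model(text)    # full inference only for substantive input

print(handle_message("Thank you!"))        # canned reply, model never invoked
print(handle_message("Explain tokens."))   # falls through to inference
```

The design choice is the usual one for fast paths: a dictionary lookup costs microseconds on a CPU, while a forward pass through a large model costs GPU time, so intercepting even a small fraction of traffic this way saves real energy.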
Currently, AI's electricity costs are driven mainly by parameter expansion during model training and by complex task processing during inference, not by individual polite phrases. Whether to thank an AI, then, comes down to balancing a better user experience against marginal resource use.
