Recent reports indicate that OpenAI is grappling with escalating inference costs. Having burned through cash at a rate arguably unprecedented for a startup, the company may not be able to sustain the cost of running its large language models on its current revenue.
According to the documents in question, OpenAI is incurring substantial model inference costs on Microsoft's Azure cloud platform. In the first half of 2025 alone, these expenditures approached a staggering US$5 billion, whereas the 'cost of revenue' reported by the media for the same period was only about US$2.5 billion. This reveals a cash-burning rate roughly double the publicly disclosed figure.
Furthermore, the documents disclose that OpenAI hands over 20% of its revenue to its major investor, Microsoft. Working backwards from Microsoft's share, OpenAI's actual revenue appears to be significantly lower than previously reported: the revenue share Microsoft received in the first half of 2025 implies that OpenAI's revenue was approximately US$2.27 billion, in stark contrast to the US$4.3 billion figure reported by the media.
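The back-calculation above is simple division on the 20% share rate. A minimal sketch follows; note that the payout figure used below is an assumption inferred from the article's US$2.27 billion implied-revenue estimate, as the documents' exact payout number is not quoted here:

```python
# Back-calculating OpenAI's implied revenue from Microsoft's 20% revenue share.
# The payout figure is an assumption, inferred from the article's implied
# revenue of ~US$2.27B; it is not stated directly in this text.

REVENUE_SHARE_RATE = 0.20       # Microsoft's contractual cut of OpenAI revenue
microsoft_share_bn = 0.454      # assumed H1 2025 payout to Microsoft, US$ billions

implied_revenue_bn = microsoft_share_bn / REVENUE_SHARE_RATE
print(f"Implied H1 2025 revenue: ~US${implied_revenue_bn:.2f}B")

reported_revenue_bn = 4.3       # figure reported by the media
gap_bn = reported_revenue_bn - implied_revenue_bn
print(f"Shortfall vs. reported figure: ~US${gap_bn:.2f}B")
```

Any payout number divided by 0.20 yields the implied revenue, so the sketch works for whatever figure the underlying documents actually contain.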
Moreover, from the first quarter of 2024 to the third quarter of 2025, OpenAI's inference spending on Azure surpassed US$12.4 billion, while its revenue growth over the same period lagged far behind the growth in costs. These figures point to a substantial gap between OpenAI's operational costs and its revenue, raising serious doubts about the sustainability of its business model.
