OpenAI is set to produce its own artificial intelligence chip for the first time next year, as the ChatGPT maker attempts to address insatiable demand for computing power and reduce its reliance on chip giant Nvidia.
The chip, co-designed with US semiconductor giant Broadcom, would ship next year, according to multiple people familiar with the partnership.
Broadcom’s chief executive Hock Tan on Thursday referred to a mysterious new customer that had committed to $10 billion in orders.
OpenAI’s move follows the strategy of tech giants such as Google, Amazon and Meta, which have designed their own specialised chips to run AI workloads. The industry has seen huge demand for the computing power to train and run AI models.
OpenAI planned to put the chip to use internally, according to one person close to the project, rather than make it available to external customers.
Last year it began an initial collaboration with Broadcom, according to reports at the time, but the timeline for mass-producing a finished chip design had remained unclear.
On a call with analysts, Tan announced that Broadcom had secured a fourth major customer for its custom AI chip business, as it reported earnings that topped Wall Street estimates.
Broadcom does not disclose the names of these customers, but people familiar with the matter confirmed OpenAI was the new client. Broadcom and OpenAI declined to comment.
Tan said the deal had lifted the company’s growth prospects by bringing “immediate and fairly substantial demand,” with chips shipping “pretty strongly” to that customer from next year.
The prospect that custom AI chips will take a growing share of the booming AI infrastructure market has helped propel Broadcom’s shares more than 30 percent higher this year. They rallied almost 9 percent in pre-market trading in New York on Friday.
HSBC analysts recently said they expect Broadcom’s custom chip business to grow considerably faster than Nvidia’s chip business in 2026.
Nvidia continues to dominate the AI hardware space, with the Big Tech “hyperscalers” still representing a significant share of its customer base. But its growth has slowed relative to the astronomical figures it saw at the start of the AI investment boom.
OpenAI chief executive Sam Altman has been vocal about the demand for more computing power to serve the number of businesses and consumers using products such as ChatGPT, as well as to train and run AI models.
The company was one of the earliest customers for Nvidia’s AI chips and has since proven to be a voracious consumer of its hardware.
Last month, Altman said the company was prioritising compute “in light of the increased demand from [OpenAI’s latest model] GPT-5” and planned to double its compute fleet “over the next 5 months.”
© 2025 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.