OpenAI Diversifies AI Chip Usage, Integrates Google TPUs for the First Time
Author: Editor

OpenAI has recently begun renting Google's AI chips, known as Tensor Processing Units (TPUs), to help power services such as ChatGPT. This marks a significant shift for a company that had previously relied almost exclusively on NVIDIA's Graphics Processing Units (GPUs). The objective is twofold: to reduce inference costs and to lessen dependence on Microsoft's data centers. Notably, Google has not offered its most powerful TPU version for this partnership, yet the move is still seen as a surprising collaboration between two prominent players in the AI industry. Meanwhile, Google is actively promoting its TPUs in an effort to attract new clients, including Apple and Anthropic.