Google Launches Dedicated Chips for AI Training and Inference, Challenging NVIDIA Once Again
Author: Site Editor

For years, Google's Tensor Processing Units (TPUs) have handled both AI model training and inference on the same chip. The company has now announced that it will split these workloads across dedicated processors, introducing two products in its eighth-generation TPU line: the TPU 8t, optimized for training, and the TPU 8i, optimized for inference. Both are expected to launch officially later this year. According to Google, the TPU 8t delivers a 2.8x performance improvement over its predecessor at the same price, while the TPU 8i delivers an 80% boost. The move is Google's latest bid to challenge NVIDIA's dominance in the AI hardware market.