Qualcomm Unveils AI200 and AI250, Elevating AI Inference Solutions for Data Centers
2025-10-28

According to Qualcomm's official announcement, the company has introduced a new generation of AI inference solutions for data centers, spanning accelerator cards and full rack systems, both built on the new AI200 and AI250 chips.

The AI200 is designed to minimize total cost of ownership: a single card supports 768GB of LPDDR memory, giving it the capacity and memory economics to run inference for large language models and multimodal models efficiently, as the sizing sketch below illustrates.
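To see why per-card capacity matters, consider a rough sizing exercise. Only the 768GB figure comes from the announcement; the model sizes and precisions below are illustrative assumptions, and the sketch counts weights only (KV cache and activations need additional headroom).

```python
# Back-of-the-envelope: which models fit in 768 GB of card memory?
# Model sizes and precisions are illustrative, not Qualcomm figures.

CARD_MEMORY_GB = 768  # per-card LPDDR capacity from the announcement

def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Memory needed for model weights alone, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

for params_b, dtype, bpp in [(70, "FP16", 2), (405, "FP16", 2), (400, "INT4", 0.5)]:
    need = weights_gb(params_b, bpp)
    verdict = "fits on" if need < CARD_MEMORY_GB else "exceeds"
    print(f"{params_b}B params @ {dtype}: {need:.0f} GB of weights "
          f"-> {verdict} one {CARD_MEMORY_GB}GB card")
```

Under these assumptions, a 70B-parameter model at FP16 (140 GB) or a 400B model quantized to INT4 (200 GB) fits on a single card, while a 405B model at FP16 (810 GB) would not.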

The AI250, by contrast, adopts a near-memory computing architecture. Qualcomm says this design delivers more than ten times the effective memory bandwidth of conventional solutions while consuming less power.
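Memory bandwidth matters because autoregressive LLM decoding is typically bandwidth-bound: each generated token must stream roughly all of the model's weights from memory once. A minimal sketch of that relationship follows; the absolute bandwidth numbers are placeholders, since Qualcomm quotes only a relative ">10x" gain.

```python
# Why memory bandwidth bounds LLM decoding speed: per generated token,
# the weights are read from memory about once, so
#   tokens/s <= bandwidth / weight_bytes.
# Bandwidth values below are assumed for illustration only.

def decode_tokens_per_sec(weight_bytes: float, bandwidth_gb_s: float) -> float:
    """Upper bound on single-stream decode rate for a bandwidth-bound model."""
    return bandwidth_gb_s * 1e9 / weight_bytes

weight_bytes = 140e9        # e.g. a 70B-parameter model at FP16 (2 bytes/param)
baseline_bw = 500           # GB/s, assumed conventional-card bandwidth
near_memory_bw = baseline_bw * 10  # the announced ">10x" uplift

print(f"baseline   : {decode_tokens_per_sec(weight_bytes, baseline_bw):.1f} tokens/s")
print(f"near-memory: {decode_tokens_per_sec(weight_bytes, near_memory_bw):.1f} tokens/s")
```

With these assumed figures, the same model's single-stream decode ceiling rises from roughly 3.6 to 36 tokens per second, which is why a bandwidth-focused architecture can matter more for inference than raw compute.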

Both the AI200 and AI250 ship with direct liquid cooling for thermal management and efficiency, PCIe and Ethernet options for expansion and scalability in data center environments, and confidential computing capabilities to protect sensitive data. A single rack draws 160kW.
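For a sense of scale, a quick energy calculation follows. Only the 160kW rack figure is from the announcement; the utilization and electricity price are illustrative assumptions.

```python
# Rough operating-energy math for a 160 kW rack running continuously.
# Electricity price and 100% utilization are assumptions, not source data.

RACK_KW = 160
HOURS_PER_YEAR = 24 * 365     # 8,760 hours
PRICE_PER_KWH = 0.10          # USD, assumed industrial rate

annual_kwh = RACK_KW * HOURS_PER_YEAR        # ~1.4 million kWh
annual_cost = annual_kwh * PRICE_PER_KWH

print(f"annual energy    : {annual_kwh / 1e6:.2f} GWh")
print(f"annual power cost: ${annual_cost:,.0f}")
```

Run continuously under these assumptions, one rack consumes about 1.4 GWh per year, roughly $140,000 in electricity, which is why the liquid cooling and efficiency claims are central to the total-cost-of-ownership pitch.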

Looking ahead, the AI200 is expected to be commercially available in 2026 and the AI250 in 2027, bringing a new wave of inference-focused competition to the data center industry.