On April 29, Alibaba's Tongyi Qianwen (Qwen) team released and open-sourced eight new models in the Qwen3 series of hybrid reasoning models. The release includes two Mixture-of-Experts (MoE) models: Qwen3-235B-A22B, with 235 billion total parameters and 22 billion active, and Qwen3-30B-A3B, with 30 billion total parameters and 3 billion active. It also includes six dense models: Qwen3-32B, Qwen3-14B, Qwen3-8B, Qwen3-4B, Qwen3-1.7B, and Qwen3-0.6B. Together, these models deliver notable gains in both performance and cost efficiency, marking a milestone for Alibaba's open-source model efforts.
