On October 4, Alibaba Cloud's Tongyi Qianwen (Qwen) team announced the open-sourcing of the Qwen3-VL-30B-A3B-Instruct and Qwen3-VL-30B-A3B-Thinking models. Alongside these, the team also released FP8 versions of both models, as well as an FP8 version of its ultra-large-scale model, Qwen3-VL-235B-A22B.
Despite their relatively compact size, the Qwen3-VL-30B-A3B-Instruct and Thinking models pack a powerful punch. With just 3 billion active parameters, they can match or even outperform models such as GPT-5-Mini and Claude 4 Sonnet across a range of domains, including STEM (science, technology, engineering, and mathematics), visual question answering, optical character recognition (OCR), video understanding, and agent tasks.
What's more, these models are readily accessible: they can be downloaded free of charge from the ModelScope community and Hugging Face. They are also available on Qwen Chat from day one, ensuring broad accessibility and ease of use for developers and researchers alike.