Huawei's Pangu Large Model Similarity to Alibaba's Qwen: Official Clarification
Author: Editor

On June 30, 2025, Huawei officially open-sourced its Pangu 7B dense model and the Pangu Pro MoE 72B mixture-of-experts model, along with model inference technology built on its Ascend platform. Shortly thereafter, a study published by @HonestAGI on GitHub drew industry attention by reporting a notable similarity in parameter structure between Huawei's Pangu Pro MoE model and Alibaba's Tongyi Qianwen Qwen-2.5 14B model.

In response to the ensuing discussion, Huawei's Noah's Ark Lab issued a statement on July 5 clarifying that Pangu Pro MoE is a foundational large model developed and trained exclusively on the Ascend hardware platform, and was not incrementally trained on or otherwise derived from any other vendor's model. The statement also acknowledged that the code implementation of certain fundamental components drew on industry-standard open-source practices.