PanGu Team Addresses Plagiarism Claims: Model Independently Trained, Open-Source Code Properly Attributed
Author: Editorial Staff

On the afternoon of July 5, the PanGu Pro MoE technical development team released a statement responding to recent discussions in the open-source community and on various online platforms about the PanGu large model's open-source code. The statement clarifies that the open-source PanGu Pro MoE model was developed and trained entirely on the Ascend hardware platform and was not incrementally trained from any other vendor's model. According to the team, the model introduces substantial innovations in architecture design and technical features, making it the first large-scale mixture-of-experts (MoE) model of its kind designed for Ascend hardware. Its Mixture of Grouped Experts (MoGE) architecture alleviates the load-balancing problem in large-scale distributed training, markedly improving training efficiency.
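
The statement itself does not spell out how MoGE balances load, but the published description suggests partitioning experts into equal-sized groups and activating a fixed number of experts within each group, so every group (for example, one group per device) receives the same share of work. The sketch below illustrates that grouped top-k routing idea in PyTorch; the function name moge_route and all parameters are hypothetical and are not the team's actual implementation:

```python
import torch

def moge_route(logits: torch.Tensor, n_groups: int, k_per_group: int) -> torch.Tensor:
    """Grouped top-k routing sketch: select k experts from each group so
    every group handles an equal share of tokens (assumed MoGE-style behavior).

    logits: (n_tokens, n_experts) router scores; n_experts must divide evenly
    into n_groups. Returns a (n_tokens, n_experts) 0/1 mask of chosen experts.
    """
    n_tokens, n_experts = logits.shape
    group_size = n_experts // n_groups
    # View the experts as (groups, experts-per-group), then take top-k per group.
    grouped = logits.view(n_tokens, n_groups, group_size)
    topk_idx = grouped.topk(k_per_group, dim=-1).indices  # (tokens, groups, k)
    mask = torch.zeros_like(grouped)
    mask.scatter_(-1, topk_idx, 1.0)
    return mask.view(n_tokens, n_experts)

# Example: 64 experts in 8 groups, 1 activated per group. Each token activates
# exactly one expert in every group, so per-group load is balanced by construction.
scores = torch.randn(4, 64)
active = moge_route(scores, n_groups=8, k_per_group=1)
print(active.sum(dim=-1))  # tensor([8., 8., 8., 8.])
```

The design point, as described, is that balance comes from the routing constraint itself rather than from an auxiliary loss nudging an unconstrained router toward uniformity.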