Huawei Responds to Plagiarism Claims Over Pangu Model
Following accusations of plagiarism related to its Pangu large language model, **Huawei** has issued a statement defending its development process and intellectual property practices.
Denying Incremental Training
**Huawei’s** Noah’s Ark Laboratory stated that the Pangu Pro MoE open-source model was built and trained on the Shengteng hardware platform. The company explicitly denies basing the model on incremental training from other manufacturers’ existing models.
Addressing Code Similarities
Acknowledging that the Pangu Pro MoE model incorporates elements from open-source resources, the **Pangu** team emphasized its adherence to open-source licensing. According to the statement, “The code implementation of some basic components of the Pangu Pro MoE open-source model follows industry open-source practice and involves some open-source code from other open-source models. We strictly follow the requirements of the open-source licenses and clearly mark the copyright notices of the open-source code in the open-source code files.”
Innovation in Architecture
According to **Huawei**, the Pangu Pro MoE model introduces key innovations, most notably the grouped mixture-of-experts (MoGE) architecture, designed specifically for the Shengteng hardware platform. The architecture aims to improve training efficiency by solving the load-balancing problem that arises in large-scale distributed training.
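Huawei’s statement describes the load-balancing idea only at a high level, and its routing code is not reproduced here. The sketch below is a minimal, hypothetical Python illustration of the principle as described: partition the experts into equal-sized groups and activate the same number of experts in every group, so that when groups map one-to-one onto devices, each device handles the same number of activated experts per token by construction. The function name `moge_route`, the tensor shapes, and the softmax normalization over selected scores are assumptions for illustration, not Huawei’s implementation.

```python
import numpy as np

def moge_route(logits, num_groups, k_per_group):
    """Hypothetical grouped expert routing: top-k selection within each group.

    logits: (num_tokens, num_experts) router scores; num_experts must be
    divisible by num_groups. Activating exactly k_per_group experts in
    every group keeps per-group (and hence per-device) load equal by
    construction, regardless of the score distribution.
    """
    num_tokens, num_experts = logits.shape
    experts_per_group = num_experts // num_groups
    grouped = logits.reshape(num_tokens, num_groups, experts_per_group)
    # local indices of the k highest-scoring experts inside each group
    topk_local = np.argpartition(-grouped, k_per_group - 1, axis=-1)[..., :k_per_group]
    # map local (within-group) indices back to global expert ids
    offsets = (np.arange(num_groups) * experts_per_group)[None, :, None]
    topk_global = (topk_local + offsets).reshape(num_tokens, -1)
    # softmax over the selected experts' scores to get mixture weights
    scores = np.take_along_axis(logits, topk_global, axis=1)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return topk_global, weights

# Example: 64 experts in 8 groups, 1 expert activated per group
rng = np.random.default_rng(0)
ids, w = moge_route(rng.standard_normal((4, 64)), num_groups=8, k_per_group=1)
print(ids.shape, w.shape)  # (4, 8) (4, 8)
```

Because every token activates exactly `k_per_group` experts in each group, the per-group load is identical no matter how the router scores are distributed, which is the structural load-balancing property the statement attributes to MoGE.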
Huawei’s Stance on Open Source
The team further added, “This is not only a common practice in the open-source community, but also in line with the spirit of open-source collaboration advocated by the industry.”
**Huawei** asserts it respects third-party intellectual property rights and promotes inclusivity, fairness, openness, unity, and sustainability in open-source development.
Full Statement from Huawei
We have noticed the recent discussions in open-source communities and on online platforms regarding the open-source code of the Pangu large model.
The Pangu Pro MoE open-source model is a foundation model developed and trained on the Shengteng hardware platform; it is not based on incremental training of other manufacturers’ models. It makes key innovations in architecture design, technical characteristics, and other areas, and is the world’s first mixture-of-experts model of its specification class designed for the Shengteng hardware platform. It innovatively proposes the grouped mixture-of-experts (MoGE) architecture, effectively solving the load-balancing problem of large-scale distributed training and improving training efficiency. For other technological innovations, please refer to the disclosures in the Shengteng Ecological Competitiveness Series technical reports.
The code implementation of some basic components of the Pangu Pro MoE open-source model follows industry open-source practice and involves some open-source code from other open-source models. We strictly follow the requirements of the open-source licenses and clearly mark the copyright notices of the open-source code in the open-source code files. This is not only a common practice in the open-source community, but also in line with the spirit of open-source collaboration advocated by the industry. We always adhere to openness and innovation, respect third-party intellectual property rights, and advocate an open-source culture of inclusiveness, fairness, openness, unity, and sustainability.
We thank developers and partners around the world for their attention to and support of the Pangu large model, and we attach great importance to the constructive opinions of the open-source community. By open-sourcing the Pangu large model, we hope to work with like-minded partners to explore and continuously optimize the model’s capabilities, and to accelerate technological breakthroughs and industrial adoption.
We welcome and look forward to in-depth, professional exchanges on technical details in the Ascend Tribe open-source community.
– Pangu Pro MoE Technical Development Team
**Huawei** encourages developers and partners to engage in technical discussions within the Ascend Tribe open-source community to refine model capabilities and promote advancements.