
Huawei’s Noah’s Ark Lab asserted that the Pangu Pro MoE model was independently developed with original architectural innovations, trained entirely on its Ascend AI chips, and built using open-source resources in compliance with their licenses, without relying on rival models.
Huawei’s AI research arm, Noah’s Ark Lab, has dismissed allegations that its Pangu Pro MoE large language model (LLM) borrowed from Alibaba’s Qwen 2.5-14B model. The denial came in response to a report published on July 4 by an anonymous group named HonestAGI on GitHub, which claimed that Huawei’s model exhibited unusually high correlation with Alibaba’s, suggesting potential reuse or "upcycling."
In a formal statement issued on July 5, Noah’s Ark Lab asserted that Pangu Pro MoE was independently developed and trained. It emphasized that the model was not created through incremental training on other companies’ models and claimed it features original architectural and technical innovations.
Huawei also noted that Pangu Pro MoE is the first large-scale model fully trained on its proprietary Ascend AI chips. While acknowledging the use of some open-source resources, the company stressed that its team complied with all relevant licensing requirements, though it did not disclose which models or codebases were referenced.
The HonestAGI paper, which rapidly gained attention across AI forums and Chinese tech media, accused Huawei of misleading technical reporting and inflating its R&D efforts. The authors alleged that Huawei's technical documentation might include fabricated claims and could potentially violate copyright laws. Attempts to contact HonestAGI for verification have been unsuccessful, and its organizational identity remains unclear.
China’s AI race intensifies
Alibaba, whose Qwen 2.5-14B model is designed for lightweight deployments on PCs and smartphones, has not yet commented on the matter.
Huawei’s Pangu models, first launched in 2021, are generally tailored for enterprise use in sectors such as government, finance, and manufacturing. The Pangu Pro MoE models were open-sourced in late June on the Chinese platform GitCode, a move aimed at expanding developer engagement.
The broader context includes intensifying competition in China’s AI landscape. In early 2025, the release of DeepSeek’s low-cost open-source R1 model prompted tech giants to accelerate innovation and broaden public access to their AI offerings.
While Huawei has historically lagged in mainstream AI adoption, it is now seeking to reassert itself by positioning Pangu as a scalable, enterprise-grade alternative in the competitive LLM market.