On January 21, data from Hugging Face, the world's largest open-source AI community, showed that the number of derivative models built on Alibaba's Qwen had surpassed 200,000, making Qwen the first open-source large model in the world to reach that milestone. Cumulative downloads of the Qwen model series have also exceeded 1 billion, averaging roughly 1.1 million per day. That figure surpasses Meta's Llama and establishes Qwen as the most widely adopted open-source large model globally.
