According to data from the official HuggingFace website, as of September 11, 2025, Baidu's open-source Wenxin (ERNIE) thinking model, ERNIE-4.5-21B-A3B-Thinking, has claimed the top spot on HuggingFace's text-model trending list and ranks third among all models overall.
The model adopts a Mixture-of-Experts (MoE) architecture with 21 billion total parameters, of which 3 billion are activated per token. Trained with instruction fine-tuning and reinforcement learning, it supports a 128K context window, making it well suited to long-context and complex reasoning tasks.
The model shows notable improvements across a range of tasks, including logical reasoning, mathematics, science, coding, and text generation, and it also provides efficient tool-calling capabilities (see the sketch below).
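To illustrate what tool calling looks like in practice, the following is a minimal sketch using the generic Hugging Face Transformers chat-template API. The repository ID, the example get_weather function, and the assumption that this model's chat template accepts a tools argument are all illustrative rather than details confirmed by the announcement; the model card is the authoritative reference.

```python
# Hedged sketch: constructing a tool-calling prompt with the Transformers
# chat-template API. The repo ID and the get_weather tool are assumptions
# made for illustration only.
from transformers import AutoTokenizer

MODEL_ID = "baidu/ERNIE-4.5-21B-A3B-Thinking"  # assumed repository ID

def get_weather(city: str) -> str:
    """Get the current weather for a city.

    Args:
        city: Name of the city to look up.
    """
    return f"Sunny in {city}"  # placeholder implementation

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)

messages = [{"role": "user", "content": "What's the weather in Beijing?"}]

# Render a prompt that advertises the tool to the model; whether this
# particular chat template consumes `tools` this way is an assumption
# based on the generic Transformers API.
prompt = tokenizer.apply_chat_template(
    messages,
    tools=[get_weather],
    add_generation_prompt=True,
    tokenize=False,
)
print(prompt)
```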
The model is open-sourced under the Apache License 2.0, which permits commercial use, and is available on platforms such as HuggingFace and the Xinghe Community. Toolchains including FastDeploy, vLLM, and Transformers have already been adapted and now support it.
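Since Transformers is among the supported toolchains, a minimal load-and-generate sketch might look like the following; the repository ID and generation settings are assumptions for illustration, and vLLM or FastDeploy would typically be preferred for production serving.

```python
# Hedged sketch: loading and running the model with Hugging Face Transformers.
# The repo ID and generation parameters are assumptions, not confirmed values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "baidu/ERNIE-4.5-21B-A3B-Thinking"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,   # MoE: 21B total params, ~3B active per token
    device_map="auto",
    trust_remote_code=True,
)

messages = [
    {"role": "user", "content": "Prove that the sum of two even numbers is even."}
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Thinking-style models tend to emit long reasoning traces, so allow a
# generous new-token budget (the value here is an arbitrary example).
outputs = model.generate(inputs, max_new_tokens=2048)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```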