On January 14, 2026, ModelBest, together with the Natural Language Processing Lab at Tsinghua University, Renmin University of China, and the OpenBMB open-source community, officially released AgentCPM-Explore, a 4-billion-parameter agent model. The model achieves state-of-the-art (SOTA) performance among models of comparable scale on multiple benchmarks, including GAIA and HLE: it not only surpasses several 8-billion-parameter models but also rivals models with more than 30 billion parameters and even some closed-source systems. On the Xbench-DeepResearch benchmark, it outperforms both OpenAI-o3 and Claude-4.5-Sonnet. The model is fully open-source, with its code available on GitHub, enabling end-to-end reproduction of training from the base model to the SOTA model and simplifying the deployment of edge-side agents for long-horizon tasks.
