On June 30, Baidu announced the official open sourcing of the Wenxin (ERNIE) 4.5 series, comprising 10 models: Mixture-of-Experts (MoE) models with 47 billion and 3 billion activated parameters, as well as a dense model with 0.3 billion parameters. The pre-trained weights and inference code have been made fully accessible to the public. Users can now download and deploy these models from platforms such as PaddlePaddle's Xinghe community and HuggingFace, and can also access them through API services on Baidu Intelligent Cloud's Qianfan Large Model Platform.