Exclusive Launch of Tencent Hunyuan's Premier Open-Source Hybrid Inference MoE Model on ModelScope
Author: Editor

On June 27th, Tencent Hunyuan's hybrid inference Mixture of Experts (MoE) model, Hunyuan-A13B, debuted exclusively on the ModelScope community. The model has 80 billion total parameters but activates only 13 billion per inference, making it the industry's first open-source hybrid inference MoE model at this scale. ModelScope, China's largest open-source model community, has hosted the exclusive releases of numerous leading open-source models from across the industry. As of June 2025, ModelScope offers more than 70,000 models spanning domains including large language models (LLMs), dialogue systems, speech recognition, text-to-image generation, image-to-video synthesis, and AI music composition. The community also supports more than 4,000 Model Context Protocol (MCP) services and serves over 16 million developers worldwide.