Tencent Hunyuan Unveils Its Premier Open-Source Mixed Inference Model, Excelling in Agent Tool Invocation and Long Text Understanding
Author: Editorial Staff

Tencent Hunyuan has officially open-sourced Hunyuan-A13B, a mixed inference Mixture-of-Experts (MoE) model. With 80 billion total parameters but only 13 billion activated per inference, the model matches the performance of leading open-source models while delivering faster inference and better cost-effectiveness. Hunyuan-A13B is available on open-source platforms such as GitHub and Hugging Face, and supports seamless deployment through the Tencent Cloud API. As the industry's first 13B-level open-source mixed inference MoE model, Hunyuan-A13B marks a significant technological step forward for Tencent in AI.
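To illustrate why an MoE model can hold 80B parameters yet activate only 13B per token, here is a minimal sketch of sparse top-k expert routing. This is a generic illustration of the MoE idea, not Hunyuan-A13B's actual architecture: the expert count, top-k value, and dimensions below are hypothetical.

```python
import numpy as np

# Illustrative MoE routing sketch (NOT Hunyuan-A13B's real router).
# n_experts, top_k, and d_model are made-up values for demonstration.
rng = np.random.default_rng(0)

n_experts = 8   # hypothetical total number of expert networks
top_k = 2       # only the top_k best-scoring experts run per token
d_model = 16    # hypothetical hidden dimension

def route(x, gate_w):
    """Score all experts for one token, keep the top_k, and
    return their indices plus softmax-normalized mixing weights."""
    logits = x @ gate_w                   # one gating score per expert
    chosen = np.argsort(logits)[-top_k:]  # indices of the top_k experts
    weights = np.exp(logits[chosen] - logits[chosen].max())
    weights /= weights.sum()              # softmax over the chosen experts
    return chosen, weights

x = rng.standard_normal(d_model)                    # a token's hidden state
gate_w = rng.standard_normal((d_model, n_experts))  # gating projection
chosen, weights = route(x, gate_w)
print(chosen, weights)
```

Because only `top_k` of the `n_experts` expert networks execute for each token, the parameters that actually run per forward pass are a small fraction of the total, which is the principle behind a model with 80B total but only 13B activated parameters.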