Xiaohongshu Open-Sources Its First Large Language Model: dots.llm1
Author: Editor

Xiaohongshu's hi lab team has open-sourced its first large language model, dots.llm1. It is a large-scale Mixture of Experts (MoE) model with 142 billion total parameters, of which 14 billion are active for any given token. Trained on 11.2 trillion high-quality tokens, dots.llm1 delivers performance comparable to Qwen2.5-72B.
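The gap between 142B total and 14B active parameters comes from MoE routing: each layer holds many expert sub-networks, but a router activates only a few of them per token. A minimal NumPy sketch of top-k routing (toy dimensions and a generic router, not dots.llm1's actual configuration) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only -- far smaller than the real model.
d_model, n_experts, top_k = 8, 16, 2

# Each "expert" here is just one weight matrix; the router scores experts per token.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route a token vector to its top-k experts and mix their outputs."""
    logits = x @ router_w
    top = np.argsort(logits)[-top_k:]                        # k highest-scoring experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over the winners
    # Only the selected experts' weights are ever multiplied for this token.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

out = moe_forward(rng.standard_normal(d_model))

total_params = n_experts * d_model * d_model
active_params = top_k * d_model * d_model
print(f"total={total_params}, active per token={active_params}")
```

Because only `top_k` of the `n_experts` experts run per token, compute cost scales with the active parameter count, which is how a 142B-parameter model can have the inference cost of a much smaller dense one.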