The domestic large-scale model MiniMax M2.5 has claimed the top spot globally in invocation volume for five consecutive weeks. At the company's headquarters, the product R&D team highlighted the significant price gap between their model and its overseas counterparts: overseas models with comparable capabilities can be priced more than ten times higher, making cost-effectiveness the primary draw for global users of domestic large-scale models. This cost advantage rests on two factors. First, technological innovation has cut inference costs through improvements to the underlying architecture, allowing the same tasks to be accomplished with fewer tokens. Second, energy efficiency plays a pivotal role: electricity expenses account for 70% to 80% of computing power costs, and for AI clusters that rely on large-scale parallel computing, differences in electricity prices can substantially change annual operating expenses. China's reliable energy supply and comparatively low electricity costs give the AI sector a more competitive cost structure.
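To make the electricity-price point concrete, here is a back-of-the-envelope sketch of how a cluster's annual power bill scales with the grid price. All of the figures (a 10 MW cluster, 80% average utilization, the two per-kWh prices) are illustrative assumptions, not numbers from the article.

```python
# Hypothetical illustration: how electricity price alone shifts the
# annual operating cost of an AI cluster. Every input below is an
# assumed example value, not data from the report.

def annual_electricity_cost(power_mw: float, utilization: float,
                            price_per_kwh: float) -> float:
    """Annual electricity cost (currency units) for a cluster drawing
    `power_mw` megawatts at the given average utilization."""
    hours_per_year = 24 * 365            # 8,760 hours
    kwh = power_mw * 1000 * hours_per_year * utilization
    return kwh * price_per_kwh

# Assumed 10 MW cluster running at 80% average utilization.
cost_low = annual_electricity_cost(10, 0.8, 0.06)   # cheap grid, ~0.06/kWh
cost_high = annual_electricity_cost(10, 0.8, 0.15)  # expensive grid, ~0.15/kWh

print(f"low-price grid:  {cost_low:,.0f}/year")    # → 4,204,800/year
print(f"high-price grid: {cost_high:,.0f}/year")   # → 10,512,000/year
print(f"difference:      {cost_high - cost_low:,.0f}/year")
```

Under these assumed inputs, the same cluster costs roughly 2.5 times more to power on the expensive grid, which is why electricity price feeds so directly into the overall cost of serving model invocations.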
