On April 12, MiniMax announced that its large-scale model M2.7 is now open source worldwide. The company has partnered with chip makers at home and abroad, including Huawei Ascend, Moore Threads, MetaX, Kunlunxin, and NVIDIA, as well as inference platforms such as Together AI, Fireworks, and Ollama. On the first day of release, the model was already adapted to a range of inference environments, supporting the growth of the global AI ecosystem.
