China Unveils Its Pioneering Brain-Inspired Spiking Large Model, 'Shunxi 1.0'
4 days ago / Reading time: about 1 minute
Author: Site Editor

Recently, a collaboration between the research teams of Li Guoqi and Xu Bo at the Institute of Automation, Chinese Academy of Sciences, and MetaX has yielded notable results. The partners have developed a brain-inspired spiking large model named 'Shunxi 1.0' (SpikingBrain-1.0) and completed the entire training and inference pipeline on a domestic GPU computing platform of several thousand cards.

Rooted in the theory of 'endogenous complexity' and inspired by the firing mechanisms of biological neurons, the model represents a substantial advance in the efficiency and speed of large models, particularly for ultra-long-sequence reasoning.
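The report does not disclose SpikingBrain's internal neuron model, but the event-driven principle it alludes to is commonly illustrated with a leaky integrate-and-fire (LIF) neuron: the membrane potential leaks over time, integrates incoming current, and emits a binary spike only when a threshold is crossed, so computation happens sparsely rather than densely. The sketch below is a generic textbook LIF simulation, not code from the SpikingBrain project; all parameter names (`tau`, `v_th`, `v_reset`) are illustrative assumptions.

```python
def lif_spikes(inputs, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron over a sequence of input
    currents. Returns a binary spike train (1 = spike emitted that step).

    Generic illustration only -- not the SpikingBrain neuron model."""
    v = 0.0            # membrane potential
    spikes = []
    for i in inputs:
        v += dt / tau * (-v) + i   # leak toward rest, then integrate input
        if v >= v_th:
            spikes.append(1)       # threshold crossed: emit a spike
            v = v_reset            # reset membrane potential
        else:
            spikes.append(0)       # stay silent, keep accumulating
    return spikes

# A weak constant input only spikes occasionally -- activity is sparse.
print(lif_spikes([0.6, 0.6, 0.6, 0.0, 0.0]))
```

Because most time steps produce no spike, downstream layers in a spiking network only process events, which is the source of the efficiency gains such architectures claim over dense Transformer computation.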

The research team has also open-sourced the SpikingBrain-1.0-7B model, launched a public test site for the larger SpikingBrain-1.0-76B variant, and published detailed technical reports in both Chinese and English.

The model can be trained efficiently on comparatively small data volumes, delivers a marked improvement in reasoning efficiency, and lays the groundwork for a domestically controlled ecosystem of brain-inspired large models. It offers a technical pathway for next-generation artificial intelligence that is distinct from the Transformer architecture.
