Recently, a research team from the Department of Computer Science and Technology at Tsinghua University, led by Professor Sun Maosong and including Associate Professor Liu Zhiyuan and Assistant Researcher Han Xu, together with the OpenBMB open-source community, put forward the 'Density Law' for large models. According to this law, between February 2023 and April 2025 the peak 'capability density' of large models roughly doubled every 3.5 months. In simpler terms, every 3.5 months a model with only half the parameters of its predecessor could reach the same level of optimal performance.

To validate this, the research team developed an evaluation framework for relative 'capability density'. Analyzing 51 open-source large models, they found that the capability density of these models grows exponentially over time. Drawing on the Density Law, the team made several predictions, including a continued decline in inference costs for models of comparable capability. The team has also released high-capability-density models tailored for edge devices, exemplified by MiniCPM, with corresponding research findings published in journals such as Nature Machine Intelligence.
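The doubling claim implies a simple exponential relationship: if capability density doubles every 3.5 months, the parameter count needed to match a fixed capability halves over the same period. The back-of-the-envelope sketch below illustrates this arithmetic; the function name and the 14B-parameter baseline are illustrative assumptions, not figures from the study.

```python
def equivalent_params(baseline_params: float, months_elapsed: float,
                      doubling_period_months: float = 3.5) -> float:
    """Parameters needed to match a baseline model's capability,
    assuming capability density doubles every `doubling_period_months`
    months (the Density Law's stated rate)."""
    return baseline_params / (2 ** (months_elapsed / doubling_period_months))

# Illustrative baseline: a 14B-parameter model today.
# After 7 months (two doubling periods), a ~3.5B-parameter model
# should match it under this law.
print(equivalent_params(14e9, 7.0))   # 3500000000.0
print(equivalent_params(14e9, 14.0))  # 875000000.0
```

Under these assumptions, four doubling periods (about 14 months) would shrink the required parameter count by a factor of 16, which is the kind of trend the team cites when predicting falling inference costs for comparable capability.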
