A research team from Tsinghua University has proposed the "Density Law," a concept featured on the cover of Nature Machine Intelligence that charts a new trajectory for the development of large-scale models.
The research reveals that the capability density of large models, defined as the intelligence delivered per unit of parameters, has been growing exponentially: from February 2023 to April 2025, it doubled roughly every 3.5 months. This suggests that improving model performance no longer hinges on simply adding parameters; instead, efficient development can be achieved by optimizing algorithms, improving data utilization, and refining training techniques.
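A 3.5-month doubling period compounds into a large cumulative gain over the roughly 26-month study window. A minimal back-of-the-envelope sketch in Python (the doubling period and date range come from the article; the 26-month window and the resulting growth factor are my own approximation, not figures reported by the team):

```python
# Figures from the article: capability density doubled every 3.5 months
# between February 2023 and April 2025 (~26 months).
doubling_period_months = 3.5
window_months = 26  # Feb 2023 -> Apr 2025, approximate

doublings = window_months / doubling_period_months
growth_factor = 2 ** doublings

print(f"~{doublings:.1f} doublings -> ~{growth_factor:.0f}x capability density")
```

Under these assumptions, capability density would have grown by more than two orders of magnitude over the window.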
After analyzing 51 open-source models, the team found that, under the Density Law, the parameter counts and inference costs of models with equivalent performance have kept decreasing. For example, the API price of GPT-3.5-level models fell by a factor of 266.7 over a span of 20 months.
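The reported 266.7x price drop over 20 months can be converted into an implied price-halving period, a rough consistency check against the 3.5-month capability-density doubling. The figures below are from the article; the conversion itself is my own arithmetic, not a calculation the team reports:

```python
import math

# Figures from the article: GPT-3.5-level API price fell 266.7x in 20 months.
price_drop_factor = 266.7
window_months = 20

halvings = math.log2(price_drop_factor)          # number of price halvings
halving_period_months = window_months / halvings  # implied halving period

print(f"~{halvings:.1f} halvings -> price halves every "
      f"~{halving_period_months:.1f} months")
```

Under these assumptions, the price halved roughly every 2.5 months, i.e. prices fell even faster than capability density rose.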
Guided by this theory, the team has developed high-density models suited to edge devices, such as MiniCPM. These models have been applied in fields including mobile phones, automobiles, and smart homes, and as of October 2025 they had been downloaded nearly 15 million times worldwide.
This discovery signals the imminent arrival of the edge-intelligence era. As chip computing power and model density improve in tandem, AI is set to move from the cloud to end-user devices, driving its widespread adoption.
