The International Conference on Learning Representations (ICLR) is one of the leading international conferences in machine learning and artificial intelligence, focusing on advances in deep learning, representation learning, and their cross-disciplinary applications. ICLR 2026 was held in Rio de Janeiro, Brazil, from April 23 to 27, 2026, and drew researchers from around the world. The conference received close to 19,000 submissions, with an acceptance rate of roughly 28.18%, a record low in recent years.

Among the accepted work, 12 papers came from faculty and students of the School of Artificial Intelligence. In addition, a paper from the team of Professor Kang Chongqing and Associate Professor Zhang Ning in the Department of Electrical Engineering became the department's first paper ever accepted at ICLR. Google also presented its new TurboQuant compression algorithm, designed to reduce the memory footprint of key-value (KV) caches in large language models.
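TurboQuant's actual method is not described above; as a minimal sketch of the general idea behind KV-cache compression, the toy code below applies per-channel symmetric int8 quantization to a cache tensor, cutting its memory use fourfold versus float32. All names (`quantize_kv`, `dequantize_kv`) and the quantization scheme are illustrative assumptions, not TurboQuant's algorithm.

```python
import numpy as np

def quantize_kv(x: np.ndarray, bits: int = 8):
    """Per-channel symmetric quantization of a KV-cache tensor.

    x has shape (tokens, channels); each channel gets its own scale
    so the integer codes cover that channel's dynamic range.
    (Illustrative scheme only, not TurboQuant's actual method.)
    """
    qmax = 2 ** (bits - 1) - 1                        # 127 for int8
    scale = np.abs(x).max(axis=0, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)          # guard all-zero channels
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize_kv(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    # Reconstruct an approximation of the original float tensor.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
kv = rng.normal(size=(512, 64)).astype(np.float32)    # toy cache: 512 tokens, 64 channels
q, s = quantize_kv(kv)
recon = dequantize_kv(q, s)

print(q.nbytes, kv.nbytes)   # int8 codes take 1/4 the memory of float32
print(float(np.abs(kv - recon).max()))  # small per-entry reconstruction error
```

Per-channel scaling is the usual design choice here because different attention channels can have very different magnitudes; a single global scale would waste precision on the small-magnitude channels.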
