Meta researchers have open-sourced AU-Net, an autoregressive U-Net architecture that rethinks how conventional language models tokenize and process text. Instead of relying on a fixed tokenizer, AU-Net learns directly from raw bytes and progressively pools them into words, word pairs, and multi-word phrases, building a multi-scale representation of the input sequence. This design gives language models greater efficiency and flexibility in how they process text.
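
To make the multi-scale idea concrete, here is a minimal sketch of how byte-level states might be grouped into coarser stages at word, word-pair, and phrase boundaries. This is not Meta's released implementation; the pooling rule (selecting the state at each boundary byte) and all names (`byte_embeddings`, `boundary_mask`, `pool_stage`) are illustrative assumptions.

```python
import numpy as np

def byte_embeddings(text: str, dim: int = 8, seed: int = 0) -> np.ndarray:
    """Stand-in for a learned byte-embedding table: one vector per UTF-8 byte."""
    rng = np.random.default_rng(seed)
    table = rng.normal(size=(256, dim))          # 256 possible byte values
    return table[list(text.encode("utf-8"))]     # shape: (num_bytes, dim)

def boundary_mask(text: str, words_per_group: int) -> np.ndarray:
    """Mark the byte positions that close every `words_per_group`-th word."""
    data = text.encode("utf-8")
    mask = np.zeros(len(data), dtype=bool)
    word_idx = 0
    for i, b in enumerate(data):
        at_word_end = (b == ord(" ")) or (i == len(data) - 1)
        if at_word_end:
            word_idx += 1
            if word_idx % words_per_group == 0:
                mask[i] = True
    return mask

def pool_stage(states: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Keep only the states at boundary positions, yielding a shorter sequence."""
    return states[mask]

text = "autoregressive u nets pool bytes into words and phrases"
h0 = byte_embeddings(text)                    # stage 0: one state per byte
h1 = pool_stage(h0, boundary_mask(text, 1))   # stage 1: roughly one state per word
h2 = pool_stage(h0, boundary_mask(text, 2))   # stage 2: one state per word pair
h3 = pool_stage(h0, boundary_mask(text, 4))   # stage 3: one state per multi-word phrase
print(h0.shape, h1.shape, h2.shape, h3.shape)
```

Each successive stage operates on a shorter sequence of coarser units, which is what lets the deeper parts of the U-Net reason over word- and phrase-level structure while the byte-level stages handle spelling and surface form.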