ByteDance's Seed team has introduced an experimental diffusion language model named Seed Diffusion Preview. The model reports an inference speed of up to 2,146 tokens per second, a 5.4-fold increase over autoregressive models of similar scale. Using techniques such as two-stage curriculum learning and constrained sequential diffusion, Seed Diffusion Preview matches comparable models on code generation tasks while substantially improving inference speed and global controllability. A toy sketch of the underlying decoding difference follows.
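
The speedup comes from the diffusion paradigm's ability to fill in many token positions per denoising step rather than generating one token per forward pass. The following is a minimal, hypothetical sketch of that contrast, not Seed Diffusion's actual implementation; the stand-in model, vocabulary, and all function names (`toy_predict`, `diffusion_decode`, etc.) are illustrative assumptions.

```python
# Hypothetical sketch: contrasts token-by-token autoregressive decoding with
# mask-based diffusion decoding, which unmasks several positions per step.
# The "model" is a stand-in that picks random vocabulary items; only the
# count of sequential steps is meant to be illustrative.
import random

VOCAB = ["def", "return", "x", "+", "1", "(", ")", ":"]
SEQ_LEN = 16
MASK = "<mask>"

def toy_predict(context):
    """Stand-in for a trained model's prediction at one position."""
    return random.choice(VOCAB)

def autoregressive_decode():
    """One token per forward pass: SEQ_LEN sequential steps."""
    seq = []
    for _ in range(SEQ_LEN):
        seq.append(toy_predict(seq))
    return seq, SEQ_LEN  # tokens, number of sequential steps

def diffusion_decode(tokens_per_step=4):
    """Start fully masked; unmask a batch of positions per denoising step."""
    seq = [MASK] * SEQ_LEN
    steps = 0
    while MASK in seq:
        masked = [i for i, t in enumerate(seq) if t == MASK]
        # Fill several masked positions in parallel within one step.
        for i in random.sample(masked, min(tokens_per_step, len(masked))):
            seq[i] = toy_predict(seq)
        steps += 1
    return seq, steps

if __name__ == "__main__":
    _, ar_steps = autoregressive_decode()
    _, diff_steps = diffusion_decode()
    print(f"autoregressive steps: {ar_steps}, diffusion steps: {diff_steps}")
```

In this toy setting the diffusion-style decoder finishes in roughly SEQ_LEN / tokens_per_step sequential steps instead of SEQ_LEN, which is the rough intuition behind the reported throughput gains; the real system's parallelism, ordering constraints, and quality safeguards are of course far more involved.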