The DeepSeek V3.1 model has a text-generation bug in which it inserts spurious tokens such as "极", "極", and "extreme" at unexpected positions. The bug severely affects programming and other structure-sensitive tasks. It is reported most often through third-party APIs, and while it also appears in the official web client serving the full, unquantized FP8 model, the occurrence rate there is lower. Community users speculate that the issue stems from incompletely cleaned training data or from a shift in the model's decoding probability distribution, but some observed cases are not explained by either theory.
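Because the corruption often lands inside identifiers or keywords, a simple post-processing check can at least flag affected completions before they reach a compiler or linter. The sketch below is a minimal, hypothetical filter; the token list and the matching heuristics are assumptions drawn from the community reports above, not an official mitigation:

```python
import re

# Spurious tokens reported by the community for DeepSeek V3.1.
# This list is illustrative, not exhaustive (assumption).
SPURIOUS_CJK = r"极|極"


def find_spurious_insertions(text: str) -> list[tuple[int, str]]:
    """Return (offset, token) pairs for suspected spurious insertions.

    CJK variants are always flagged; "extreme" is only flagged when it
    is glued inside an identifier-like run of word characters, since the
    word itself is legitimate English.
    """
    hits = [(m.start(), m.group()) for m in re.finditer(SPURIOUS_CJK, text)]
    # e.g. "timeextremeout" -- a pattern normal prose or code would not produce.
    hits += [(m.start(), m.group())
             for m in re.finditer(r"(?<=\w)extreme(?=\w)", text)]
    return sorted(hits)


if __name__ == "__main__":
    sample = "def get_极timeout(config):\n    return config.timeextremeout\n"
    for offset, token in find_spurious_insertions(sample):
        print(f"suspect token {token!r} at offset {offset}")
```

A check like this only detects the symptom; whether a flagged completion should be regenerated or repaired in place is left to the caller.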