LeCun's Team Releases Open-Source Code for the Pioneering Code World Model: Adept at Code Generation, Self-Testing, and Self-Repair
Author: Editor

The Code World Model (CWM), released by Meta FAIR, is a 32-billion-parameter language model with a context length of up to 131,000 tokens, built for code generation and complex reasoning. It is presented as the first language model to systematically integrate world models into code generation: CWM simulates code execution in a sandboxed environment, which lets it predict the outcome of running a program. This substantially reduces errors and improves debugging efficiency.
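The core idea is that the model learns from observations of how program state evolves as code runs, step by step. As a rough illustration of what such an execution trace looks like, here is a minimal sketch using Python's built-in `sys.settrace`; the `trace_locals` helper is hypothetical and is not part of Meta's actual pipeline, which is far richer.

```python
import sys

def trace_locals(func, *args):
    """Record (line_number, local_variables) at each executed line of func.

    A hypothetical sketch of the kind of line-by-line state observation a
    code world model could learn from; not Meta's actual training setup.
    """
    frames = []

    def tracer(frame, event, arg):
        # Only record line events inside the target function's frame.
        if event == "line" and frame.f_code is func.__code__:
            frames.append((frame.f_lineno, dict(frame.f_locals)))
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)
    return result, frames

def square_sum(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

result, trace = trace_locals(square_sum, 3)
print(result)  # 5, i.e. 0 + 1 + 4
for lineno, local_vars in trace:
    print(lineno, local_vars)
```

A model trained on many such (code, trace) pairs can learn to predict the next program state without executing anything, which is what enables CWM to anticipate execution outcomes and catch errors before running the code.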

The model is built on a 64-layer decoder-only Transformer architecture that supports long-context input and performs well on general programming and mathematical tasks. To support further research, Meta has also released model checkpoints from intermediate training stages, including the supervised fine-tuning (SFT) and reinforcement learning (RL) phases.