DeepSeek has released a new model, DeepSeek-Prover-V2-671B, to the open-source AI community on Hugging Face. The model ships in the efficient safetensors file format and supports multiple numerical precisions, including BF16, FP8, and F32, which helps reduce memory use and speed up both training and deployment. With 671 billion parameters, DeepSeek-Prover-V2-671B appears to be an advanced iteration of Prover-V1.5, built on the DeepSeek-V3 architecture with 61 Transformer layers, equipping it to tackle complex mathematical proofs.
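The safetensors container mentioned above has a simple, documented layout: an 8-byte little-endian header length, a JSON header mapping each tensor name to its dtype (e.g. "F32", "BF16"), shape, and byte offsets, followed by the raw tensor data. The stdlib-only sketch below writes and reads that layout by hand to illustrate the format; the helper names (`write_safetensors`, `read_header`) and the demo tensor are illustrative, not part of any released tooling — in practice you would use the `safetensors` library itself.

```python
import json
import struct

def write_safetensors(path, tensors):
    """Illustrative helper: tensors maps name -> (dtype_str, shape, raw_bytes)."""
    header = {}
    offset = 0
    blobs = []
    for name, (dtype, shape, raw) in tensors.items():
        # data_offsets are byte positions relative to the start of the data section
        header[name] = {"dtype": dtype, "shape": shape,
                        "data_offsets": [offset, offset + len(raw)]}
        offset += len(raw)
        blobs.append(raw)
    header_bytes = json.dumps(header).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(header_bytes)))  # u64 little-endian header size
        f.write(header_bytes)                          # JSON metadata
        for raw in blobs:                              # raw tensor bytes
            f.write(raw)

def read_header(path):
    """Read back only the JSON metadata, without touching tensor data."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<Q", f.read(8))
        return json.loads(f.read(n).decode("utf-8"))

# Demo: one F32 tensor of shape [2, 2] (4 floats = 16 bytes)
raw = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)
write_safetensors("demo.safetensors", {"w": ("F32", [2, 2], raw)})
print(read_header("demo.safetensors"))
```

Because the header can be read without loading tensor data, tools can inspect a checkpoint's dtypes and shapes cheaply — one reason the format suits very large models like this 671B-parameter release.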
