DeepSeek Coder V2
DeepSeek · 236B MoE · Code
A large Mixture-of-Experts code model, strong at code generation, debugging, and repository-level tasks.
VRAM Requirements
| Quantization | VRAM |
|---|---|
| Q4_K_M (smallest) | 141.6 GB |
| Q8_0 (balanced) | 259.6 GB |
| FP16 (full quality) | 472 GB |
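The table's figures appear consistent with a simple bits-per-weight calculation over the 236B parameters. A rough sketch, assuming effective sizes of about 4.8 bits/weight for Q4_K_M and 8.8 bits/weight for Q8_0 (approximate values for llama.cpp-style quants, not official numbers), and ignoring KV-cache and runtime overhead:

```python
# Rough weight-memory estimate from parameter count and quantization level.
# Assumption: the bits-per-weight values below are approximations; actual
# usage adds KV-cache and framework overhead on top of the weights.

def weight_vram_gb(params_b: float, bits_per_weight: float) -> float:
    """Weight memory in GB (1 GB = 1e9 bytes) for a model with
    `params_b` billion parameters stored at `bits_per_weight`."""
    return params_b * bits_per_weight / 8

for name, bpw in [("Q4_K_M", 4.8), ("Q8_0", 8.8), ("FP16", 16.0)]:
    print(f"{name}: {weight_vram_gb(236, bpw):.1f} GB")
```

For a MoE model, all 236B parameters must fit in memory even though only 21B are active per token, which is why the VRAM requirement tracks total rather than active parameter count.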
Specifications
- Parameters: 236B MoE (21B active per token)
- Category: Code
- Max context: 128K tokens
- System RAM: 192 GB minimum
- HuggingFace: deepseek-ai/DeepSeek-Coder-V2-Instruct
Benchmarks
- HumanEval: 85 (code generation, pass@1)
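pass@1 is the fraction of benchmark problems the model solves with a single sample per problem. The standard unbiased pass@k estimator (with `n` samples per problem, `c` of them correct) can be sketched as follows; the function name is illustrative:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator: given n generated samples for a
    problem, c of which are correct, return the probability that at
    least one of k randomly drawn samples is correct."""
    if n - c < k:
        return 1.0  # fewer incorrect samples than k: success guaranteed
    return 1.0 - comb(n - c, k) / comb(n, k)
```

Averaging this value over every problem in the suite gives the benchmark score, so a HumanEval score of 85 corresponds to roughly 85% of problems passing on the first sample.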