# DeepSeek V2.5
DeepSeek · 236B MoE · General
A unified chat and code model that combines DeepSeek-V2's chat capabilities with stronger coding and improved instruction following.
## VRAM Requirements

| Quantization | Approx. VRAM (weights only) |
|---|---|
| Q4_K_M (smallest) | 141.6 GB |
| Q8_0 (balanced) | 259.6 GB |
| FP16 (full quality) | 472 GB |
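The figures above are consistent with a simple weights-only estimate: total parameter count times bits per weight. A minimal sketch of that arithmetic, assuming roughly 4.8 bits/weight for Q4_K_M and 8.8 for Q8_0 (typical effective rates for those GGUF formats, not values stated on this page) and decimal gigabytes; KV cache, activations, and runtime overhead are extra:

```python
def weights_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate VRAM for model weights alone, in decimal GB.

    params_billion * 1e9 params * (bits/8) bytes / 1e9 bytes-per-GB
    simplifies to params_billion * bits_per_weight / 8.
    Excludes KV cache, activations, and framework overhead.
    """
    return params_billion * bits_per_weight / 8

# DeepSeek V2.5: 236B total parameters
for name, bpw in [("Q4_K_M", 4.8), ("Q8_0", 8.8), ("FP16", 16.0)]:
    print(f"{name}: {weights_vram_gb(236, bpw):.1f} GB")
```

Long-context use pushes real requirements well above these numbers, since the 128K-token KV cache is not included in the weights-only estimate.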
## Specifications
- Parameters: 236B MoE (21B active per token)
- Category: General
- Max context: 128K tokens
- System RAM: 192 GB minimum
- HuggingFace: deepseek-ai/DeepSeek-V2.5
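Because V2.5 is a mixture-of-experts model, only about 21B of the 236B parameters are activated per token, so per-token compute resembles a ~21B dense model even though every expert's weights must still fit in memory. A quick back-of-envelope check of that ratio, using only the numbers listed above:

```python
TOTAL_PARAMS_B = 236   # total parameters across all experts
ACTIVE_PARAMS_B = 21   # parameters activated per token

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active per token: {active_fraction:.1%}")
```

This ~9% activation rate is why MoE models trade high memory requirements for comparatively fast token generation.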
## Benchmarks
- MMLU: 78 — General knowledge & reasoning (5-shot)
- HumanEval: 72 — Code generation (pass@1)