# DeepSeek V3

DeepSeek · 671B MoE · General

DeepSeek's flagship general-purpose model: a 671B-parameter Mixture-of-Experts (MoE) architecture with 37B parameters active per token, performing strongly across reasoning, coding, and math benchmarks.
## VRAM Requirements
| Quantization | VRAM |
|---|---|
| Q4_K_M (smallest) | 402.6 GB |
| Q8_0 (balanced) | 738.1 GB |
| FP16 (full quality) | 1342 GB |
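As a sanity check, the FP16 figure follows directly from parameter count × 2 bytes per weight. Quantized formats store per-block scale metadata on top of the nominal bit width, so their effective bits-per-weight sits slightly above 4 or 8. A minimal estimator (the ~4.8 bits-per-weight figure for Q4_K_M is an assumption about average GGUF overhead, not a measured value):

```python
def estimate_vram_gb(params: float, bits_per_weight: float) -> float:
    """Rough weight-memory estimate: params * bits / 8, in decimal GB.

    Ignores KV cache and activation memory, which grow with context
    length and batch size on top of the weight footprint.
    """
    return params * bits_per_weight / 8 / 1e9

# FP16 stores 2 bytes (16 bits) per weight:
# 671e9 weights * 2 bytes = 1342 GB, matching the table exactly.
print(estimate_vram_gb(671e9, 16))  # 1342.0

# Assuming ~4.8 effective bits/weight for Q4_K_M (block scales add
# overhead beyond the nominal 4 bits), the estimate lands on the
# table's 402.6 GB figure.
print(round(estimate_vram_gb(671e9, 4.8), 1))  # 402.6
```

Note that these are weight-only figures; serving at the full 128K context adds substantial KV-cache memory on top.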
## Specifications
- Parameters: 671B MoE (37B active per token)
- Category: General
- Max context: 128K tokens
- System RAM: 512 GB minimum
- HuggingFace: deepseek-ai/DeepSeek-V3
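The "37B active per token" figure reflects how MoE routing works: a gating network scores all experts for each token and only the top-k highest-scoring experts are executed, so per-token compute scales with the active parameters while memory must hold all 671B. A toy sketch of top-k gating (expert count, k, and the logits are illustrative only, not DeepSeek's actual configuration):

```python
import math

def top_k_route(gate_logits: list[float], k: int) -> list[tuple[int, float]]:
    """Pick the k highest-scoring experts and renormalize their weights.

    Only the returned experts run for this token, which is why a large
    MoE model computes like a much smaller dense model.
    """
    # Softmax over all expert logits (max-subtracted for stability).
    m = max(gate_logits)
    exp = [math.exp(x - m) for x in gate_logits]
    total = sum(exp)
    probs = [e / total for e in exp]
    # Keep the top-k experts, with weights renormalized to sum to 1.
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    return [(i, probs[i] / norm) for i in top]

# 8 experts, route each token to 2 (illustrative numbers only).
routes = top_k_route([0.1, 2.0, -1.0, 0.5, 1.7, 0.0, -0.3, 0.9], k=2)
print(routes)  # experts 1 and 4 carry the highest gate scores
```

The same mechanism explains why the VRAM table above is driven by the full 671B parameters even though inference throughput tracks the 37B active subset.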
## Benchmarks
- MMLU: 88 — General knowledge & reasoning (5-shot)
- HumanEval: 82 — Code generation (pass@1)
- MATH: 74 — Competition-level math reasoning