RUN THIS LLM



GPT-OSS 120B

OpenAI · 117B MoE · General

OpenAI's first open-weight model since GPT-2. A 117B-parameter mixture-of-experts model with only 5.1B active parameters per token. Apache 2.0 licensed. RL-trained reasoning with configurable thinking depth. 128K context window.

VRAM Requirements

Quantization          VRAM
Q4_K_M (smallest)     70.2 GB
Q8_0 (balanced)       128.7 GB
FP16 (full quality)   234 GB
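As a sanity check, figures like these follow from a simple bytes-per-weight estimate: weights alone need roughly params × bits-per-weight / 8 bytes, with KV cache and runtime overhead on top. The sketch below is a rule of thumb, not something from this page, and the ~4.8 bits/weight average for Q4_K_M is an approximation:

```python
def est_weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Estimate VRAM (in GB, 1 GB = 1e9 bytes) needed for the weights alone.

    KV cache for long contexts and inference-engine overhead are NOT included,
    so treat the result as a lower bound.
    """
    return params_billion * bits_per_weight / 8

# FP16 is exactly 16 bits per weight:
print(est_weight_vram_gb(117, 16))              # 234.0, matching the FP16 row
# Q4_K_M averages roughly 4.8 bits per weight (approximate figure):
print(round(est_weight_vram_gb(117, 4.8), 1))   # 70.2, matching the Q4_K_M row
```

Note that for an MoE model all expert weights must still reside in memory, even though only 5.1B parameters are active per token, so total parameter count (117B) is what drives the VRAM estimate.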

Specifications

Benchmarks

