What this page answers
gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI designed for high-reasoning, agentic, and general-purpose production use cases. It activates 5.1B parameters per forward pass; a minimal loading sketch follows the spec list below.
- OpenAI · openai/gpt-oss-120b
- text->text · global model route
- 131,072 context · US$0.039 input
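
To make the spec concrete, here is a minimal sketch of loading and querying the model through the standard Hugging Face transformers text-generation pipeline. It assumes a transformers release with gpt-oss support and enough accelerator memory to hold the full 117B-parameter weight set (only ~5.1B parameters are active per forward pass, but all weights must still be resident); the prompt text is purely illustrative.

```python
# Minimal sketch: running openai/gpt-oss-120b via the transformers
# text-generation pipeline. Assumes a recent transformers release with
# gpt-oss support and sufficient GPU memory for all 117B weights.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-120b",
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # shard weights across available devices
)

# Chat-style input; the pipeline applies the model's chat template.
messages = [
    {"role": "user", "content": "Summarize Mixture-of-Experts routing in two sentences."}
]

output = generator(messages, max_new_tokens=128)
print(output[0]["generated_text"])
```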