What this page answers
gpt-oss-120b is an open-weight, 117B-parameter Mixture-of-Experts (MoE) language model from OpenAI, designed for high-reasoning, agentic, and general-purpose production use cases. It activates 5.1B parameters per forward pass.
- OpenAI · openai/gpt-oss-120b:free
- text->text · global model route
- 131,072-token context · US$0.00 input
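The route above can be exercised through any OpenAI-compatible chat-completions endpoint. The sketch below assembles a request body for the `openai/gpt-oss-120b:free` route; the base URL is an assumption (OpenRouter-style gateways expose `/api/v1/chat/completions`), so substitute your provider's actual endpoint and API key.

```python
import json

# Assumed gateway endpoint -- replace with your provider's actual URL.
BASE_URL = "https://openrouter.ai/api/v1/chat/completions"
# Model route as shown on this page.
MODEL_ID = "openai/gpt-oss-120b:free"

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble a JSON body for a text->text chat-completions call."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_request("Summarize MoE routing in two sentences.")
print(json.dumps(payload, indent=2))
```

Send the payload with any HTTP client, adding an `Authorization: Bearer <key>` header; the 131,072-token context applies to prompt plus completion combined.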