What this page covers
Qwen3-235B-A22B is a 235B-parameter mixture-of-experts (MoE) model developed by Qwen, activating 22B parameters per forward pass. It supports seamless switching between a "thinking" mode for complex reasoning, math, and coding, and a "non-thinking" mode for efficient general-purpose dialogue.
- Alibaba Cloud · Tongyi (Qwen) · qwen/qwen3-235b-a22b
- text->text · Chinese model lineage
- 131,072 context · US$0.455 input
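As a sketch of how the thinking/non-thinking switch is typically exercised, the snippet below builds an OpenAI-compatible chat request for this model. The endpoint shape and the `/no_think` soft switch follow Qwen3's published usage notes, but they are assumptions about your serving stack, not something this page specifies; some deployments expose a chat-template flag instead.

```python
# Sketch: building a chat-completion payload for qwen/qwen3-235b-a22b,
# toggling reasoning via Qwen3's "/no_think" soft switch appended to the
# user turn. Payload shape assumes an OpenAI-compatible server (an
# assumption, not part of this page).

def build_request(prompt: str, thinking: bool = True) -> dict:
    """Return a chat-completion payload; append /no_think to suppress
    the model's reasoning trace when thinking is False."""
    content = prompt if thinking else prompt + " /no_think"
    return {
        "model": "qwen/qwen3-235b-a22b",
        "messages": [{"role": "user", "content": content}],
        # Non-thinking mode suits short, general-purpose replies.
        "max_tokens": 512,
    }

req = build_request("Solve 37 * 41 step by step.", thinking=False)
print(req["messages"][0]["content"])
```

With `thinking=True` the prompt is sent unmodified and the model emits its reasoning before the final answer; the soft switch only needs to appear in the latest user turn.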