What this page covers
GLM-4.5 is our latest flagship foundation model, purpose-built for agent-based applications. It uses a Mixture-of-Experts (MoE) architecture and supports a context length of up to 128K tokens.
- Zhipu AI · z-ai/glm-4.5
- text->text · Chinese model series
- 131,072-token context · US$0.60 input
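As a hypothetical sketch of how a listing like this is typically consumed, the snippet below builds a chat-completions request payload for the `z-ai/glm-4.5` model ID shown above. The endpoint shape (OpenAI-compatible chat completions) and parameter names are assumptions for illustration, not details confirmed by this page.

```python
import json

# Assumed OpenAI-compatible chat-completions payload for the model ID
# listed on this page. Endpoint URL and auth are omitted; this only
# constructs the request body.
payload = {
    "model": "z-ai/glm-4.5",
    "messages": [
        {"role": "user", "content": "Summarize MoE routing in one sentence."}
    ],
    # Caps the completion length only; the advertised 131,072-token
    # context window bounds prompt + completion together.
    "max_tokens": 512,
}

print(json.dumps(payload, ensure_ascii=False, indent=2))
```

You would POST this body (with your API key in the headers) to whichever OpenAI-compatible endpoint serves the model.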