What this page clarifies
GLM-4.5 is our latest flagship foundation model, purpose-built for agent-based applications. It leverages a Mixture-of-Experts (MoE) architecture and supports a context length of up to 128k tokens.
- Zhipu AI (GLM) · z-ai/glm-4.5
- text->text · China model route
- 131,072 context · $0.60 input
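The context limit and input price above can be turned into a quick cost estimate. The sketch below assumes the listed $0.60 input price is per 1M tokens (the billing unit is not stated on this page) and uses the 131,072-token context window as a hard cap; function and constant names are illustrative, not part of any official SDK.

```python
# Hedged sketch: estimate input cost for z-ai/glm-4.5.
# ASSUMPTION: the $0.60 input price is per 1M tokens (unit not stated above).
PRICE_PER_MILLION_INPUT_USD = 0.60  # assumed billing unit
CONTEXT_LIMIT = 131_072             # max context length in tokens

def estimate_input_cost(num_tokens: int) -> float:
    """Return the estimated USD cost for num_tokens of input."""
    if num_tokens > CONTEXT_LIMIT:
        raise ValueError("prompt exceeds the 131,072-token context window")
    return num_tokens / 1_000_000 * PRICE_PER_MILLION_INPUT_USD

print(round(estimate_input_cost(100_000), 4))  # cost for a 100k-token prompt
```

Under the assumed per-1M-token unit, a prompt that fills the entire 131,072-token window would cost roughly $0.079 of input.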