What this page covers
Mistral NeMo is a 12B-parameter model with a 128k-token context window, built by Mistral AI in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese,...
- Mistral AI · mistralai/mistral-nemo
- text->text · global model lineup
- 131,072-token context · US$0.02 input
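As a minimal sketch of how the listed model ID and context length might be used, the snippet below builds a chat-completions request body in the common OpenAI-compatible schema. The field names (`model`, `messages`, `max_tokens`) and the idea of reserving output room inside the 131,072-token window are assumptions not stated on this page; the actual endpoint and pricing unit depend on your provider.

```python
import json

# Assumption: an OpenAI-compatible chat-completions schema; the provider's
# actual endpoint and required fields may differ.
MODEL_ID = "mistralai/mistral-nemo"
CONTEXT_WINDOW = 131_072  # tokens, per the listing above


def build_payload(messages, max_tokens=1024):
    """Build a request body, leaving room in the context window for output."""
    if not 0 < max_tokens < CONTEXT_WINDOW:
        raise ValueError("max_tokens must leave room for the prompt")
    return {
        "model": MODEL_ID,
        "messages": messages,
        "max_tokens": max_tokens,
    }


payload = build_payload([{"role": "user", "content": "Bonjour!"}])
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with your API key.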