What this page covers
A 12B-parameter model with a 128k-token context length, built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese,...
- Mistral AI · mistralai/mistral-nemo
- text->text · global model route
- 131,072 context · $0.02 input
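Since the route is text->text, a request is just a chat-style payload addressed to the model id above. A minimal sketch of building such a payload for an OpenAI-compatible chat-completions endpoint follows; the endpoint shape is an assumption from common API conventions, not something this page specifies.

```python
import json

def build_request(prompt: str) -> dict:
    # Assumed OpenAI-compatible chat payload; only the model id
    # "mistralai/mistral-nemo" is taken from this page.
    return {
        "model": "mistralai/mistral-nemo",
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize the Mistral NeMo model card in one sentence.")
print(json.dumps(payload, indent=2))
```

The serialized payload would then be POSTed to whichever provider hosts this route; authentication headers and the exact URL depend on the provider and are omitted here.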