What this page answers
A 12B-parameter model with a 128k-token context length, built by Mistral AI in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese,...
- Mistral AI · mistralai/mistral-nemo
- text->text · global model route
- 131,072 context · $0.02 input
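As a minimal sketch of how a model listed under a slug like this is typically invoked: many gateways expose such models through an OpenAI-compatible chat-completions API. The endpoint URL, payload fields, and helper function below are assumptions for illustration, not details from this page.

```python
import json

# Assumed OpenAI-compatible endpoint (hypothetical; not from this page).
API_URL = "https://openrouter.ai/api/v1/chat/completions"
# Model slug as listed on this page.
MODEL_ID = "mistralai/mistral-nemo"

def build_request(prompt: str, max_tokens: int = 256) -> dict:
    """Build a chat-completions payload for the listed model.

    The field names follow the common OpenAI-style schema; a real
    request would POST this JSON to API_URL with an auth header.
    """
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_request("Summarize Mistral NeMo in one sentence.")
print(json.dumps(payload, indent=2))
```

Within the model's 131,072-token context window, the prompt plus `max_tokens` of output must fit; a real client would also check the response for truncation.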