What this page covers
Mistral Small 3 is a 24B-parameter language model optimized for low-latency performance across common AI tasks. Released under the Apache 2.0 license, it is available in both pre-trained and instruction-tuned versions.
- Mistral AI · mistralai/mistral-small-24b-instruct-2501
- text->text · Global model route
- 32,768 context · $0.05 input
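The listing above can be exercised with an OpenAI-compatible chat request, a convention many model routers follow. This is a minimal sketch: the request shape is the common chat-completions format, and the `CONTEXT_WINDOW` constant comes from the listing; no endpoint URL is assumed here.

```python
# Sketch of an OpenAI-compatible chat request body for this model route.
# The request shape is a common convention, not a documented API for this
# specific provider; adapt field names to your endpoint.
MODEL_ID = "mistralai/mistral-small-24b-instruct-2501"
CONTEXT_WINDOW = 32_768  # tokens, per the listing above

def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Build a chat-completions-style request body for the model route."""
    if not 0 < max_tokens < CONTEXT_WINDOW:
        raise ValueError("max_tokens must fit inside the context window")
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

req = build_request("Summarize the Apache 2.0 license in one sentence.")
print(req["model"])
```

The body can then be POSTed as JSON to whatever chat-completions endpoint your provider exposes, with your API key in the auth header.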