What this page answers
Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass.
- Meta · meta-llama/llama-4-maverick
- text+image → text · global model route
- 1,048,576 context · US$0.15 input
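
Since the card lists a text+image → text modality and a global model route, the sketch below shows one way to query the model through an OpenAI-compatible chat-completions gateway. The base URL, environment variable name, and image URL are illustrative assumptions (an OpenRouter-style endpoint), not details confirmed by this page; only the model slug `meta-llama/llama-4-maverick` comes from the card.

```python
# Minimal sketch: multimodal request to meta-llama/llama-4-maverick
# via an OpenAI-compatible gateway. Endpoint and env var are assumptions.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed gateway endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],  # hypothetical env var
)

# One text part and one image-URL part, matching the
# text+image -> text modality listed above.
response = client.chat.completions.create(
    model="meta-llama/llama-4-maverick",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/photo.jpg"},
                },
            ],
        }
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

The large context window means long documents can usually be sent in a single request rather than chunked, though actual limits depend on the serving provider.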