What this page answers
Llama 4 Maverick 17B Instruct (128E) is a high-capacity multimodal language model from Meta, built on a mixture-of-experts (MoE) architecture with 128 experts and 17 billion active parameters per forward pass.
- Meta · meta-llama/llama-4-maverick
- text+image->text · global model routes
- 1,048,576-token context · US$0.15 input
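The mixture-of-experts idea mentioned above (128 experts, but only a small number active per token, which is why "active parameters" is far below total parameter count) can be sketched as a toy top-k gate. Everything below is an illustrative assumption for teaching purposes, not Meta's actual implementation: the expert functions, gate weights, and sizes are made up.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, k=2):
    """Toy MoE layer: a linear gate scores every expert, but only the
    top-k experts actually run on the input (hypothetical sketch)."""
    logits = [sum(w * xi for w, xi in zip(wv, x)) for wv in gate_weights]
    probs = softmax(logits)
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    # Only the selected experts execute -- this sparsity is why active
    # parameters per forward pass are much smaller than total parameters.
    return sum(probs[i] / norm * experts[i](x) for i in top)

# Demo with 8 toy experts (expert i just scales the input sum by i+1).
x = [1.0, 0.0]
gate_weights = [[float(i), 0.0] for i in range(8)]
experts = [lambda v, i=i: (i + 1) * sum(v) for i in range(8)]
y = moe_forward(x, experts, gate_weights, k=2)  # blends experts 7 and 6
```

With this gate, expert 7 receives logit 7 and expert 6 logit 6, so the blend is `(8e + 7) / (e + 1)`, roughly 7.73; the other six experts never run.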