What does this page answer?
Llama 4 Scout 17B Instruct (16E, i.e. 16 experts) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters per token out of 109B total. It supports native multimodal input...
- Meta · meta-llama/llama-4-scout
- text+image->text · global model routing
- 327,680-token context · US$0.08 input
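The text+image->text modality above can be sketched as a chat request body. This is a minimal sketch assuming an OpenAI-compatible chat API; the `build_request` helper, the example prompt, and the image URL are hypothetical, and only the model slug `meta-llama/llama-4-scout` comes from this page.

```python
import json

def build_request(prompt: str, image_url: str) -> dict:
    """Build a text+image -> text chat payload (hypothetical helper;
    shape follows the common OpenAI-compatible content-parts format)."""
    return {
        "model": "meta-llama/llama-4-scout",
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": prompt},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
    }

# Inspect the payload that would be POSTed to the provider's endpoint.
payload = build_request("Describe this image.", "https://example.com/cat.png")
print(json.dumps(payload, indent=2))
```

The actual endpoint URL, authentication headers, and how the US$0.08 input price is metered depend on the hosting provider and are not specified here.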