Mistral: Ministral 3B
Mistral • text • function-calling • json-mode
mistralai/ministral-3b
Ministral 3B is a 3B-parameter model optimized for on-device and edge computing. It excels at knowledge, commonsense reasoning, and function-calling, outperforming larger models such as Mistral 7B on most benchmarks. With support for up to a 128k context length, it is well suited to orchestrating agentic workflows and specialist tasks with efficient inference.
Best For:
High-volume, low-latency tasks where cost efficiency is paramount
Pricing:
$0.00/1M input tokens, $0.00/1M output tokens
Context Window:
131,072 tokens
Key Differentiator:
Cost-optimized for high-volume usage
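As a sketch of how the listed capabilities fit together, the snippet below assembles a chat-completions request payload for this model combining function-calling with JSON mode. The payload schema is an assumption modeled on common OpenAI-compatible APIs, and the `get_weather` tool is purely illustrative; consult your provider's API reference for the authoritative request format.

```python
import json

MODEL_ID = "mistralai/ministral-3b"  # model ID from the listing above

def build_request(user_prompt: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload (assumed schema)
    exercising the card's listed features: function-calling and JSON mode."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": user_prompt}],
        # Hypothetical tool definition illustrating function-calling support.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # illustrative tool, not from the listing
                "description": "Look up current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
        # JSON mode: ask the model to emit a valid JSON object.
        "response_format": {"type": "json_object"},
    }

payload = build_request("What's the weather in Paris? Reply as JSON.")
print(json.dumps(payload, indent=2))
```

The 131,072-token context window leaves ample room for long tool schemas and multi-turn agentic traces in a single request, which is where a small, cheap model like this is typically deployed.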