Mistral: Mixtral 8x7B Instruct
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI, fine-tuned for chat and instruction use. Each layer incorporates 8 experts (feed-forward networks), for a total of 47 billion parameters, of which only about 13 billion are active per token. The Instruct variant was fine-tuned by Mistral. #moe
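As a toy sketch of how sparse Mixture-of-Experts inference keeps the active parameter count low (the layer sizes and router here are invented for illustration; Mixtral's real experts are full feed-forward blocks), a top-2 router can be written like this:

```python
import numpy as np

# Toy illustration of top-2 sparse MoE routing, as used in Mixtral:
# a router scores all 8 experts per token, but only the 2 highest-scoring
# experts run, so only a fraction of the total parameters is active per token.
rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral 8x7B uses 8 feed-forward experts per layer
TOP_K = 2         # only the top-2 experts process each token
HIDDEN = 16       # toy hidden size for illustration

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy "experts": each is a single linear map here, not a full FFN block.
experts = [rng.standard_normal((HIDDEN, HIDDEN)) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((HIDDEN, NUM_EXPERTS))

def moe_layer(token):
    scores = token @ router                 # router logits, shape (8,)
    top = np.argsort(scores)[-TOP_K:]       # indices of the 2 best experts
    weights = softmax(scores[top])          # renormalize over selected experts
    # Only the selected experts run; the other 6 stay idle for this token.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top))

out = moe_layer(rng.standard_normal(HIDDEN))
print(out.shape)  # (16,)
```

The output has the same shape as the input, so MoE layers drop into a transformer stack exactly where a dense feed-forward block would sit.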
- **Context Window:** 33K tokens
- **Max Output:** 16K tokens
- **Released:** Dec 10, 2023
- **Arena Rank:** #238 of 305 models
Capabilities
- 👁 Vision
- 🧠 Reasoning
- 🔧 Tool Calling
- ⚡ Prompt Caching
- 🖥 Computer Use
- 🎨 Image Generation
Supported Parameters

| Parameter | Description |
|---|---|
| Frequency Penalty | Reduce repetition |
| Logit Bias | Adjust token weights |
| Max Tokens | Output length limit |
| min_p | Minimum-probability sampling |
| Presence Penalty | Encourage new topics |
| Repetition Penalty | Penalize repeated tokens |
| Response Format | JSON mode / structured output |
| Seed | Deterministic outputs |
| Stop Sequences | Custom stop tokens |
| Temperature | Controls randomness |
| Tool Choice | Control tool usage |
| Tool Calling | Function calling support |
| Top K | Top-K sampling |
| Top P | Nucleus sampling |
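As a hypothetical sketch of how these parameters combine in practice, a request body in the OpenAI-style chat-completions schema that routers such as OpenRouter accept might look like this (the prompt and specific values are invented; in practice the payload is sent via an HTTP POST):

```python
import json

# Hypothetical request body using the OpenAI-style chat-completions schema.
# Values are illustrative only; each key maps to a parameter listed above.
payload = {
    "model": "mistralai/mixtral-8x7b-instruct",
    "messages": [{"role": "user", "content": "List three uses of a paperclip."}],
    "max_tokens": 256,         # output length limit
    "temperature": 0.7,        # controls randomness
    "top_p": 0.9,              # nucleus sampling
    "top_k": 40,               # top-K sampling
    "min_p": 0.05,             # minimum-probability sampling
    "frequency_penalty": 0.2,  # reduce repetition
    "seed": 42,                # deterministic outputs
    "stop": ["\n\n"],          # custom stop sequences
}
print(json.dumps(payload, indent=2))
```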
Pricing Comparison
| Router | Input / 1M | Output / 1M | Cached Input / 1M |
|---|---|---|---|
| OpenRouter | $0.54 | $0.54 | — |
| Martian | $0.54 | $0.54 | — |
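Since both routers above charge a flat $0.54 per million tokens for input and output alike, cost scales linearly with total token count. A quick back-of-the-envelope calculation:

```python
PRICE_PER_M = 0.54  # USD per 1M tokens, input and output (both routers above)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at the flat per-token rate."""
    return (input_tokens + output_tokens) / 1_000_000 * PRICE_PER_M

# e.g. a 2,000-token prompt with a 500-token completion:
print(f"${request_cost(2_000, 500):.6f}")  # $0.001350
```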
Benchmarks

| Benchmark (Open LLM Leaderboard) | Score |
|---|---|
| Average | 23.82/100 |
| IFEval | 55.99/100 |
| BBH | 29.74/100 |
| MATH Lvl 5 | 9.14/100 |
| GPQA | 7.05/100 |
| MUSR | 11.07/100 |
| MMLU-PRO | 29.91/100 |

Model IDs
- **OpenRouter:** `mistralai/mixtral-8x7b-instruct`
- **Hugging Face:** `mistralai/Mixtral-8x7B-Instruct-v0.1`
Tags
tool-calling
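To make the tool-calling tag concrete, here is a hypothetical tool definition in the OpenAI-style function-calling schema that routers commonly use to expose this capability. The `get_weather` function and its parameters are invented for illustration:

```python
# Hypothetical tool definition; the schema is the OpenAI-style "tools" format,
# and the get_weather function is invented for this example.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

request = {
    "model": "mistralai/mixtral-8x7b-instruct",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}
print(sorted(request))
```

The `tool_choice` field corresponds to the "Tool Choice" parameter above; setting it to a specific function name would force a call instead of leaving the decision to the model.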
Related Models

| Model | Arena Rank | Context | Input Price |
|---|---|---|---|
| Mistral: Mixtral 8x22B Instruct | #221 | 66K | $1.20/1M |
| Mistral Large 3 | #48 | 256K | $0.50/1M |
| Mistral Large 2407 | #163 | 131K | $2.00/1M |
| Mistral Large 2411 | #173 | 131K | $2.00/1M |
| Magistral Medium 2509 | #174 | 128K | $2.00/1M |
| Mistral: Mistral Small 3.1 24B (free) | #175 | 128K | Free |