
Mistral: Mixtral 8x22B Instruct

Mistral's official instruct fine-tuned version of Mixtral 8x22B. It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in English, French, Italian, German, and Spanish

See benchmarks in the launch announcement. #moe
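
The 39B-of-141B active-parameter figure reflects the sparse mixture-of-experts (#moe) design: a router picks a small subset of expert feed-forward blocks per token, so only those experts' weights are used. Below is a minimal top-k routing sketch in numpy; it is illustrative only (the shapes, the linear experts, and k=2 are assumptions based on the Mixtral family, not Mistral's implementation):

```python
import numpy as np

def top_k_moe_layer(x, gate_w, expert_ws, k=2):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ gate_w                         # (tokens, n_experts) router scores
    topk = np.argsort(logits, axis=-1)[:, -k:]  # top-k expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                            # softmax over the selected experts only
        for weight, e in zip(w, topk[t]):
            out[t] += weight * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 16, 8, 4                 # toy sizes, not the real model's
x = rng.normal(size=(tokens, d))
gate_w = rng.normal(size=(d, n_experts))
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
print(top_k_moe_layer(x, gate_w, expert_ws).shape)  # (4, 16)
```

Because only k of n experts run per token, compute and active parameters scale with roughly k/n of the expert weights, which is where the cost-efficiency claim comes from.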

Context Window: 66K tokens
Max Output: not listed
Released: Apr 17, 2024
Arena Rank: #221 of 305 models

Capabilities

👁 Vision
🧠 Reasoning
🔧 Tool Calling
Prompt Caching
🖥 Computer Use
🎨 Image Generation

Supported Parameters

Frequency Penalty: reduce repetition
Max Tokens: output length limit
Presence Penalty: encourage new topics
Response Format: JSON mode / structured output
Seed: deterministic outputs
Stop Sequences: custom stop tokens
structured_outputs: enforce a JSON Schema on responses
Temperature: controls randomness
Tool Choice: control tool usage
Tool Calling: function calling support
Top P: nucleus sampling

Pricing Comparison

Router       Input / 1M   Output / 1M   Cached Input / 1M
OpenRouter   $2.00        $6.00         -
Vercel AI    $1.20        $1.20         -
Martian      $2.00        $6.00         -
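
Per-request cost is linear in token counts at these per-1M-token prices. A quick helper, using the OpenRouter rates from the table above (the token counts in the example are illustrative):

```python
def request_cost(input_tokens, output_tokens, in_per_m=2.00, out_per_m=6.00):
    """Cost in USD at per-1M-token prices (defaults: OpenRouter rates above)."""
    return input_tokens / 1e6 * in_per_m + output_tokens / 1e6 * out_per_m

# e.g. a 10k-token prompt with a 1k-token completion:
print(f"${request_cost(10_000, 1_000):.4f}")  # $0.0260
```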

Benchmarks

Open LLM Leaderboard
Average: 33.89/100
IFEval: 71.84/100
BBH: 44.11/100
MATH Lvl 5: 18.73/100
GPQA: 16.44/100
MUSR: 13.49/100
MMLU-PRO: 38.7/100

Model IDs

OpenRouter: mistralai/mixtral-8x22b-instruct

Tags

tool-calling