Last updated: April 16, 2025

Mistral Official Docs

Open Mixtral 8x22B

open-mixtral-8x22b-2404

deprecated


Mistral AI's open-source Mixtral 8x22B is a powerful sparse mixture-of-experts model with a 65,536-token context window for text input and output. The model also supports fine-tuning, allowing users to customize it for their specific needs.

Supports a 65,536-token context window. Handles text input and output. Supports fine-tuning for custom applications.
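For reference, a basic chat completion against this model looked like the following minimal sketch. It assumes the mistralai v1 Python SDK and an API key in the MISTRAL_API_KEY environment variable; since the model has been retired (see the timeline below), live calls will now fail.

```python
import os

from mistralai import Mistral

# Minimal sketch, assuming the mistralai v1 Python SDK (pip install mistralai)
# and an API key in the MISTRAL_API_KEY environment variable.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="open-mixtral-8x22b",  # this page's model; retired per the timeline below
    messages=[
        {"role": "user", "content": "Explain sparse mixture-of-experts in two sentences."}
    ],
    max_tokens=512,  # must stay under the model's 4,096-token output cap
)
print(response.choices[0].message.content)
```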

Additional Information

Notes

This model was marked legacy on November 25, 2024, deprecated on November 30, 2024, and retired on March 30, 2025. The recommended alternative is mistral-small-latest.
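In practice, migrating is usually a one-line change to the model identifier. A minimal sketch, reusing the client from the example above (the MODEL constant is hypothetical):

```python
# Hypothetical config constant: swap the retired model id for the
# replacement recommended in the note above.
MODEL = "mistral-small-latest"  # previously: "open-mixtral-8x22b"

response = client.chat.complete(
    model=MODEL,
    messages=[{"role": "user", "content": "Hello"}],
)
```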

Model Timeline

Launch Date: April 1, 2024
Marked Legacy: November 25, 2024
Marked Deprecated: November 30, 2024
Retired: March 30, 2025

Capabilities

Text

Input Pricing: $0.70 / MTok
Output Pricing: $0.70 / MTok
Context window: 65,536 tokens
Max output tokens: 4,096

Embeddings

Embeddings Pricing: $0.0001 / 1K tokens
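To make these rates concrete, here is a small worked example (pure arithmetic, no API calls; the helper names are hypothetical):

```python
# Rates taken from the pricing entries above.
INPUT_PER_MTOK = 0.70    # $ per million input tokens
OUTPUT_PER_MTOK = 0.70   # $ per million output tokens
EMBED_PER_KTOK = 0.0001  # $ per thousand embedding tokens

def chat_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one chat request at the listed per-million-token rates."""
    return (input_tokens / 1e6) * INPUT_PER_MTOK + (output_tokens / 1e6) * OUTPUT_PER_MTOK

def embedding_cost(tokens: int) -> float:
    """Cost of embedding `tokens` tokens at the listed per-thousand-token rate."""
    return (tokens / 1e3) * EMBED_PER_KTOK

# Example: a full-context request (65,536 tokens in) with a maximum-length
# reply (4,096 tokens out) costs about five cents.
print(f"${chat_cost(65_536, 4_096):.4f}")  # ≈ $0.0487
print(f"${embedding_cost(10_000):.4f}")    # $0.0010
```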

Additional Model Information

Tool Use: Yes (native function calling)

Structured Output: No

Reasoning: Yes
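Because structured output is listed as unsupported, a common client-side workaround is to ask for JSON in the prompt and validate the reply yourself. A minimal sketch, again assuming the mistralai v1 Python SDK (the prompt and key handling are illustrative):

```python
import json
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# No native structured-output mode: request JSON explicitly in the prompt,
# then validate the reply client-side.
response = client.chat.complete(
    model="open-mixtral-8x22b",
    messages=[{
        "role": "user",
        "content": (
            "Return ONLY a JSON object with keys 'title' and 'tags' "
            "describing sparse mixture-of-experts models."
        ),
    }],
)

try:
    data = json.loads(response.choices[0].message.content)
except json.JSONDecodeError:
    data = None  # in real code: retry, or attempt to repair the reply
print(data)
```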
