Last updated: April 16, 2025

Mistral Official Docs

Open Mixtral 8x22B

Model ID: open-mixtral-8x22b

Status: Deprecated


Mistral AI's open-source Mixtral 8x22B model is a powerful sparse mixture-of-experts text model with a 65,536-token context length, capable of handling a wide range of text tasks. This model is deprecated and will be retired on March 30, 2025; the recommended alternative is the mistral-small-latest model.

Supports a 65,536-token context window. Handles text input and output. Supports fine-tuning for custom applications.
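As a minimal usage sketch (not from this page), the request below targets Mistral's public REST endpoint for chat completions; the MISTRAL_API_KEY environment variable and the prompt text are assumptions. Because the model is retiring, the same call works after swapping the model id to mistral-small-latest.

```python
# Minimal sketch against Mistral's public REST API. The
# MISTRAL_API_KEY environment variable is an assumption.
import os
import requests

MODEL = "open-mixtral-8x22b"  # deprecated; migrate to "mistral-small-latest"

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Summarize mixture-of-experts models in one line."}
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```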

Additional Information

Notes

This model is deprecated and will be retired on March 30, 2025. The recommended alternative is mistral-small-latest.

Model Timeline

Launch Date: April 1, 2024

Marked Legacy: November 25, 2024

Marked Deprecated: November 30, 2024

Marked Expired: March 30, 2025
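Given the retirement date above, callers may want to fail over rather than hit a retired endpoint. A minimal illustrative guard (the date comes from the timeline; the helper itself is not part of any Mistral SDK):

```python
# Illustrative guard: avoid the deprecated model past its
# retirement date (March 30, 2025, per the timeline above).
from datetime import date

RETIREMENT = date(2025, 3, 30)

def pick_model(today: date | None = None) -> str:
    today = today or date.today()
    if today >= RETIREMENT:
        return "mistral-small-latest"  # recommended alternative
    return "open-mixtral-8x22b"

print(pick_model())
```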

Capabilities

Text

Context: 65,536 tokens

Input Pricing: $- / 1k tokens

Output Pricing: $- / 1k tokens

Embeddings

Embeddings Pricing: $0.0001 / 1k tokens
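At the listed embeddings rate, cost scales linearly with token count: for example, 1,000,000 tokens costs 1,000 × $0.0001 = $0.10. A one-line helper for that arithmetic (illustrative only, using the rate quoted above):

```python
# Cost at the listed embeddings rate of $0.0001 per 1k tokens.
def embedding_cost_usd(tokens: int, rate_per_1k: float = 0.0001) -> float:
    return tokens / 1000 * rate_per_1k

print(embedding_cost_usd(1_000_000))  # -> 0.1
```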

Additional Model Information

Tool Use: No

Structured Output: No

Reasoning: Yes
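Since this listing reports no native tool use or structured output, a common workaround is to request JSON in the prompt and validate the reply client-side. A hedged sketch (parse_or_retry, call_model, and the prompt wording are illustrative, not a Mistral API feature):

```python
# Illustrative client-side validation for models without native
# structured output: ask for JSON in the prompt, then parse and
# retry on failure. `call_model` stands in for any chat call.
import json
from typing import Callable

def parse_or_retry(call_model: Callable[[str], str],
                   prompt: str, attempts: int = 3) -> dict:
    instruction = prompt + "\nReply with a single JSON object and nothing else."
    for _ in range(attempts):
        raw = call_model(instruction)
        try:
            return json.loads(raw)
        except json.JSONDecodeError:
            continue  # model returned non-JSON text; ask again
    raise ValueError("model never returned valid JSON")
```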
