Last updated: April 16, 2025

Mistral Official Docs

Open Mixtral 8x7B

open-mixtral-8x7b

deprecated

Open Mixtral 8x7B is a sparse mixture-of-experts (SMoE) model released by Mistral AI in December 2023. It has a 32K-token context window and is a text-in, text-out model.

Supports a 32K-token context window with text input and output. Supports fine-tuning for custom applications, tool use for advanced automation, and generation of structured output formats.
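As a sketch of how the tool-use capability is typically exercised, the snippet below assembles a chat-completions request payload for open-mixtral-8x7b. The endpoint URL follows Mistral's public API convention, and the `get_weather` tool is a hypothetical example, not part of any real API.

```python
import json

# Mistral's chat completions endpoint (per the public API docs).
MISTRAL_CHAT_URL = "https://api.mistral.ai/v1/chat/completions"

# A hypothetical tool definition in the JSON-schema function format.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example tool
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

def build_request(prompt: str) -> dict:
    """Assemble the request body; actually sending it requires an API key."""
    return {
        "model": "open-mixtral-8x7b",
        "messages": [{"role": "user", "content": prompt}],
        "tools": tools,
        "tool_choice": "auto",  # let the model decide whether to call a tool
    }

payload = build_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

When the model chooses to call a tool, the response carries the function name and JSON arguments for the client to execute and feed back in a follow-up message.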

This legacy model has been deprecated; the newer Mistral Small v3.1 model is the recommended alternative.

Additional Information

Notes

This model has aliases: mistral-small, mistral-small-2312. It has a context length of 32768 tokens.
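One practical use of the 32,768-token figure is checking whether a prompt is likely to fit before sending it. The sketch below uses the common rough heuristic of ~4 characters per token; this is an approximation, not Mistral's actual tokenizer.

```python
CONTEXT_TOKENS = 32_768  # open-mixtral-8x7b context window
CHARS_PER_TOKEN = 4      # rough heuristic; use a real tokenizer for accuracy

def estimate_tokens(text: str) -> int:
    """Very rough token estimate from character count."""
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_context(prompt: str, reserved_for_output: int = 1024) -> bool:
    """Check whether a prompt likely fits, leaving room for the reply."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_TOKENS

print(fits_context("Summarize this paragraph."))  # short prompt easily fits
```

Reserving a slice of the window for the model's reply matters because input and output tokens share the same context budget.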

Model Timeline

Launch Date

December 1, 2023

Marked Legacy

November 25, 2024

Marked Deprecated

November 30, 2024

Marked Expired

March 30, 2025

Capabilities

Text

Context: 32,768 tokens

Input Pricing

Not available

Output Pricing

Not available

Embeddings

Embeddings Pricing

$0.0001 / 1K tokens
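At the listed embeddings rate of $0.0001 per 1K tokens, cost scales linearly with token count. A minimal helper makes the arithmetic explicit:

```python
EMBED_PRICE_PER_1K_TOKENS = 0.0001  # USD, from the pricing table above

def embedding_cost(tokens: int) -> float:
    """Cost in USD for embedding the given number of tokens."""
    return tokens / 1000 * EMBED_PRICE_PER_1K_TOKENS

# e.g. one million tokens:
print(f"${embedding_cost(1_000_000):.2f}")  # → $0.10
```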

Additional Model Information

Tool Use

Yes

Structured Output

Yes

Reasoning

Yes
