Mixtral 8x7B Instruct

Context length 32,768 tokens, provided by mistralai

Context Tokens: 32,768
Prompt Price: $0.40
Output Price: $0.40
Feature Support: 8 / 16

Model Overview

Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI for chat and instruction use. It combines 8 experts (feed-forward networks) for a total of 47 billion parameters, and the Instruct variant has been fine-tuned by Mistral to follow instructions. #moe
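
For readers unfamiliar with the sparse Mixture-of-Experts pattern, the sketch below shows top-2 expert routing in miniature: a router scores each token against every expert, and only the two best-scoring feed-forward networks run for that token. This is a toy illustration of the general mechanism, not Mistral's implementation; all shapes, weights, and names are made up.

```python
# Toy sketch of sparse Mixture-of-Experts routing (top-2 of 8 experts).
# Illustrative only: shapes and weights are random, not Mixtral's.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" stands in for a feed-forward network; here just one weight matrix.
experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating projection

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-2 experts and mix their outputs by gate weight."""
    logits = x @ router                             # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # indices of the 2 best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = top[t]
        gates = np.exp(logits[t, sel] - logits[t, sel].max())
        gates /= gates.sum()                        # softmax over the selected experts only
        for g, e in zip(gates, sel):
            out[t] += g * (x[t] @ experts[e])       # weighted sum of expert outputs
    return out

tokens = rng.standard_normal((4, d_model))          # 4 toy token embeddings
print(moe_layer(tokens).shape)                      # (4, 16): only 2 of 8 experts run per token
```

Because only 2 of the 8 experts execute for any given token, the number of parameters active per token is much smaller than the 47B total, which is what keeps inference cost closer to a dense model of modest size.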

Basic Information

Developer: mistralai
Model Series: Mistral
Release Date: 2023-12-10
Context Length: 32,768 tokens
Max Completion Tokens: 16,384 tokens
Variant: standard
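
The context length and completion cap above interact: prompt plus generated tokens must fit in the 32,768-token window, and generation is additionally capped at 16,384 tokens. A minimal sketch of that budgeting, with an illustrative helper name:

```python
# Hedged sketch: budgeting prompt tokens within the limits listed above.
# The constants come from the Basic Information table; the helper name is illustrative.
CONTEXT_LENGTH = 32_768        # total tokens the model can attend to per request
MAX_COMPLETION = 16_384        # cap on generated tokens

def max_prompt_tokens(desired_completion: int) -> int:
    """Largest prompt that still leaves room for the requested completion."""
    completion = min(desired_completion, MAX_COMPLETION)
    return CONTEXT_LENGTH - completion

print(max_prompt_tokens(4_096))   # 28672 tokens left for the prompt
```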

Pricing Information

Prompt Tokens: $0.40 / 1M tokens
Completion Tokens: $0.40 / 1M tokens
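
Since prompt and completion tokens are billed at the same $0.40 per million, estimating a request's cost is a one-line calculation. A small sketch using the listed rates (the function name is illustrative):

```python
# Hedged sketch: estimating request cost at the listed rates
# ($0.40 per 1M prompt tokens, $0.40 per 1M completion tokens).
PROMPT_PRICE_PER_M = 0.40
COMPLETION_PRICE_PER_M = 0.40

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the USD cost of one request at the per-million-token rates above."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

print(f"${estimate_cost(8_000, 1_000):.6f}")  # $0.003600 for 8k prompt / 1k completion
```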

Supported Features

Supported (8)

Top K
Seed
Frequency Penalty
Presence Penalty
Repetition Penalty
Response Format
Min P
Tool Usage

Unsupported (8)

Image Input
Logit Bias
Logprobs
Top Logprobs
Structured Outputs
Reasoning
Web Search Options
Top A
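
The supported parameters above map onto a chat-completion request roughly as follows. This is a hedged sketch: the model slug and field names follow the OpenAI-compatible / OpenRouter-style conventions many providers use, exact parameter names and availability vary by provider, and the lookup_price tool is purely hypothetical.

```python
# Hedged sketch of a request payload exercising the supported features listed above.
# Field names assume an OpenAI-compatible / OpenRouter-style API; verify with your provider.
import json

payload = {
    "model": "mistralai/mixtral-8x7b-instruct",   # assumed model identifier
    "messages": [{"role": "user",
                  "content": "List three uses for a 32k context window."}],
    "seed": 42,                  # Seed: reproducible sampling
    "top_k": 40,                 # Top K
    "min_p": 0.05,               # Min P
    "frequency_penalty": 0.2,    # Frequency Penalty
    "presence_penalty": 0.1,     # Presence Penalty
    "repetition_penalty": 1.05,  # Repetition Penalty
    "response_format": {"type": "json_object"},   # Response Format
    "tools": [{                  # Tool Usage
        "type": "function",
        "function": {
            "name": "lookup_price",   # hypothetical tool, for illustration only
            "description": "Look up a model's per-token price.",
            "parameters": {
                "type": "object",
                "properties": {"model": {"type": "string"}},
                "required": ["model"],
            },
        },
    }],
}

print(json.dumps(payload, indent=2))
```

Note that features from the unsupported list (image input, logit bias, logprobs, structured outputs) should be left out of requests to this model.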

Actual Usage Statistics

Usage Rank: #94 out of 353 total models
Total Tokens Last 30 Days: 7.9B
Daily Average Usage: 262.69M tokens
Weekly Usage Change: 9%

Usage Trend for the Last 30 Days (chart not reproduced in text)

Models by Same Author (mistralai)
(context length, prompt / completion price per 1M tokens)

Mistral Medium 3.1: 131,072 tokens, $0.40 / $2.00
Codestral 2508: 256,000 tokens, $0.30 / $0.90
Devstral Medium: 131,072 tokens, $0.40 / $2.00
Devstral Small 1.1: 128,000 tokens, $0.07 / $0.28
Mistral Small 3.2 24B (free): 131,072 tokens, free

Similar Price Range Models
(developer, context length, prompt / completion price per 1M tokens)

UnslopNemo 12B (thedrummer): 32,768 tokens, $0.40 / $0.40
Llama 3.2 90B Vision Instruct (meta-llama): 32,768 tokens, $0.35 / $0.40
Llama 3 70B Instruct (meta-llama): 8,192 tokens, $0.30 / $0.40
WizardLM-2 8x22B (microsoft): 65,536 tokens, $0.48 / $0.48
Grok 3 Mini Beta (x-ai): 131,072 tokens, $0.30 / $0.50