Mixtral 8x7B Instruct
Context Tokens: 32,768
Prompt Price: $0.08 / 1M tokens
Output Price: $0.24 / 1M tokens
Feature Support: 8/16
Model Overview
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model from Mistral AI, built for chat and instruction-following use. It combines 8 expert feed-forward networks for a total of about 47 billion parameters, only a subset of which is active for any given token. The Instruct variant is fine-tuned by Mistral for instruction following. #moe
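As a rough illustration of the sparse Mixture-of-Experts design described above, the sketch below routes each token to only the top-2 of 8 expert feed-forward networks, the routing scheme commonly attributed to Mixtral, so most parameters stay inactive for any single token. The layer sizes, random weights, and ReLU feed-forward shape are illustrative assumptions, not the model's actual implementation.

```python
import numpy as np

# Illustrative sparse MoE layer: a router scores 8 expert FFNs per token and
# only the top-2 experts run. Dimensions and weights are toy assumptions.
NUM_EXPERTS, TOP_K, D_MODEL, D_FF = 8, 2, 16, 64
rng = np.random.default_rng(0)

router_w = rng.standard_normal((D_MODEL, NUM_EXPERTS))
experts = [
    (rng.standard_normal((D_MODEL, D_FF)), rng.standard_normal((D_FF, D_MODEL)))
    for _ in range(NUM_EXPERTS)
]

def moe_layer(x: np.ndarray) -> np.ndarray:
    """x: (tokens, D_MODEL) -> (tokens, D_MODEL) via top-2 expert mixing."""
    logits = x @ router_w                          # (tokens, NUM_EXPERTS)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # indices of the top-2 experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = top[t]
        weights = np.exp(logits[t, sel] - logits[t, sel].max())
        weights /= weights.sum()                   # softmax over selected experts
        for w, e in zip(weights, sel):
            w_in, w_out = experts[e]
            out[t] += w * (np.maximum(x[t] @ w_in, 0.0) @ w_out)  # ReLU FFN
    return out

tokens = rng.standard_normal((4, D_MODEL))
print(moe_layer(tokens).shape)  # (4, 16)
```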
Basic Information
Developer: mistralai
Model Series: Mistral
Release Date: 2023-12-10
Context Length: 32,768 tokens
Max Completion Tokens: 16,384 tokens
Variant: standard
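For a concrete sense of the limits above, here is a minimal sketch that checks whether a request fits the 32,768-token context window and the 16,384-token completion cap. The assumption that completion tokens count against the context window, and the token counts used, are illustrative.

```python
# Hedged fit check against the listed limits; swap in any tokenizer-based
# count for the token numbers below.
CONTEXT_LIMIT = 32_768
MAX_COMPLETION = 16_384

def fits(prompt_tokens: int, requested_completion: int) -> bool:
    """True if the request stays within both the completion cap and the window."""
    return (requested_completion <= MAX_COMPLETION
            and prompt_tokens + requested_completion <= CONTEXT_LIMIT)

print(fits(20_000, 10_000))  # True: 30,000 total, completion under the cap
print(fits(20_000, 16_000))  # False: 36,000 total exceeds the context window
```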
Pricing Information
Prompt Tokens: $0.08 / 1M tokens
Completion Tokens: $0.24 / 1M tokens
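A quick way to read these rates is to work out the cost of a single request; the sketch below does that with the listed per-million-token prices, using illustrative token counts.

```python
# Rough cost estimate for one request at the listed rates
# ($0.08 / 1M prompt tokens, $0.24 / 1M completion tokens).
PROMPT_PRICE_PER_M = 0.08      # USD per 1M prompt tokens
COMPLETION_PRICE_PER_M = 0.24  # USD per 1M completion tokens

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    return (prompt_tokens / 1_000_000) * PROMPT_PRICE_PER_M + \
           (completion_tokens / 1_000_000) * COMPLETION_PRICE_PER_M

# Example: a 2,000-token prompt with an 800-token completion
print(f"${request_cost(2_000, 800):.6f}")  # ≈ $0.000352
```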
Supported Features
Supported (8): Top K, Seed, Frequency Penalty, Presence Penalty, Repetition Penalty, Response Format, Min P, Tool Usage
Unsupported (8): Image Input, Logit Bias, Logprobs, Top Logprobs, Structured Outputs, Reasoning, Web Search Options, Top A
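For reference, here is a hedged sketch of a request through an OpenAI-compatible client that exercises several of the supported parameters listed above (Seed, Frequency Penalty, Presence Penalty, Response Format, plus Top K and Min P passed as provider-specific extras). The endpoint URL, environment variable, and model identifier are assumptions and will vary by provider.

```python
import os
from openai import OpenAI

# Assumed OpenAI-compatible endpoint and credentials; adjust for your provider.
client = OpenAI(
    base_url="https://example-provider/v1",
    api_key=os.environ["PROVIDER_API_KEY"],
)

resp = client.chat.completions.create(
    model="mistralai/mixtral-8x7b-instruct",  # assumed model identifier
    messages=[{"role": "user",
               "content": "Return a JSON object listing three uses of a sparse MoE."}],
    seed=42,                      # Seed (supported)
    frequency_penalty=0.2,        # Frequency Penalty (supported)
    presence_penalty=0.1,         # Presence Penalty (supported)
    response_format={"type": "json_object"},  # Response Format (supported)
    max_tokens=512,               # well under the 16,384-token completion cap
    extra_body={"top_k": 40, "min_p": 0.05},  # Top K / Min P as provider extras
)
print(resp.choices[0].message.content)
```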
Actual Usage Statistics
Rank: #68 out of 346 total models
Total Tokens (Last 30 Days): 9.3B
Daily Average Usage: 308.80M tokens
Weekly Usage Change: 39%