Dolphin 2.6 Mixtral 8x7B 🐬

Developer: cognitivecomputations · Context Length: 32,768 tokens

Context Tokens: 32,768
Prompt Price: $0.00
Output Price: $0.00
Feature Support: 0/16

Model Overview

This is a 16k context fine-tune of [Mixtral-8x7b](/models/mistralai/mixtral-8x7b). It excels in coding tasks due to extensive training with coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored and is stripped of alignment and bias. It requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in a blog post about uncensored models at [erichartford.com/uncensored-models](https://erichartford.com/uncensored-models). #moe #uncensored
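Because the model ships without built-in alignment, callers typically supply their own guardrails. A minimal sketch of using a system prompt as that external alignment layer, assuming an OpenAI-compatible chat-message format (the model slug and prompt wording below are illustrative assumptions, not confirmed values):

```python
# Sketch: prepend a system prompt as a lightweight external "alignment
# layer" before sending requests to an uncensored model. The model slug
# below is an assumed placeholder.

ALIGNMENT_PROMPT = (
    "You are a helpful assistant. Decline requests that are illegal or "
    "could cause harm, and briefly explain why."
)

def build_chat_request(user_message: str) -> dict:
    """Wrap a user message with the alignment system prompt."""
    return {
        "model": "cognitivecomputations/dolphin-mixtral-8x7b",  # assumed slug
        "messages": [
            {"role": "system", "content": ALIGNMENT_PROMPT},
            {"role": "user", "content": user_message},
        ],
    }

request = build_chat_request("Summarize the Mixtral 8x7B architecture.")
```

The resulting dictionary can be serialized as the JSON body of a chat-completions request; the system message rides along with every user turn.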

Basic Information

Developer: cognitivecomputations
Model Series: Mistral
Release Date: 2023-12-21
Context Length: 32,768 tokens
Variant: standard

Pricing Information

Prompt Tokens: $0.00 / 1M tokens
Completion Tokens: $0.00 / 1M tokens

Supported Features

Unsupported (16)

Image Input
Top K
Seed
Frequency Penalty
Presence Penalty
Repetition Penalty
Response Format
Min P
Logit Bias
Tool Usage
Logprobs
Top Logprobs
Structured Outputs
Reasoning
Web Search Options
Top A
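Since none of the 16 listed features are supported, a client may want to strip these parameters before sending a request. A minimal sketch, assuming the common OpenAI-compatible snake_case parameter names for the features above (whether the host would error on or silently ignore extra keys is not stated here):

```python
# Sketch: remove request parameters this model does not support.
# The UNSUPPORTED set mirrors the feature list above, mapped to
# assumed OpenAI-compatible parameter names.

UNSUPPORTED = {
    "top_k", "seed", "frequency_penalty", "presence_penalty",
    "repetition_penalty", "response_format", "min_p", "logit_bias",
    "tools", "logprobs", "top_logprobs", "structured_outputs",
    "reasoning", "web_search_options", "top_a",
}

def sanitize_params(params: dict) -> dict:
    """Return a copy of params with unsupported keys removed."""
    return {k: v for k, v in params.items() if k not in UNSUPPORTED}

clean = sanitize_params({"temperature": 0.7, "top_k": 40, "seed": 1})
# Only the supported "temperature" key survives.
```

Filtering client-side keeps one request-building code path usable across models with different feature sets.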

Actual Usage Statistics

No recent usage data available.

Models by Same Author (cognitivecomputations)

Uncensored (free) · 32,768 tokens · Free
Dolphin3.0 R1 Mistral 24B · 32,768 tokens · $0.01 / $0.03
Dolphin3.0 R1 Mistral 24B (free) · 32,768 tokens · Free
Dolphin3.0 Mistral 24B · 32,768 tokens · $0.03 / $0.11
Dolphin3.0 Mistral 24B (free) · 32,768 tokens · Free

Similar Price Range Models

Jamba 1.5 Large (ai21) · 256,000 tokens · $0.00 / $0.00
R1 Distill Qwen 7B (deepseek) · 131,072 tokens · $0.00 / $0.00
Deepseek R1 0528 Qwen3 8B (free) (deepseek) · 131,072 tokens · $0.00 / $0.00
Gemma 1 2B (google) · 8,192 tokens · $0.00 / $0.00
R1 0528 (free) (deepseek) · 163,840 tokens · $0.00 / $0.00