Dolphin 2.6 Mixtral 8x7B 🐬: AI model details and pricing
Context length 32,768 tokens, provided by cognitivecomputations
32,768
Context tokens
$0.00
Prompt price
$0.00
Output price
0/16
Features supported
Model Description
This is a 16k context fine-tune of [Mixtral-8x7b](/models/mistralai/mixtral-8x7b). It excels in coding tasks due to extensive training with coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored and is stripped of alignment and bias. It requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in a blog post about uncensored models at [erichartford.com/uncensored-models](https://erichartford.com/uncensored-models). #moe #uncensored
Basic Information
Developer
cognitivecomputations
Model family
Mistral
Release date
2023-12-21
Context length
32,768 tokens
Variant
standard
Pricing
Prompt tokens
$0.00 / 1M tokens
Completion tokens
$0.00 / 1M tokens
Supported Features
Not supported (16)
Image input
Top K
Seed
Frequency penalty
Presence penalty
Repetition penalty
Response format
Min P
Logit bias
Tool use
Logprobs
Top Logprobs
Structured output
Reasoning
Web search options
Top A
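As a sketch of how a model like this is typically queried through an OpenAI-compatible chat-completions endpoint, the snippet below builds a minimal request payload. The model slug and the choice of fields are assumptions, not taken from this page; per the feature list above, parameters such as Top K, seed, penalties, logit bias, response format, and tool use are marked unsupported, so the payload sticks to broadly supported fields only.

```python
import json

# Hypothetical model slug -- confirm against the provider's model list.
MODEL = "cognitivecomputations/dolphin-mixtral-8x7b"

def build_chat_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Build a minimal chat-completion payload.

    The feature table marks top_k, seed, frequency/presence/repetition
    penalties, logit_bias, response_format, and tools as unsupported for
    this model, so none of those fields are included here.
    """
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.7,  # basic sampling control; widely supported
    }

payload = build_chat_payload("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

Since the page lists prompt and output prices as $0.00 per 1M tokens, cost estimation can be skipped for this variant; the main practical concern is avoiding the unsupported sampling parameters above, which some providers reject with an error rather than silently ignore.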