Hermes 2 Mixtral 8x7B DPO


Context Length
32,768 tokens
Prompt Price
$0.60 / 1M tokens
Output Price
$0.60 / 1M tokens
Feature Support
7 of 16

Model Overview

Nous Hermes 2 Mixtral 8x7B DPO is the new flagship Nous Research model, trained over the [Mixtral 8x7B MoE LLM](/models/mistralai/mixtral-8x7b). The model was trained on over 1,000,000 entries of primarily [GPT-4](/models/openai/gpt-4)-generated data, as well as other high-quality data from open datasets across the AI landscape, achieving state-of-the-art performance on a variety of tasks. #moe

Basic Information

Developer
nousresearch
Model Series
Mistral
Release Date
2024-01-16
Context Length
32,768 tokens
Max Completion Tokens
2,048 tokens
Variant
standard
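Given the 32,768-token context window and 2,048-token completion cap listed above, a request is only valid if the prompt leaves room for the completion. A minimal sketch of that check (the helper name is ours, and we assume the common convention that prompt and completion share one context window, which this page does not state explicitly):

```python
# Limits from the model card above.
CONTEXT_LENGTH = 32_768        # total tokens (prompt + completion, assumed shared)
MAX_COMPLETION_TOKENS = 2_048  # hard cap on generated tokens

def fits_context(prompt_tokens: int, completion_tokens: int = MAX_COMPLETION_TOKENS) -> bool:
    """Check whether a prompt plus its completion budget fits the context window."""
    if completion_tokens > MAX_COMPLETION_TOKENS:
        return False
    return prompt_tokens + completion_tokens <= CONTEXT_LENGTH

# The largest prompt that still leaves room for a full 2,048-token completion:
print(fits_context(30_720))  # 30,720 + 2,048 == 32,768 -> True
print(fits_context(30_721))  # one token over -> False
```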

Pricing Information

Prompt Tokens
$0.60 / 1M tokens
Completion Tokens
$0.60 / 1M tokens
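Because prompt and completion tokens are billed at the same $0.60 per million, estimating a request's cost is a single multiplication. A small sketch using the rates above:

```python
# Rates from the pricing table above, in USD per 1M tokens.
PROMPT_PRICE_PER_M = 0.60
COMPLETION_PRICE_PER_M = 0.60

def request_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return (prompt_tokens * PROMPT_PRICE_PER_M
            + completion_tokens * COMPLETION_PRICE_PER_M) / 1_000_000

# e.g. a 10,000-token prompt with a 2,000-token completion:
print(f"${request_cost(10_000, 2_000):.4f}")  # $0.0072
```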

Supported Features

Supported (7)

Top K
Frequency Penalty
Presence Penalty
Repetition Penalty
Response Format
Min P
Logit Bias

Unsupported (9)

Image Input
Seed
Tool Usage
Logprobs
Top Logprobs
Structured Outputs
Reasoning
Web Search Options
Top A
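Assuming an OpenAI-compatible chat completions endpoint (an assumption; this page does not specify the API shape), a request can combine the supported sampling parameters above, while the unsupported ones (seed, tool usage, logprobs, and so on) should simply be omitted. A hedged sketch; the model slug and field names follow common convention and are not taken from this page:

```python
# Hypothetical request body for an OpenAI-compatible endpoint.
payload = {
    "model": "nousresearch/nous-hermes-2-mixtral-8x7b-dpo",  # assumed slug
    "messages": [
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."}
    ],
    # Sampling controls from the "Supported" list above:
    "top_k": 40,
    "min_p": 0.05,
    "frequency_penalty": 0.2,
    "presence_penalty": 0.1,
    "repetition_penalty": 1.1,
    "response_format": {"type": "text"},
    "logit_bias": {},  # token-id -> bias map, empty here
}

# Parameters listed as unsupported, and therefore left out of the payload:
unsupported = {"seed", "tools", "logprobs", "top_logprobs", "web_search_options", "top_a"}
assert unsupported.isdisjoint(payload)
```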

Usage Statistics

Usage Rank
#281 of 345 models
Total Tokens (Last 30 Days)
41.44M
Daily Average Usage
1.38M tokens
Weekly Usage Change
21%
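The daily average shown above is just the 30-day total divided by 30; a one-line check:

```python
total_tokens_30d = 41.44e6  # total tokens over the last 30 days
daily_average = total_tokens_30d / 30
print(f"{daily_average / 1e6:.2f}M")  # 1.38M, matching the figure above
```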

Usage Trend for the Last 30 Days

Models by Same Author (nousresearch)

DeepHermes 3 Mistral 24B Preview (free)
32,768 tokens
Free
DeepHermes 3 Llama 3 8B Preview (free)
131,072 tokens
Free
Hermes 3 70B Instruct
131,072 tokens
$0.12 / $0.30
Hermes 3 405B Instruct
131,072 tokens
$0.70 / $0.80
Hermes 2 Theta 8B
16,384 tokens
$0.00 / $0.00

Similar Price Range Models

WizardLM-2 8x22B
microsoft
65,536 tokens
$0.48 / $0.48
UnslopNemo 12B
thedrummer
32,000 tokens
$0.45 / $0.45
Arcee Blitz
arcee-ai
32,768 tokens
$0.45 / $0.75