Shisa V2 Llama 3.3 70B (free): detailed model information and pricing

Context length 32,768 tokens, provided by shisa-ai

Context tokens: 32,768
Prompt price: Free
Output price: Free
Feature support: 9/16

Model Description

Shisa V2 Llama 3.3 70B is a bilingual Japanese-English chat model fine-tuned by Shisa.AI on Meta’s Llama-3.3-70B-Instruct base. It prioritizes Japanese language performance while retaining strong English capabilities. The model was optimized entirely through post-training, using a refined mix of supervised fine-tuning (SFT) and DPO datasets including regenerated ShareGPT-style data, translation tasks, roleplaying conversations, and instruction-following prompts. Unlike earlier Shisa releases, this version avoids tokenizer modifications or extended pretraining. Shisa V2 70B achieves leading Japanese task performance across a wide range of custom and public benchmarks, including JA MT Bench, ELYZA 100, and Rakuda. It supports a 128K token context length and integrates smoothly with inference frameworks like vLLM and SGLang. While it inherits safety characteristics from its base model, no additional alignment was applied. The model is intended for high-performance bilingual chat, instruction following, and translation tasks across JA/EN.
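
As a minimal sketch of the vLLM integration mentioned above, the model can be served through vLLM's OpenAI-compatible server and queried for JA/EN chat. The Hugging Face repository id shisa-ai/shisa-v2-llama3.3-70b, the GPU count, and the prompt are assumptions for illustration, not details confirmed by this page.

```python
# Assumed serve command (repo id and tensor-parallel size are illustrative):
#   vllm serve shisa-ai/shisa-v2-llama3.3-70b --tensor-parallel-size 4

from openai import OpenAI

# vLLM exposes an OpenAI-compatible endpoint locally; the API key is ignored.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="shisa-ai/shisa-v2-llama3.3-70b",  # assumed repo id
    messages=[
        {"role": "system", "content": "You are a bilingual Japanese/English assistant."},
        {"role": "user", "content": "次の文を英語に翻訳してください:「明日は雨が降りそうです。」"},
    ],
    temperature=0.7,
    max_tokens=256,
)
print(resp.choices[0].message.content)
```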

Basic Information

Developer: shisa-ai
Model family: Llama3
Release date: 2025-04-15
Context length: 32,768 tokens
Variant: free

Pricing

This model is free to use.

Data Policy

Terms of use
Training policy


Supported Features

Supported (9): see the request sketch after these lists

Top K
Seed
Frequency penalty
Presence penalty
Repetition penalty
Min P
Logit bias
Logprobs
Top Logprobs

Not supported (7)

Image input
Response format
Tool use
Structured output
Reasoning
Web search options
Top A
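
A minimal sketch of how the supported sampling controls above could be passed to an OpenAI-compatible chat completions endpoint. The gateway URL, the model slug shisa-ai/shisa-v2-llama3.3-70b:free, and all parameter values are assumptions for illustration; top_k, min_p, and repetition_penalty are extensions accepted by some gateways rather than standard OpenAI fields.

```python
import requests

# Hypothetical OpenAI-compatible gateway endpoint; replace with your provider's URL and key.
URL = "https://example-gateway.invalid/api/v1/chat/completions"
HEADERS = {"Authorization": "Bearer YOUR_API_KEY", "Content-Type": "application/json"}

payload = {
    "model": "shisa-ai/shisa-v2-llama3.3-70b:free",  # assumed slug for the free variant
    "messages": [{"role": "user", "content": "日本語で自己紹介してください。"}],
    # The nine sampling controls listed as supported above:
    "top_k": 40,
    "seed": 42,
    "frequency_penalty": 0.1,
    "presence_penalty": 0.1,
    "repetition_penalty": 1.05,
    "min_p": 0.05,
    "logit_bias": {},        # map of token id -> bias; left empty here
    "logprobs": True,
    "top_logprobs": 5,
}

resp = requests.post(URL, headers=HEADERS, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```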

Usage Statistics

Rank: #203 of 346 models
Total tokens (last 30 days): 294.03M
Average daily usage: 9.80M tokens
Week-over-week usage change: 21%

Usage trend over the last 30 days