Shisa V2 Llama 3.3 70B (free): detailed model information and pricing

Context length 32,768 tokens, provided by shisa-ai

Context tokens: 32,768
Prompt price: Free
Output price: Free
Features supported: 9/16

Model Overview

Shisa V2 Llama 3.3 70B is a bilingual Japanese-English chat model fine-tuned by Shisa.AI on Meta’s Llama-3.3-70B-Instruct base. It prioritizes Japanese language performance while retaining strong English capabilities. The model was optimized entirely through post-training, using a refined mix of supervised fine-tuning (SFT) and DPO datasets including regenerated ShareGPT-style data, translation tasks, roleplaying conversations, and instruction-following prompts. Unlike earlier Shisa releases, this version avoids tokenizer modifications or extended pretraining. Shisa V2 70B achieves leading Japanese task performance across a wide range of custom and public benchmarks, including JA MT Bench, ELYZA 100, and Rakuda. It supports a 128K token context length and integrates smoothly with inference frameworks like vLLM and SGLang. While it inherits safety characteristics from its base model, no additional alignment was applied. The model is intended for high-performance bilingual chat, instruction following, and translation tasks across JA/EN.
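Because the model is served through OpenAI-compatible frameworks such as vLLM and SGLang, a chat request can be issued with the standard openai Python client. The sketch below is a minimal example, not a definitive integration: the base_url, API key, and the model slug shisa-ai/shisa-v2-llama3.3-70b:free are assumptions and should be replaced with the values of whichever endpoint actually hosts the model.

```python
# Minimal sketch: querying the model through an OpenAI-compatible endpoint
# (vLLM and SGLang both expose this API). The base_url, api_key, and model
# slug below are assumptions -- substitute the values for your deployment.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-endpoint/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",                  # placeholder
)

response = client.chat.completions.create(
    model="shisa-ai/shisa-v2-llama3.3-70b:free",  # assumed model slug
    messages=[
        {"role": "system", "content": "あなたは親切なアシスタントです。"},
        {"role": "user", "content": "日本語と英語の両方で自己紹介してください。"},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```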

Basic Information

Developer: shisa-ai
Model family: Llama3
Release date: 2025-04-15
Context length: 32,768 tokens
Variant: free

Pricing

This model is free to use.

Data Policy

Terms of Use

Training Policy


Supported Features

Supported (9) (a request sketch using these parameters appears after this section)

Top K
Seed
Frequency penalty
Presence penalty
Repetition penalty
Min P
Logit bias
Logprobs
Top Logprobs

Not Supported (7)

Image input
Response format
Tool use
Structured outputs
Reasoning
Web search options
Top A
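The nine supported parameters above map onto a chat request roughly as shown below. This is a sketch under stated assumptions, not an exhaustive reference: frequency_penalty, presence_penalty, seed, logit_bias, logprobs, and top_logprobs are standard fields of the OpenAI-compatible chat API, while top_k, min_p, and repetition_penalty are usually passed as extra body fields whose exact names depend on the serving gateway (the names used here are assumptions).

```python
# Sketch of a request exercising the nine supported sampling parameters.
# Standard OpenAI-compatible fields go in the normal arguments; top_k, min_p,
# and repetition_penalty are sent via extra_body, which is how OpenAI-
# compatible gateways typically accept non-standard sampling options
# (those field names are an assumption and may differ per server).
from openai import OpenAI

client = OpenAI(base_url="https://example-endpoint/v1", api_key="YOUR_API_KEY")

response = client.chat.completions.create(
    model="shisa-ai/shisa-v2-llama3.3-70b:free",  # assumed model slug
    messages=[{"role": "user", "content": "東京の観光名所を3つ教えてください。"}],
    seed=42,                      # Seed
    frequency_penalty=0.1,        # Frequency penalty
    presence_penalty=0.1,         # Presence penalty
    logit_bias={"1234": -5},      # Logit bias (token id -> bias)
    logprobs=True,                # Logprobs
    top_logprobs=5,               # Top Logprobs
    extra_body={
        "top_k": 40,              # Top K
        "min_p": 0.05,            # Min P
        "repetition_penalty": 1.05,  # Repetition penalty
    },
)

print(response.choices[0].message.content)
```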

Usage Statistics

Rank: #203 out of 346 models
Total tokens (last 30 days): 294.03M
Average daily usage: 9.80M
Weekly usage change: 21%

Usage trend over the last 30 days