dolphin-2.9.1-yi-1.5-34b is a 34B-parameter large language model fine-tuned from Yi-1.5-34B by Cognitive Computations, with training led by Eric Hartford and collaborators. Built with Axolotl and using the ChatML prompt format, it supports 8k-token sequences via RoPE theta scaling, beyond the base model's 4k context limit. The model performs well at instruction following, open-ended dialogue, and coding, and shows early agentic behavior such as function calling. It is intentionally uncensored and tuned for high compliance: the training data was filtered to remove alignment and bias. While powerful and permissive, users are advised to implement their own safeguards before deploying it in public-facing applications.
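ChatML wraps each conversational turn in <|im_start|> and <|im_end|> markers. A minimal prompt sketch follows; the system message shown is the one conventionally used with Dolphin models, but any system text works:

```
<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
Summarize RoPE theta scaling in one sentence.<|im_end|>
<|im_start|>assistant
```

The trailing `<|im_start|>assistant` line cues the model to generate its reply, which ends with `<|im_end|>`.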
Features
- Based on Yi-1.5-34B with full-weight fine-tuning
- Supports 8k token context with RoPE theta 1,000,000
- ChatML prompt format with system/user/assistant roles (see the inference sketch after this list)
- Handles conversation, coding, instruction following, and function calling
- Uncensored: filtered for minimal alignment and bias
- Trained on GPT-4–generated data and other datasets
- Apache 2.0 license allows commercial use
- Built using Axolotl on 8x H100 GPUs
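Putting the pieces together, here is a minimal inference sketch using the Hugging Face Transformers API. The repository id and the assumption that the tokenizer ships a ChatML chat template follow the model's published card; treat both as assumptions to verify:

```python
# Minimal sketch, assuming the repo id below and that the tokenizer
# ships a ChatML chat template, per the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/dolphin-2.9.1-yi-1.5-34b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto"
)

# The rope_theta value in the shipped config is what extends the usable
# context to 8k; the attribute name assumes the Llama-style config Yi uses.
print(model.config.rope_theta)

messages = [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
# apply_chat_template renders the messages into the ChatML layout shown above
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the model is uncensored, any moderation has to be layered on top (for example, a restrictive system prompt plus an output filter) before exposing it to end users.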
Categories
AI Models