| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| README.md | 2025-02-03 | 1.0 kB | |
| v0.6.0 source code.tar.gz | 2025-02-03 | 2.7 MB | |
| v0.6.0 source code.zip | 2025-02-03 | 3.1 MB | |
| Totals: 3 items | | 5.8 MB | 2 |
## What's Changed
- Update cost calculations for cached input tokens
- Add OpenAI o1 and o3-mini
- Add DeepSeek R1 for multiple providers
- Add CePO (Cerebras Planning and Optimization), a multi-agent implementation of the LLM interface
- Cache agents and chats in the UI
- Add temperature, topK and other sampling options to the chat UI
- Add chat keyboard shortcuts
- Update the Perplexity LLM and tool to use the sonar models
- Add an LLM maxRetries option (see the retry sketch after this list)
- Update project detection to use the filesystem tree to handle larger projects
- Update file listing to account for .gitignore files in parent directories (see the file-listing sketch after this list)
- Add a multi-agent debate LLM implementation with reasoning models (see the debate sketch after this list)
- Create a fastLlama70b.ts LLM implementation
- Update GitLab merge request review
- Various other fixes and updates
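The release notes don't describe the maxRetries option further, so the following is only a minimal sketch of what a per-call retry limit around an LLM request might look like. The `GenerateOptions` shape, the `generateWithRetries` helper and the backoff values are illustrative assumptions, not Sophia's actual LLM interface.

```typescript
// Minimal sketch of a per-call retry limit for an LLM request.
// The options shape, helper name and backoff are assumptions,
// not Sophia's actual API.
interface GenerateOptions {
  maxRetries?: number; // how many times to retry after the first failure
  temperature?: number;
  topK?: number;
}

async function generateWithRetries(
  llmCall: () => Promise<string>,
  opts: GenerateOptions = {},
): Promise<string> {
  const maxRetries = opts.maxRetries ?? 3;
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await llmCall();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break; // out of retries
      // Simple exponential backoff before the next attempt: 1s, 2s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** attempt));
    }
  }
  throw lastError;
}
```

A caller would wrap whatever request function it already has, e.g. `generateWithRetries(() => client.complete(prompt), { maxRetries: 5 })`, where `client.complete` is likewise hypothetical.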
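For the file-listing change, here is a hedged sketch of how .gitignore rules from parent directories could be honoured when listing a project's files. It uses the `ignore` npm package; the function names are hypothetical, and the pattern rebasing that Git performs for nested .gitignore files is deliberately simplified away.

```typescript
// Sketch: honour .gitignore files in the target directory and all of its
// parents when listing files. Not Sophia's actual implementation.
import fs from 'node:fs';
import path from 'node:path';
import ignore from 'ignore';

function loadGitignores(dir: string) {
  const ig = ignore();
  // Collect .gitignore files from `dir` up to the filesystem root.
  const files: string[] = [];
  let current = path.resolve(dir);
  while (true) {
    const candidate = path.join(current, '.gitignore');
    if (fs.existsSync(candidate)) files.push(candidate);
    const parent = path.dirname(current);
    if (parent === current) break; // reached the root
    current = parent;
  }
  // Apply parent rules first so deeper (more specific) rules win.
  for (const file of files.reverse()) ig.add(fs.readFileSync(file, 'utf8'));
  return ig;
}

function listProjectFiles(dir: string): string[] {
  const ig = loadGitignores(dir);
  return fs
    .readdirSync(dir, { recursive: true })
    .map((rel) => String(rel).split(path.sep).join('/'))
    .filter((rel) => !ig.ignores(rel));
}
```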
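And for the multi-agent debate implementation, a minimal sketch of a generic debate loop: several agents draft answers, revise them after seeing each other's drafts, and a judge call synthesizes the final answer. The `LlmCall` type, the prompts and the round count are assumptions; Sophia's actual implementation with reasoning models may differ.

```typescript
// Minimal sketch of a multi-agent debate loop over a generic LLM call.
// The LlmCall type and prompts are illustrative only.
type LlmCall = (prompt: string) => Promise<string>;

async function debate(
  question: string,
  agents: LlmCall[],
  judge: LlmCall,
  rounds = 2,
): Promise<string> {
  // Each agent drafts an initial answer independently.
  let answers = await Promise.all(agents.map((agent) => agent(question)));

  for (let round = 0; round < rounds; round++) {
    // Each agent revises its answer after seeing the other agents' answers.
    answers = await Promise.all(
      agents.map((agent, i) =>
        agent(
          `Question: ${question}\n` +
            `Other answers:\n${answers.filter((_, j) => j !== i).join('\n---\n')}\n` +
            `Your previous answer:\n${answers[i]}\n` +
            `Critique the other answers and improve your own.`,
        ),
      ),
    );
  }

  // A judge call synthesizes the final answer from the debated drafts.
  return judge(
    `Question: ${question}\nCandidate answers:\n${answers.join('\n---\n')}\n` +
      `Produce the single best final answer.`,
  );
}
```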
## New Contributors
- @shaneholloman made their first contribution in https://github.com/TrafficGuard/sophia/pull/45
Full Changelog: https://github.com/TrafficGuard/sophia/compare/v0.5.0...v0.6.0