| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| ra_aid-0.29.1-py3-none-any.whl | 2025-04-24 | 1.0 MB | |
| ra_aid-0.29.1.tar.gz | 2025-04-24 | 13.8 MB | |
| README.md | 2025-04-24 | 1.6 kB | |
| Release v0.29.0 source code.tar.gz | 2025-04-24 | 13.7 MB | |
| Release v0.29.0 source code.zip | 2025-04-24 | 14.0 MB | |
| Totals: 5 Items | | 42.5 MB | 0 |
## [0.29.0] 2025-04-24

### Changed

- **Frontend Port Configuration:**
  - The frontend development server port is now configurable via the `VITE_FRONTEND_PORT` environment variable (defaults to 5173) (`frontend/web/vite.config.js`).
  - The frontend now dynamically determines the backend port using `VITE_BACKEND_PORT` in dev (default 1818) and `window.location.port` in production (`frontend/common/src/store/clientConfigStore.ts`).
- **Expert Model Temperature Handling:** The backend (`ra_aid/llm.py`) now checks whether an expert model supports the `temperature` parameter before passing it, preventing errors with models that don't, such as newer OpenAI models. It continues to set `reasoning_effort` to `"high"` where supported.
- **OpenAI Model Definitions:** Updated the definitions for `o4-mini` and `o3` in `ra_aid/models_params.py` to set `supports_temperature=False` and `supports_reasoning_effort=True`.
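The capability check described above can be sketched as follows. This is a minimal illustration, not the actual `ra_aid/llm.py` code: the `MODEL_PARAMS` table and the `build_llm_kwargs` helper are hypothetical names, though the capability flags (`supports_temperature`, `supports_reasoning_effort`) mirror those named in the changelog.

```python
# Illustrative sketch: only pass `temperature` to models that support it,
# and set reasoning_effort="high" where supported. Names are hypothetical.

MODEL_PARAMS = {
    "o4-mini": {"supports_temperature": False, "supports_reasoning_effort": True},
    "gpt-4o": {"supports_temperature": True, "supports_reasoning_effort": False},
}

def build_llm_kwargs(model_name: str, temperature: float = 0.7) -> dict:
    """Build LLM keyword arguments, omitting parameters the model rejects."""
    params = MODEL_PARAMS.get(model_name, {})
    kwargs = {}
    if params.get("supports_temperature", True):
        kwargs["temperature"] = temperature
    if params.get("supports_reasoning_effort", False):
        kwargs["reasoning_effort"] = "high"
    return kwargs
```

Gating on a per-model capability table like this avoids hard-coding model-name checks at every call site.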
### Added

- **Frontend Development Documentation:** Added instructions to `docs/docs/contributing.md` on running the frontend dev server and configuring ports via environment variables.
- **New OpenAI Model Definitions:** Added definitions for `o4-mini-2025-04-16`, `o3-2025-04-16`, and `o3-mini-2025-01-31` to `ra_aid/models_params.py`.
### Fixed

- **Custom Tool Result Handling:** Ensured results from custom tools are always wrapped in a Langchain `BaseMessage` (`AIMessage`) to maintain consistency (`ra_aid/agent_backends/ciayn_agent.py`).
- **Custom Tool Console Output:** Corrected minor formatting issues (escaped newlines) in the console output message when executing custom tools (`ra_aid/agent_backends/ciayn_agent.py`).
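The result-wrapping fix above can be sketched like this. To keep the example self-contained, `BaseMessage` and `AIMessage` are minimal stand-ins for the Langchain classes, and `wrap_tool_result` is a hypothetical helper, not the actual `ciayn_agent.py` code.

```python
# Illustrative sketch: normalize custom tool results so downstream code
# always sees a message object. Stand-in classes, not langchain_core imports.

class BaseMessage:
    """Minimal stand-in for langchain_core.messages.BaseMessage."""
    def __init__(self, content: str):
        self.content = content

class AIMessage(BaseMessage):
    """Minimal stand-in for langchain_core.messages.AIMessage."""

def wrap_tool_result(result) -> BaseMessage:
    """Return the result unchanged if it is already a message, else wrap it."""
    if isinstance(result, BaseMessage):
        return result
    return AIMessage(content=str(result))
```

Normalizing at the boundary means the agent loop never has to branch on whether a tool returned a plain string or a message.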