RA.Aid/ra_aid
Ariel Frischer 5240fb2617 (latest commit, 2025-01-23 14:48:30 -05:00)
feat(agent_utils.py): integrate litellm to retrieve model token limits for better flexibility (#51)
fix(agent_utils.py): rename max_tokens to max_input_tokens for clarity in the state_modifier function
fix(models_tokens.py): update deepseek-reasoner token limit to 64000 for accuracy
test(agent_utils.py): add tests for litellm integration and fallback logic in the get_model_token_limit function
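The litellm lookup with fallback logic described in the commit above might look roughly like the sketch below. This is a minimal illustration, not the actual ra_aid code: litellm.get_model_info is a real litellm helper, but FALLBACK_TOKEN_LIMITS and DEFAULT_MAX_INPUT_TOKENS are hypothetical stand-ins for the project's models_tokens.py dictionary and its default, and the real get_model_token_limit may take different arguments.

```python
import litellm

# Hypothetical fallback table standing in for ra_aid's models_tokens.py mapping.
FALLBACK_TOKEN_LIMITS = {
    "deepseek": {"deepseek-reasoner": 64000},
}

DEFAULT_MAX_INPUT_TOKENS = 100_000  # assumed default when no limit can be found


def get_model_token_limit(provider: str, model: str) -> int:
    """Return the model's max input tokens, preferring litellm's metadata."""
    try:
        # litellm.get_model_info returns metadata (including max_input_tokens)
        # for models it knows about; unknown models raise an exception.
        info = litellm.get_model_info(f"{provider}/{model}")
        limit = info.get("max_input_tokens") or info.get("max_tokens")
        if limit:
            return int(limit)
    except Exception:
        pass  # fall back to the local dictionary below

    return FALLBACK_TOKEN_LIMITS.get(provider, {}).get(model, DEFAULT_MAX_INPUT_TOKENS)
```

Under these assumptions, get_model_token_limit("deepseek", "deepseek-reasoner") would return litellm's reported max_input_tokens when available and otherwise fall back to the 64000 entry noted in the commit message.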
agents                Set Default Max Token Limit with Provider/Model Dictionary and Limit Tokens for Anthropic Claude React Agent (#45) 2025-01-20 14:41:29 -05:00
chat_models           Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
console               Improve agent interruption UX by allowing user to specify feedback or exit the program entirely. 2024-12-23 14:05:59 -05:00
proc                  macOS compatibility (#23) 2024-12-30 12:38:06 -05:00
tests                 FEAT fix command line args and env var dependencies on anthropic (#21) 2024-12-28 16:53:57 -05:00
text                  Initial commit 2024-12-10 19:01:20 -05:00
tools                 Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
__init__.py           Improve agent interruption UX by allowing user to specify feedback or exit the program entirely. 2024-12-23 14:05:59 -05:00
__main__.py           Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
__version__.py        Bump version 2025-01-22 11:10:47 -05:00
agent_utils.py        feat(agent_utils.py): integrate litellm to retrieve model token limits for better flexibility (#51) 2025-01-23 14:48:30 -05:00
config.py             Add configurable --recursion-limit argument (#46) 2025-01-21 11:46:56 -05:00
dependencies.py       FEAT add function to check for dependencies on startup (#27) 2025-01-01 09:35:41 -05:00
env.py                Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
exceptions.py         ciayn 2024-12-28 14:41:39 -05:00
file_listing.py       Get project info programmatically to save tokens. 2025-01-09 14:43:42 -05:00
llm.py                Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
logging_config.py     FEAT add verbose logging 2024-12-26 00:45:57 +00:00
models_tokens.py      feat(agent_utils.py): integrate litellm to retrieve model token limits for better flexibility (#51) 2025-01-23 14:48:30 -05:00
project_info.py       Integrate project info into chat prompt. 2025-01-09 15:01:39 -05:00
project_state.py      Fix test_file_as_directory. 2025-01-09 15:04:21 -05:00
prompts.py            Increase recursion limit; Handle 429 better; Improve prompts. 2025-01-09 15:52:51 -05:00
provider_strategy.py  Add Deepseek Provider Support and Custom Deepseek Reasoner Chat Model (#50) 2025-01-22 07:21:10 -05:00
tool_configs.py       Integrate project info into chat prompt. 2025-01-09 15:01:39 -05:00