Ariel Frischer
e6ba8f5dff
Merge pull request #95 from ariel-frischer/test-cmd-args
feat: add `--test-cmd-timeout` option to specify timeout for test command
2025-02-17 15:25:09 -08:00
AI Christianson
7c3c616531
use base latency in programmer tool
2025-02-17 18:21:49 -05:00
Ariel Frischer
685d098f21
feat(docusaurus): add Vercel analytics plugin to enhance performance tracking
feat(models_params.py): add DEFAULT_BASE_LATENCY constant and integrate it into model parameters for latency management
fix(package.json): add @docusaurus/plugin-vercel-analytics dependency to support new analytics feature
2025-02-17 15:19:01 -08:00
AI Christianson
e8e4dac038
add base latency model param
2025-02-17 18:11:33 -05:00
Ariel Frischer
581dc4b761
feat: add `--test-cmd-timeout` option to specify timeout for test command execution
This change introduces a new command-line option `--test-cmd-timeout` to allow users to set a timeout for the execution of test commands. The default timeout is set to 300 seconds. This enhancement provides users with more control over the execution time of their test commands, helping to prevent indefinite hangs during testing.
Additionally, the codebase has been updated to utilize this new timeout setting in relevant areas, ensuring consistent behavior across the application.
2025-02-17 11:03:19 -08:00
AI Christianson
66aa13f6ee
Version bump.
2025-02-13 09:50:00 -05:00
AI Christianson
93a3f8ccd7
remove reference to write_file_tool
2025-02-13 09:47:19 -05:00
AI Christianson
1084faaf0b
normalize/dedupe files for programmer tool
2025-02-13 09:45:32 -05:00
AI Christianson
972ea5284a
be more conservative with external process I/O to prevent hangs
2025-02-13 09:00:29 -05:00
AI Christianson
09e30f2a24
validate expected_runtime_seconds
2025-02-13 08:37:43 -05:00
AI Christianson
098f4a9c53
integrate expected_runtime_seconds with run_shell_command
2025-02-13 08:34:38 -05:00
AI Christianson
abfa5a1d6a
add expected_runtime_seconds and shut down processes with a grace period when they run too long.
2025-02-13 08:29:17 -05:00
AI Christianson
c5c27c9f87
enforce byte limit in interactive commands
2025-02-13 08:03:16 -05:00
AI Christianson
e6b34e4ebc
normalize related file paths
2025-02-12 22:12:55 -05:00
Jose M Leon
10ad8aa8d5
FIX aider flags (#89)
2025-02-12 21:46:14 -05:00
Jose M Leon
9bd0edfd10
FEAT print config at the beginning (#88)
2025-02-12 18:52:24 -05:00
AI Christianson
23dfc024e3
Version bump.
2025-02-12 18:09:38 -05:00
AI Christianson
e3d0de332f
use parent tty width
2025-02-12 18:07:38 -05:00
AI Christianson
3d3b3cfba4
set env vars to disable common interactive modes
2025-02-12 18:04:27 -05:00
AI Christianson
18b0ce230e
handle raw input in interactive
2025-02-12 17:59:34 -05:00
AI Christianson
7bae09e829
add pyte; improve status panel output
2025-02-12 17:21:03 -05:00
AI Christianson
1fbaeac308
use aider from the current env
2025-02-12 17:17:30 -05:00
AI Christianson
4d14b9747f
fix interactive command input
2025-02-12 17:08:37 -05:00
AI Christianson
0c8a4009dc
fix bug where completion message was wiped too early
2025-02-12 16:58:39 -05:00
AI Christianson
a169ed8517
disable put_complete_file_contents; improve prompts; improve status panel output
2025-02-12 16:20:24 -05:00
AI Christianson
905ed2c8fc
improve expert model auto detection
2025-02-12 15:55:47 -05:00
AI Christianson
e3a705eb9b
auto detect openai expert models
2025-02-12 15:40:21 -05:00
AI Christianson
94f0d96654
lower interactive shell history to conserve context; improve prompts
2025-02-12 15:26:36 -05:00
AI Christianson
6f9ed9562d
increase history; fix test
2025-02-12 15:01:02 -05:00
AI Christianson
dc079c5d0e
improve interactive tty process capture
2025-02-12 14:58:58 -05:00
AI Christianson
7598d42cf9
set temperature param on all initialize_llm calls
2025-02-12 14:15:05 -05:00
AI Christianson
a1371fc7e0
support default temp on a per-model basis; show status panel
2025-02-12 13:38:52 -05:00
AI Christianson
264f5025ed
Refactor write file tool so it is easier for LLMs to use properly.
2025-02-12 11:51:19 -05:00
AI Christianson
149e8e2251
set timeouts on llm clients
2025-02-10 11:41:27 -05:00
Jose M Leon
00a455d586
FIX do not default to o1 model (#82)
2025-02-08 20:28:10 -05:00
AI Christianson
0c86900ce4
Reduce tool count to make tool calling more reliable.
2025-02-08 18:26:08 -05:00
AI Christianson
13016278e5
prompt improvements
2025-02-08 16:10:24 -05:00
AI Christianson
4c0c2e2ccf
prompt improvements
2025-02-08 15:54:18 -05:00
AI Christianson
f40e11ee21
improve work logging; use reasoning_effort=high for openai expert models; improve prompts
2025-02-08 14:36:08 -05:00
AI Christianson
5fad3fc755
make cwd/current date available to more agents
2025-02-08 13:58:16 -05:00
AI Christianson
ea992960c1
prompt improvements
2025-02-08 13:36:30 -05:00
AI Christianson
c27a75bc26
get rid of pointless truncation message
2025-02-08 12:44:55 -05:00
AI Christianson
5861f3a2bf
Adjust token/bytes ratio to resolve errors on swebench-lite.
2025-02-08 08:07:37 -05:00
AI Christianson
0270a9a349
Version bump.
2025-02-02 18:54:27 -05:00
AI Christianson
53ccc46392
Add model params for o1 and o3-mini.
2025-02-02 18:48:41 -05:00
Ariel Frischer
1cc5b8e16c
refactor: clean up imports and improve code formatting for better readability
fix: ensure proper error handling and logging in WebSocket connections
feat: enhance WebSocket server to handle streaming and error messages more effectively
chore: update model parameters and configurations for better performance and maintainability
test: improve test coverage for model token limits and agent creation logic
2025-02-01 13:00:15 -08:00
Ariel Frischer
c2ba638a95
feat(agent_utils.py): enhance get_model_token_limit to support agent types for better configuration management
test(agent_utils.py): add tests for get_model_token_limit with different agent types to ensure correct functionality
2025-02-01 12:55:36 -08:00
AI Christianson
dd8b9c0d30
Improve prompts.
2025-01-31 17:45:47 -05:00
AI Christianson
e95c13a6d0
Fix temperature param handling.
2025-01-31 17:10:49 -05:00
AI Christianson
e3414890ff
Add logic to handle whether a model supports temperature or not.
2025-01-31 17:04:52 -05:00