I sometimes get stuck when working in agent mode, with the following error:
agent> how are you?
Processing...
LiteLLM API Error: litellm.BadRequestError: OpenrouterException - {"error":{"message":"Provider returned error","code":400,"metadata":{"raw":"{\"object\":\"error\",\"message\":\"Assistant message must have either content or tool_calls, but not none.\",\"type\":\"invalid_request_assistant_message\",\"param\":null,\"code\":\"3240\"}","provider_name":"Mistral"}},"user_id":"user_<removed>"}
litellm.APIError: Error building chunks for logging/streaming usage calculation
Further observations
These are still assumptions at the moment; I'll have to observe this a bit longer to make them more precise.
- It seems that this happened after a warning about an empty response from the model.
- It seems to happen when using e.g. openrouter/mistralai/devstral-2512:free, but not so much when using openai/gpt-5-mini.
Session file extract
I found the following fragment in the session file; maybe it is helpful:
{
  "role": "tool",
  "tool_call_id": "mMhnRMwnh",
  "content": "Task Finished!"
},
{
  "content": "",
  "role": "assistant",
  "tool_calls": null,
  "function_call": null,
  "provider_specific_fields": null,
  "reasoning_content": ""
},
{
  "role": "user",
  "content": "<context name=\"user_input\">\nAdd a hint into the agents file about how to build the documentation\n</context>"
}
I think it got stuck after the middle item, which is empty in every field.
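The provider error ("Assistant message must have either content or tool_calls, but not none.") matches that middle item exactly. A possible client-side workaround would be to drop such empty assistant messages from the history before resending it. This is only a sketch of the idea, not code from Aider-CE; the function name is hypothetical:

```python
def drop_empty_assistant_messages(messages):
    """Remove assistant messages that have neither content nor tool_calls.

    Providers such as Mistral reject these requests with an
    'invalid_request_assistant_message' error (hypothetical workaround,
    not part of Aider-CE).
    """
    cleaned = []
    for msg in messages:
        if msg.get("role") == "assistant":
            has_content = bool(msg.get("content"))
            has_tool_calls = bool(msg.get("tool_calls"))
            if not has_content and not has_tool_calls:
                continue  # skip the empty message that trips the provider
        cleaned.append(msg)
    return cleaned


# The fragment from the session file, abridged:
history = [
    {"role": "tool", "tool_call_id": "mMhnRMwnh", "content": "Task Finished!"},
    {"content": "", "role": "assistant", "tool_calls": None},
    {"role": "user", "content": "Add a hint about building the docs"},
]
print(drop_empty_assistant_messages(history))  # the empty assistant entry is gone
```

Filtering at send time (rather than at save time) would also repair already-saved sessions like the one above.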
Details about my setup
Aider-CE version: 0.90.5.dev+less
On commit 2bef3be
Python version: 3.13.7
Platform: macOS-14.7-arm64-arm-64bit-Mach-O
Python implementation: CPython
Virtual environment: Yes
OS: Darwin 23.6.0 (64bit)
Git version: git version 2.51.0
Configuration:
35turbo: False
4: False
4_turbo: False
4o: False
add_gitignore_files: False
agent_config: {
"large_file_token_threshold": 12500,
"skip_cli_confirmations": false
}
aiderignore: /Users/johannes/wbt/infra/infra-ops/.aiderignore
alias: None
analytics: False
analytics_disable: False
analytics_log: None
analytics_posthog_host: None
apply: None
apply_clipboard_edits: False
assistant_output_color: #0088ff
attribute_author: False
attribute_co_authored_by: False
attribute_commit_message_author: False
attribute_commit_message_committer: False
attribute_committer: False
auto_accept_architect: True
auto_commits: True
auto_lint: True
auto_load: False
auto_save: True
auto_save_session_name: auto-save
auto_test: False
cache_keepalive_pings: 0
cache_prompts: True
chat_history_file: /Users/johannes/wbt/infra/infra-ops/.aider.chat.history.md
chat_language: None
check_model_accepts_settings: True
check_update: True
code_theme: default
command_prefix: None
commit: False
commit_language: None
commit_prompt: None
completion_menu_bg_color: None
completion_menu_color: None
completion_menu_current_bg_color: None
completion_menu_current_color: None
config: None
context_compaction_max_tokens: 209715
context_compaction_summary_tokens: 4096
copy_paste: False
dark_mode: False
debug: False
deepseek: False
detect_urls: True
dirty_commits: True
disable_playwright: False
dry_run: False
edit_format: agent
editor: None
editor_edit_format: None
editor_model: None
enable_context_compaction: True
encoding: utf-8
env_file: /.aider.env
exit: False
fancy_input: True
file: None
files: []
git: True
git_commit_verify: False
gitignore: True
haiku: False
input_history_file: /Users/johannes/wbt/infra/infra-ops/.aider.input.history
install_main_branch: False
just_check_update: False
light_mode: False
line_endings: platform
linear_output: True
lint: False
lint_cmd: []
list_models: None
llm_history_file: None
load: None
map_cache_dir: .
map_max_line_length: 100
map_memory_cache: False
map_multiplier_no_files: 2
map_refresh: files
map_tokens: None
max_chat_history_tokens: None
mcp_servers: ['context7', 'fetch', 'nixos']
mcp_servers_file: None
mcp_transport: stdio
message: None
message_file: None
mini: False
model: openrouter/mistralai/devstral-2512:free
model_metadata_file: .aider.model.metadata.json
model_overrides: None
model_overrides_file: .aider.model.overrides.yml
model_settings_file: .aider.model.settings.yml
multiline: False
notifications: False
notifications_command: None
o1_mini: False
o1_preview: False
openai_api_base: None
openai_api_deployment_id: None
openai_api_type: None
openai_api_version: None
openai_organization_id: None
opus: False
preserve_todo_list: False
pretty: True
read: ['/.codex/AGENTS.md']
reasoning_effort: None
restore_chat_history: False
set_env: []
shell_completions: None
show_diffs: False
show_model_warnings: True
show_prompts: False
show_release_notes: None
show_repo_map: False
skip_sanity_check_repo: False
sonnet: False
stream: True
subtree_only: False
suggest_shell_commands: True
test: False
test_cmd: []
thinking_tokens: None
timeout: None
tool_error_color: #FF2222
tool_output_color: None
tool_warning_color: #FFA500
tui: None
tui_config: None
tweak_responses: False
upgrade: False
use_enhanced_map: False
user_input_color: #00cc00
verbose: False
verify_ssl: True
vim: False
voice_format: wav
voice_input_device: None
voice_language: en
watch_files: False
weak_model: None
yes_always: None
yes_always_commands: False
cecli v0.90.5.dev+less
Model: openrouter/mistralai/devstral-2512:free with agent edit format
Git repo: .git with 49 files
Repo-map: using 4096 tokens, files refresh
MCP servers configured: context7, fetch, nixos, local_tools
Restored previous conversation history.