Sorry, this is on an old version, so feel free to ignore it (I thought I'd pulled and merged this morning; maybe I should have read the message).
The precursor to this was switching models to 3 flash, which triggered a (still badly formatted) litellm error:
Aider-CE version: 0.90.1.dev2
Python version: 3.11.11
Platform: Linux-6.8.0-90-generic-x86_64-with-glibc2.39
Python implementation: CPython
Virtual environment: Yes
OS: Linux 6.8.0-90-generic (64bit)
Git version: git version 2.43.0
Configuration:
35turbo: False
4: False
4_turbo: False
4o: False
add_gitignore_files: False
agent_config: None
aiderignore: /home/erich/sage/.aiderignore
alias: ['flash:gemini/gemini-2.5-flash-lite', 'pro:gemini/gemini-2.5-pro', 'dschat:deepseek/deepseek-chat', 'ds:deepseek/deepseek-coder', 'dsr:deepseek/deepseek-reasoner', 'hf:huggingface/distilbert-base-uncased', 'k2i:fireworks_ai/accounts/fireworks/models/kimi-k2-instruct-0905', 'k2t:fireworks_ai/accounts/fireworks/models/kimi-k2-thinking']
analytics: None
analytics_disable: False
analytics_log: None
analytics_posthog_host: None
apply: None
apply_clipboard_edits: False
assistant_output_color: #0088ff
attribute_author: None
attribute_co_authored_by: True
attribute_commit_message_author: False
attribute_commit_message_committer: False
attribute_committer: None
auto_accept_architect: True
auto_commits: True
auto_lint: True
auto_load: False
auto_save: False
auto_save_session_name: auto-save
auto_test: False
cache_keepalive_pings: 10
cache_prompts: True
chat_history_file: /home/erich/sage/.aider.chat.history.md
chat_language: None
check_model_accepts_settings: True
check_update: False
code_theme: default
command_prefix: None
commit: False
commit_language: None
commit_prompt: None
completion_menu_bg_color: None
completion_menu_color: None
completion_menu_current_bg_color: None
completion_menu_current_color: None
config: /home/erich/bin/aider.yml
context_compaction_max_tokens: None
context_compaction_summary_tokens: 4096
copy_paste: False
dark_mode: False
debug: True
deepseek: False
detect_urls: True
dirty_commits: True
disable_playwright: False
dry_run: False
edit_format: None
editor: None
editor_edit_format: None
editor_model: None
enable_context_compaction: False
encoding: utf-8
env_file: /home/erich/sage/.env
exit: False
fancy_input: True
file: None
files: []
git: True
git_commit_verify: False
gitignore: True
haiku: False
input_history_file: /home/erich/sage/.aider.input.history
install_main_branch: False
just_check_update: False
light_mode: False
line_endings: platform
linear_output: True
lint: False
lint_cmd: ['/bin/lint-all']
list_models: None
llm_history_file: .aider.llm.history
load: None
map_cache_dir: .
map_max_line_length: 100
map_memory_cache: False
map_multiplier_no_files: 2
map_refresh: files
map_tokens: None
max_chat_history_tokens: None
mcp_servers: None
mcp_servers_file: None
mcp_transport: stdio
message: None
message_file: None
mini: False
model: gemini/gemini-3-pro-preview
model_metadata_file: /home/erich/bin/aider.model.metadata.json
model_overrides: None
model_overrides_file: .aider.model.overrides.yml
model_settings_file: /home/erich/bin/aider.model.settings.yml
multiline: True
notifications: False
notifications_command: None
o1_mini: False
o1_preview: False
openai_api_base: None
openai_api_deployment_id: None
openai_api_type: None
openai_api_version: None
openai_organization_id: None
opus: False
preserve_todo_list: False
pretty: True
read: ['/bin/convention.md']
reasoning_effort: None
restore_chat_history: False
set_env: []
shell_completions: None
show_diffs: False
show_model_warnings: True
show_prompts: False
show_release_notes: None
show_repo_map: False
skip_sanity_check_repo: False
sonnet: False
stream: True
subtree_only: False
suggest_shell_commands: True
test: False
test_cmd: []
thinking_tokens: None
timeout: None
tool_error_color: #FF2222
tool_output_color: None
tool_warning_color: #FFA500
tui: False
tui_config: None
tweak_responses: False
upgrade: False
use_enhanced_map: False
user_input_color: #00cc00
verbose: False
verify_ssl: True
vim: False
voice_format: wav
voice_input_device: None
voice_language: en
watch_files: False
weak_model: gemini/gemini-2.5-flash-lite
yes_always: None
yes_always_commands: False
An uncaught exception occurred:
Traceback (most recent call last):
File "base_coder.py", line 2818, in add_assistant_reply_to_cur_messages
response_dict = response.model_dump()
^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'model_dump'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "cecli", line 10, in <module>
sys.exit(main())
^^^^^^
File "main.py", line 605, in main
return asyncio.run(main_async(argv, input, output, force_git_root, return_coder))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "base_events.py", line 654, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "main.py", line 1426, in main_async
await coder.run()
File "base_coder.py", line 1230, in run
return await self._run_linear(with_message, preproc)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "io.py", line 1033, in stop_output_task
await output_task
TypeError: 'NoneType' object is not iterable
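For what it's worth, the first traceback suggests `response` arrives as `None` in `add_assistant_reply_to_cur_messages` after the failed litellm call, and `model_dump()` is invoked unconditionally. A minimal guard might look like the sketch below; the function and class names here are hypothetical illustrations, not aider's actual code:

```python
def response_to_dict(response):
    """Return a plain dict for a completion response, or None if the call failed.

    Hypothetical guard mirroring the failing line in the traceback:
    `response` can be None when the provider errors out, so check
    before calling model_dump().
    """
    if response is None:
        return None
    # Pydantic v2 models expose model_dump(); fall back to vars() otherwise.
    if hasattr(response, "model_dump"):
        return response.model_dump()
    return dict(vars(response))


class FakeResponse:
    """Stand-in for a successful litellm/pydantic response object."""

    def model_dump(self):
        return {"role": "assistant", "content": "hi"}


print(response_to_dict(None))            # None instead of AttributeError
print(response_to_dict(FakeResponse()))  # {'role': 'assistant', 'content': 'hi'}
```

The second `TypeError` (`'NoneType' object is not iterable` in `stop_output_task`) looks like fallout from the first exception leaving `output_task` unset, so a fix at the first site would likely resolve both.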