diff --git a/.gitignore b/.gitignore
index b34f19f0644..2c147719a65 100644
--- a/.gitignore
+++ b/.gitignore
@@ -18,6 +18,7 @@
!/.flake8
!/.gitignore
!/.pre-commit-config.yaml
+!/CHANGELOG.md
!/CNAME
!/CONTRIBUTING.metadata
!/HISTORY.md
diff --git a/CHANGELOG.md b/CHANGELOG.md
new file mode 100644
index 00000000000..988b4e2c460
--- /dev/null
+++ b/CHANGELOG.md
@@ -0,0 +1,32 @@
+### Merged PRs
+
+* [MCP: #3937](https://github.com/Aider-AI/aider/pull/3937)
+ * [MCP Multi Tool Response](https://github.com/quinlanjager/aider/pull/1)
+* [Navigator Mode: #3781](https://github.com/Aider-AI/aider/pull/3781)
+ * [Navigator Mode Large File Count](https://github.com/Aider-AI/aider/commit/b88a7bda649931798209945d9687718316c7427f)
+ * [Fix navigator mode auto commit](https://github.com/dwash96/aider-ce/issues/38)
+* [Qwen 3: #4383](https://github.com/Aider-AI/aider/pull/4383)
+* [Fuzzy Search: #4366](https://github.com/Aider-AI/aider/pull/4366)
+* [Map Cache Location Config: #2911](https://github.com/Aider-AI/aider/pull/2911)
+* [Enhanced System Prompts: #3804](https://github.com/Aider-AI/aider/pull/3804)
+* [Repo Map File Name Truncation Fix: #4320](https://github.com/Aider-AI/aider/pull/4320)
+* [Read Only Stub Files For Context Window Management: #3056](https://github.com/Aider-AI/aider/pull/3056)
+
+### Other Updates
+
+* [Added Remote MCP Tool Calls With HTTP Streaming](https://github.com/Aider-AI/aider/commit/a86039f73579df7c32fee910967827c9fccdec0d)
+ * [Enforce single tool call at a time](https://github.com/Aider-AI/aider/commit/3346c3e6194096cef64b1899b017bde36a65f794)
+ * [Upgraded MCP dep to 1.12.3 for Remote MCP Tool Calls](https://github.com/dwash96/aider-ce/commit/a91ee1c03627a31093364fd2a09e654781b1b879)
+ * [Updated base Python version to 3.12 to better support navigator mode (may be reverted if the dependency list allows it)](https://github.com/dwash96/aider-ce/commit/9ed416d523c11362a3ba9fc4c02134e0e79d41fc)
+* [Suppress LiteLLM asyncio errors that clutter output](https://github.com/Aider-AI/aider/issues/6)
+* [Updated Docker File Build Process](https://github.com/Aider-AI/aider/commit/cbab01458d0a35c03b30ac2f6347a74fc2b9f662)
+ * [Manually install necessary Ubuntu dependencies](https://github.com/dwash96/aider-ce/issues/14)
+* [.gitignore updates](https://github.com/dwash96/aider-ce/commit/7c7e803fa63d1acd860eef1423e5a03220df6017)
+* [Experimental Context Compaction For Longer Running Generation Tasks](https://github.com/Aider-AI/aider/issues/6)
+* [Edit Before Adding Files and Reflecting](https://github.com/dwash96/aider-ce/pull/22)
+* [Fix Deepseek model configurations](https://github.com/Aider-AI/aider/commit/c839a6dd8964d702172cae007375e299732d3823)
+* [Relax Version Pinning For Easier Distribution](https://github.com/dwash96/aider-ce/issues/18)
+* [Remove Confirm Responses from History](https://github.com/Aider-AI/aider/pull/3958)
+* [Benchmark Results By Language](https://github.com/dwash96/aider-ce/pull/27)
+* [Allow Benchmarks to Use Repo Map For Better Accuracy](https://github.com/dwash96/aider-ce/pull/25)
+* [Read File Globbing](https://github.com/Aider-AI/aider/pull/3395)
diff --git a/README.md b/README.md
index 3e37fb6f672..2026c3ef340 100644
--- a/README.md
+++ b/README.md
@@ -1,3 +1,34 @@
+### Documentation and Other Notes
+* [Agent Mode](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/config/agent-mode.md)
+* [MCP Configuration](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/config/mcp.md)
+* [Session Management](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/sessions.md)
+* [Aider Original Documentation (still mostly applies)](https://aider.chat/)
+* [Changelog](https://github.com/dwash96/aider-ce/blob/main/CHANGELOG.md)
+* [Discord Community](https://discord.gg/McwdCRuqkJ)
+
+### Installation Instructions
+This project can be installed using several methods:
+
+### Package Installation
+```bash
+pip install aider-ce
+```
+
+or
+
+```bash
+uv pip install aider-ce
+```
+
+The package exports an `aider-ce` command that accepts all of Aider's configuration options.
+
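+For example, any of Aider's usual flags can be passed through (the model name below is illustrative):
+
+```bash
+aider-ce --model gpt-4o --no-auto-commits
+```
+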
+### Tool Installation
+```bash
+uv tool install --python python3.12 aider-ce
+```
+
+Use the tool installation so that aider doesn't interfere with your development environment.
+
## Project Roadmap/Goals
The current priorities are to improve core capabilities and user experience of the Aider project
@@ -35,72 +66,35 @@ The current priorities are to improve core capabilities and user experience of t
* [ ] Add a plugin-like system for allowing agent mode to use user-defined tools in simple python files
* [ ] Add a dynamic tool discovery tool to allow the system to have only the tools it needs in context
-### Documentation and Other Notes
-* [Agent Mode](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/config/agent-mode.md)
-* [MCP Configuration](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/config/mcp.md)
-* [Session Management](https://github.com/dwash96/aider-ce/blob/main/aider/website/docs/sessions.md)
-* [Aider Original Documentation (still mostly applies)](https://aider.chat/)
-* [Discord Community](https://discord.gg/McwdCRuqkJ)
-
-### Installation Instructions
-This project can be installed using several methods:
-
-### Package Installation
-```bash
-pip install aider-ce
-```
-
-or
-
-```bash
-uv pip install aider-ce
-```
-
-The package exports an `aider-ce` command that accepts all of Aider's configuration options
-
-### Tool Installation
-```bash
-uv tool install --python python3.12 aider-ce
-```
-
-Use the tool installation so aider doesn't interfere with your development environment
-
-### Merged PRs
-
-* [MCP: #3937](https://github.com/Aider-AI/aider/pull/3937)
- * [MCP Multi Tool Response](https://github.com/quinlanjager/aider/pull/1)
-* [Navigator Mode: #3781](https://github.com/Aider-AI/aider/pull/3781)
- * [Navigator Mode Large File Count](https://github.com/Aider-AI/aider/commit/b88a7bda649931798209945d9687718316c7427f)
- * [Fix navigator mode auto commit](https://github.com/dwash96/aider-ce/issues/38)
-* [Qwen 3: #4383](https://github.com/Aider-AI/aider/pull/4383)
-* [Fuzzy Search: #4366](https://github.com/Aider-AI/aider/pull/4366)
-* [Map Cache Location Config: #2911](https://github.com/Aider-AI/aider/pull/2911)
-* [Enhanced System Prompts: #3804](https://github.com/Aider-AI/aider/pull/3804)
-* [Repo Map File Name Truncation Fix: #4320](https://github.com/Aider-AI/aider/pull/4320)
-* [Read Only Stub Files For Context Window Management : #3056](https://github.com/Aider-AI/aider/pull/3056)
-
-### Other Updates
-
-* [Added Remote MCP Tool Calls With HTTP Streaming](https://github.com/Aider-AI/aider/commit/a86039f73579df7c32fee910967827c9fccdec0d)
- * [Enforce single tool call at a time](https://github.com/Aider-AI/aider/commit/3346c3e6194096cef64b1899b017bde36a65f794)
- * [Upgraded MCP dep to 1.12.3 for Remote MCP Tool Calls](https://github.com/dwash96/aider-ce/commit/a91ee1c03627a31093364fd2a09e654781b1b879)
- * [Updated base Python version to 3.12 to better support navigator mode (might consider undoing this, if dependency list supports it)](https://github.com/dwash96/aider-ce/commit/9ed416d523c11362a3ba9fc4c02134e0e79d41fc)
-* [Suppress LiteLLM asyncio errors that clutter output](https://github.com/Aider-AI/aider/issues/6)
-* [Updated Docker File Build Process](https://github.com/Aider-AI/aider/commit/cbab01458d0a35c03b30ac2f6347a74fc2b9f662)
- * [Manually install necessary ubuntu dependencies](https://github.com/dwash96/aider-ce/issues/14)
-* [.gitignore updates](https://github.com/dwash96/aider-ce/commit/7c7e803fa63d1acd860eef1423e5a03220df6017)
-* [Experimental Context Compaction For Longer Running Generation Tasks](https://github.com/Aider-AI/aider/issues/6)
-* [Edit Before Adding Files and Reflecting](https://github.com/dwash96/aider-ce/pull/22)
-* [Fix Deepseek model configurations](https://github.com/Aider-AI/aider/commit/c839a6dd8964d702172cae007375e299732d3823)
-* [Relax Version Pinning For Easier Distribution](https://github.com/dwash96/aider-ce/issues/18)
-* [Remove Confirm Responses from History](https://github.com/Aider-AI/aider/pull/3958)
-* [Benchmark Results By Language](https://github.com/dwash96/aider-ce/pull/27)
-* [Allow Benchmarks to Use Repo Map For Better Accuracy](https://github.com/dwash96/aider-ce/pull/25)
-* [Read File Globbing](https://github.com/Aider-AI/aider/pull/3395)
-
### All Contributors (Both Aider Main and Aider-CE)
-
-
-
-
+@paul-gauthier
+@dwash96
+@tekacs
+@ei-grad
+@joshuavial
+@chr15m
+@fry69
+@quinlanjager
+@caseymcc
+@shladnik
+@itlackey
+@tomjuggler
+@vk4s
+@titusz
+@daniel-vainsencher
+@bphd
+@akaihola
+@jalammar
+@schpet
+@iamFIREcracker
+@KennyDizi
+@ivanfioravanti
+@mdeweerd
+@fahmad91
+@itsmeknt
+@cheahjs
+@youknow04
+@pcamp
+@miradnanali
+@o-nix
\ No newline at end of file
diff --git a/aider/__init__.py b/aider/__init__.py
index c7b30bd3c60..aa833be4964 100644
--- a/aider/__init__.py
+++ b/aider/__init__.py
@@ -1,6 +1,6 @@
from packaging import version
-__version__ = "0.88.24.dev"
+__version__ = "0.88.25.dev"
safe_version = __version__
try:
diff --git a/aider/coders/agent_coder.py b/aider/coders/agent_coder.py
index 17ee14ac641..596c6c28f6e 100644
--- a/aider/coders/agent_coder.py
+++ b/aider/coders/agent_coder.py
@@ -1915,7 +1915,7 @@ def get_directory_structure(self):
if line.startswith("??"):
# Extract the filename (remove the '?? ' prefix)
untracked_file = line[3:]
- if not self.repo.git_ignored_file(untracked_file):
+ if not self.repo.ignored_file(untracked_file):
untracked_files.append(untracked_file)
except Exception as e:
self.io.tool_warning(f"Error getting untracked files: {str(e)}")
diff --git a/aider/coders/base_coder.py b/aider/coders/base_coder.py
index 85628b0f6a5..43a316f469b 100755
--- a/aider/coders/base_coder.py
+++ b/aider/coders/base_coder.py
@@ -921,7 +921,7 @@ def _include_in_map(abs_path):
return False
if ".min." in parts[-1]:
return False
- if self.repo.git_ignored_file(abs_path):
+ if self.repo.ignored_file(abs_path):
return False
return True
diff --git a/aider/diffs.py b/aider/diffs.py
index 46266ac6780..709810452a0 100644
--- a/aider/diffs.py
+++ b/aider/diffs.py
@@ -63,17 +63,18 @@ def diff_partial_update(lines_orig, lines_updated, final=False, fname=None):
if last_non_deleted is None:
return ""
- if num_orig_lines:
- pct = last_non_deleted * 100 / num_orig_lines
- else:
- pct = 50
- bar = create_progress_bar(pct)
- bar = f" {last_non_deleted:3d} / {num_orig_lines:3d} lines [{bar}] {pct:3.0f}%\n"
+ # if num_orig_lines:
+ # pct = last_non_deleted * 100 / num_orig_lines
+ # else:
+ # pct = 50
+ # bar = create_progress_bar(pct)
+ # bar = f" {last_non_deleted:3d} / {num_orig_lines:3d} lines [{bar}] {pct:3.0f}%\n"
lines_orig = lines_orig[:last_non_deleted]
if not final:
- lines_updated = lines_updated[:-1] + [bar]
+ # lines_updated = lines_updated[:-1] + [bar]
+ lines_updated = lines_updated[:-1]
diff = difflib.unified_diff(lines_orig, lines_updated, n=5)
@@ -88,14 +89,14 @@ def diff_partial_update(lines_orig, lines_updated, final=False, fname=None):
if backticks not in diff:
break
- show = f"{backticks}diff\n"
+ show = "diff\n"
if fname:
show += f"--- {fname} original\n"
show += f"+++ {fname} updated\n"
show += diff
- show += f"{backticks}\n\n"
+ show += "\n\n"
# print(diff)
diff --git a/aider/models.py b/aider/models.py
index 8f5380136ca..de6ccd0946e 100644
--- a/aider/models.py
+++ b/aider/models.py
@@ -12,7 +12,6 @@
from pathlib import Path
from typing import Optional, Union
-import json5
import yaml
from PIL import Image
@@ -1095,7 +1094,7 @@ def register_litellm_models(model_fnames):
data = Path(model_fname).read_text()
if not data.strip():
continue
- model_def = json5.loads(data)
+ model_def = json.loads(data)
if not model_def:
continue
diff --git a/aider/resources/model-metadata.json b/aider/resources/model-metadata.json
index 2a8c960e45a..199c02f2f52 100644
--- a/aider/resources/model-metadata.json
+++ b/aider/resources/model-metadata.json
@@ -1297,6 +1297,132 @@
"supports_tool_choice": true,
"supports_vision": true
},
+ "azure/eu/gpt-5.1": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/eu/gpt-5.1-chat": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/eu/gpt-5.1-codex": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/eu/gpt-5.1-codex-mini": {
+ "cache_read_input_token_cost": 2.8e-08,
+ "input_cost_per_token": 2.75e-07,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 2.2e-06,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
"azure/eu/gpt-5-nano-2025-08-07": {
"cache_read_input_token_cost": 5.5e-09,
"input_cost_per_token": 5.5e-08,
@@ -1471,6 +1597,132 @@
"supports_tool_choice": true,
"supports_vision": true
},
+ "azure/global/gpt-5.1": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/global/gpt-5.1-chat": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/global/gpt-5.1-codex": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 1e-05,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/global/gpt-5.1-codex-mini": {
+ "cache_read_input_token_cost": 2.5e-08,
+ "input_cost_per_token": 2.5e-07,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 2e-06,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
"azure/gpt-3.5-turbo": {
"input_cost_per_token": 5e-07,
"litellm_provider": "azure",
@@ -2672,7 +2924,133 @@
"image"
],
"supported_output_modalities": [
- "text"
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/gpt-5-nano": {
+ "cache_read_input_token_cost": 5e-09,
+ "input_cost_per_token": 5e-08,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 4e-07,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/gpt-5-nano-2025-08-07": {
+ "cache_read_input_token_cost": 5e-09,
+ "input_cost_per_token": 5e-08,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 4e-07,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/gpt-5-pro": {
+ "input_cost_per_token": 1.5e-05,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 400000,
+ "mode": "responses",
+ "output_cost_per_token": 0.00012,
+ "source": "https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/concepts/models-sold-directly-by-azure?pivots=azure-openai&tabs=global-standard-aoai%2Cstandard-chat-completions%2Cglobal-standard#gpt-5",
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/gpt-5.1": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
],
"supports_function_calling": true,
"supports_native_streaming": true,
@@ -2685,15 +3063,15 @@
"supports_tool_choice": true,
"supports_vision": true
},
- "azure/gpt-5-nano": {
- "cache_read_input_token_cost": 5e-09,
- "input_cost_per_token": 5e-08,
+ "azure/gpt-5.1-chat": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
"litellm_provider": "azure",
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
"mode": "chat",
- "output_cost_per_token": 4e-07,
+ "output_cost_per_token": 1e-05,
"supported_endpoints": [
"/v1/chat/completions",
"/v1/batch",
@@ -2704,7 +3082,8 @@
"image"
],
"supported_output_modalities": [
- "text"
+ "text",
+ "image"
],
"supports_function_calling": true,
"supports_native_streaming": true,
@@ -2717,18 +3096,16 @@
"supports_tool_choice": true,
"supports_vision": true
},
- "azure/gpt-5-nano-2025-08-07": {
- "cache_read_input_token_cost": 5e-09,
- "input_cost_per_token": 5e-08,
+ "azure/gpt-5.1-codex": {
+ "cache_read_input_token_cost": 1.25e-07,
+ "input_cost_per_token": 1.25e-06,
"litellm_provider": "azure",
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "chat",
- "output_cost_per_token": 4e-07,
+ "mode": "responses",
+ "output_cost_per_token": 1e-05,
"supported_endpoints": [
- "/v1/chat/completions",
- "/v1/batch",
"/v1/responses"
],
"supported_modalities": [
@@ -2745,19 +3122,19 @@
"supports_prompt_caching": true,
"supports_reasoning": true,
"supports_response_schema": true,
- "supports_system_messages": true,
+ "supports_system_messages": false,
"supports_tool_choice": true,
"supports_vision": true
},
- "azure/gpt-5-pro": {
- "input_cost_per_token": 1.5e-05,
+ "azure/gpt-5.1-codex-mini": {
+ "cache_read_input_token_cost": 2.5e-08,
+ "input_cost_per_token": 2.5e-07,
"litellm_provider": "azure",
"max_input_tokens": 272000,
"max_output_tokens": 128000,
- "max_tokens": 400000,
+ "max_tokens": 128000,
"mode": "responses",
- "output_cost_per_token": 0.00012,
- "source": "https://learn.microsoft.com/en-us/azure/ai-foundry/foundry-models/concepts/models-sold-directly-by-azure?pivots=azure-openai&tabs=global-standard-aoai%2Cstandard-chat-completions%2Cglobal-standard#gpt-5",
+ "output_cost_per_token": 2e-06,
"supported_endpoints": [
"/v1/responses"
],
@@ -2769,12 +3146,13 @@
"text"
],
"supports_function_calling": true,
+ "supports_native_streaming": true,
"supports_parallel_function_calling": true,
"supports_pdf_input": true,
"supports_prompt_caching": true,
"supports_reasoning": true,
"supports_response_schema": true,
- "supports_system_messages": true,
+ "supports_system_messages": false,
"supports_tool_choice": true,
"supports_vision": true
},
@@ -3695,6 +4073,132 @@
"supports_tool_choice": true,
"supports_vision": true
},
+ "azure/us/gpt-5.1": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/us/gpt-5.1-chat": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "chat",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/batch",
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text",
+ "image"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/us/gpt-5.1-codex": {
+ "cache_read_input_token_cost": 1.4e-07,
+ "input_cost_per_token": 1.38e-06,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 1.1e-05,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
+ "azure/us/gpt-5.1-codex-mini": {
+ "cache_read_input_token_cost": 2.8e-08,
+ "input_cost_per_token": 2.75e-07,
+ "litellm_provider": "azure",
+ "max_input_tokens": 272000,
+ "max_output_tokens": 128000,
+ "max_tokens": 128000,
+ "mode": "responses",
+ "output_cost_per_token": 2.2e-06,
+ "supported_endpoints": [
+ "/v1/responses"
+ ],
+ "supported_modalities": [
+ "text",
+ "image"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_function_calling": true,
+ "supports_native_streaming": true,
+ "supports_parallel_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": false,
+ "supports_tool_choice": true,
+ "supports_vision": true
+ },
"azure/us/o1-2024-12-17": {
"cache_read_input_token_cost": 8.25e-06,
"input_cost_per_token": 1.65e-05,
@@ -11281,10 +11785,12 @@
"supports_web_search": true
},
"gemini-3-pro-preview": {
- "cache_read_input_token_cost": 1.25e-07,
+ "cache_read_input_token_cost": 2e-07,
+ "cache_read_input_token_cost_above_200k_tokens": 4e-07,
"cache_creation_input_token_cost_above_200k_tokens": 2.5e-07,
"input_cost_per_token": 2e-06,
"input_cost_per_token_above_200k_tokens": 4e-06,
+ "input_cost_per_token_batches": 1e-06,
"litellm_provider": "vertex_ai-language-models",
"max_audio_length_hours": 8.4,
"max_audio_per_prompt": 1,
@@ -11298,10 +11804,60 @@
"mode": "chat",
"output_cost_per_token": 1.2e-05,
"output_cost_per_token_above_200k_tokens": 1.8e-05,
+ "output_cost_per_token_batches": 6e-06,
"source": "https://cloud.google.com/vertex-ai/generative-ai/pricing",
"supported_endpoints": [
"/v1/chat/completions",
- "/v1/completions"
+ "/v1/completions",
+ "/v1/batch"
+ ],
+ "supported_modalities": [
+ "text",
+ "image",
+ "audio",
+ "video"
+ ],
+ "supported_output_modalities": [
+ "text"
+ ],
+ "supports_audio_input": true,
+ "supports_function_calling": true,
+ "supports_pdf_input": true,
+ "supports_prompt_caching": true,
+ "supports_reasoning": true,
+ "supports_response_schema": true,
+ "supports_system_messages": true,
+ "supports_tool_choice": true,
+ "supports_video_input": true,
+ "supports_vision": true,
+ "supports_web_search": true
+ },
+ "vertex_ai/gemini-3-pro-preview": {
+ "cache_read_input_token_cost": 2e-07,
+ "cache_read_input_token_cost_above_200k_tokens": 4e-07,
+ "cache_creation_input_token_cost_above_200k_tokens": 2.5e-07,
+ "input_cost_per_token": 2e-06,
+ "input_cost_per_token_above_200k_tokens": 4e-06,
+ "input_cost_per_token_batches": 1e-06,
+ "litellm_provider": "vertex_ai",
+ "max_audio_length_hours": 8.4,
+ "max_audio_per_prompt": 1,
+ "max_images_per_prompt": 3000,
+ "max_input_tokens": 1048576,
+ "max_output_tokens": 65535,
+ "max_pdf_size_mb": 30,
+ "max_tokens": 65535,
+ "max_video_length": 1,
+ "max_videos_per_prompt": 10,
+ "mode": "chat",
+ "output_cost_per_token": 1.2e-05,
+ "output_cost_per_token_above_200k_tokens": 1.8e-05,
+ "output_cost_per_token_batches": 6e-06,
+ "source": "https://cloud.google.com/vertex-ai/generative-ai/pricing",
+ "supported_endpoints": [
+ "/v1/chat/completions",
+ "/v1/completions",
+ "/v1/batch"
],
"supported_modalities": [
"text",
@@ -12984,9 +13540,11 @@
"tpm": 800000
},
"gemini/gemini-3-pro-preview": {
- "cache_read_input_token_cost": 3.125e-07,
+ "cache_read_input_token_cost": 2e-07,
+ "cache_read_input_token_cost_above_200k_tokens": 4e-07,
"input_cost_per_token": 2e-06,
"input_cost_per_token_above_200k_tokens": 4e-06,
+ "input_cost_per_token_batches": 1e-06,
"litellm_provider": "gemini",
"max_audio_length_hours": 8.4,
"max_audio_per_prompt": 1,
@@ -13000,11 +13558,13 @@
"mode": "chat",
"output_cost_per_token": 1.2e-05,
"output_cost_per_token_above_200k_tokens": 1.8e-05,
+ "output_cost_per_token_batches": 6e-06,
"rpm": 2000,
"source": "https://cloud.google.com/vertex-ai/generative-ai/pricing",
"supported_endpoints": [
"/v1/chat/completions",
- "/v1/completions"
+ "/v1/completions",
+ "/v1/batch"
],
"supported_modalities": [
"text",
@@ -14635,7 +15195,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"output_cost_per_token_flex": 5e-06,
"output_cost_per_token_priority": 2e-05,
@@ -14672,7 +15232,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"output_cost_per_token_priority": 2e-05,
"supported_endpoints": [
@@ -14708,7 +15268,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"output_cost_per_token_priority": 2e-05,
"supported_endpoints": [
@@ -14744,7 +15304,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 16384,
"max_tokens": 16384,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"output_cost_per_token_priority": 2e-05,
"supported_endpoints": [
@@ -14847,7 +15407,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"output_cost_per_token_flex": 5e-06,
"output_cost_per_token_priority": 2e-05,
@@ -14882,7 +15442,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"supported_endpoints": [
"/v1/chat/completions",
@@ -14914,7 +15474,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 16384,
"max_tokens": 16384,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1e-05,
"supported_endpoints": [
"/v1/chat/completions",
@@ -15046,7 +15606,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 2e-06,
"output_cost_per_token_flex": 1e-06,
"output_cost_per_token_priority": 3.6e-06,
@@ -15085,7 +15645,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 2e-06,
"output_cost_per_token_flex": 1e-06,
"output_cost_per_token_priority": 3.6e-06,
@@ -15123,7 +15683,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 4e-07,
"output_cost_per_token_flex": 2e-07,
"supported_endpoints": [
@@ -15158,7 +15718,7 @@
"max_input_tokens": 272000,
"max_output_tokens": 128000,
"max_tokens": 128000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 4e-07,
"output_cost_per_token_flex": 2e-07,
"supported_endpoints": [
@@ -18011,7 +18571,7 @@
"max_input_tokens": 200000,
"max_output_tokens": 100000,
"max_tokens": 100000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 6e-05,
"supports_function_calling": true,
"supports_parallel_function_calling": true,
@@ -18030,7 +18590,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 65536,
"max_tokens": 65536,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 4.4e-06,
"supports_pdf_input": true,
"supports_prompt_caching": true,
@@ -18044,7 +18604,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 65536,
"max_tokens": 65536,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 1.2e-05,
"supports_pdf_input": true,
"supports_prompt_caching": true,
@@ -18058,7 +18618,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 32768,
"max_tokens": 32768,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 6e-05,
"supports_pdf_input": true,
"supports_prompt_caching": true,
@@ -18072,7 +18632,7 @@
"max_input_tokens": 128000,
"max_output_tokens": 32768,
"max_tokens": 32768,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 6e-05,
"supports_pdf_input": true,
"supports_prompt_caching": true,
@@ -18188,7 +18748,7 @@
"max_input_tokens": 200000,
"max_output_tokens": 100000,
"max_tokens": 100000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 8e-06,
"supported_endpoints": [
"/v1/responses",
@@ -18286,7 +18846,7 @@
"max_input_tokens": 200000,
"max_output_tokens": 100000,
"max_tokens": 100000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 4.4e-06,
"supports_function_calling": true,
"supports_parallel_function_calling": false,
@@ -18303,7 +18863,7 @@
"max_input_tokens": 200000,
"max_output_tokens": 100000,
"max_tokens": 100000,
- "mode": "responses",
+ "mode": "chat",
"output_cost_per_token": 4.4e-06,
"supports_function_calling": true,
"supports_parallel_function_calling": false,
@@ -23942,6 +24502,12 @@
"supports_reasoning": true,
"supports_tool_choice": true
},
+ "vertex_ai/gemini-2.5-flash-image": {
+ "litellm_provider": "vertex_ai-language-models",
+ "mode": "image_generation",
+ "output_cost_per_image": 0.039,
+ "source": "https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/image-generation#edit-an-image"
+ },
"vertex_ai/imagegeneration@006": {
"litellm_provider": "vertex_ai-image-models",
"mode": "image_generation",
@@ -23966,6 +24532,12 @@
"output_cost_per_image": 0.04,
"source": "https://cloud.google.com/vertex-ai/generative-ai/pricing"
},
+ "vertex_ai/imagen-3.0-capability-001": {
+ "litellm_provider": "vertex_ai-image-models",
+ "mode": "image_generation",
+ "output_cost_per_image": 0.04,
+ "source": "https://cloud.google.com/vertex-ai/generative-ai/docs/image/edit-insert-objects"
+ },
"vertex_ai/imagen-4.0-fast-generate-001": {
"litellm_provider": "vertex_ai-image-models",
"mode": "image_generation",
diff --git a/aider/website/assets/sample-analytics.jsonl b/aider/website/assets/sample-analytics.jsonl
index 1de228a2ff6..22fe1a3f9f2 100644
--- a/aider/website/assets/sample-analytics.jsonl
+++ b/aider/website/assets/sample-analytics.jsonl
@@ -1,95 +1,3 @@
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574105}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574105}
-{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574105}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574115}
-{"event": "gui session", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574115}
-{"event": "exit", "properties": {"reason": "GUI session ended"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754574115}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590866}
-{"event": "model warning", "properties": {"main_model": "openai/REDACTED", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "openai/REDACTED"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590867}
-{"event": "exit", "properties": {"reason": "Keyboard interrupt during model warnings"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590870}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590875}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590875}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590875}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590875}
-{"event": "exit", "properties": {"reason": "Completed --message"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590877}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590946}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590946}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590946}
-{"event": "cli session", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "diff-fenced"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590946}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590949}
-{"event": "message_send", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "diff-fenced", "prompt_tokens": 18114, "completion_tokens": 162, "total_tokens": 18276, "cost": 0.024262500000000003, "total_cost": 0.024262500000000003}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754590997}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591023}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591043}
-{"event": "command_undo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591057}
-{"event": "command_clear", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591059}
-{"event": "command_edit", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591065}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591098}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591132}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591133}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591133}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591133}
-{"event": "exit", "properties": {"reason": "Completed --message"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591134}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591138}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591138}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591138}
-{"event": "cli session", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "diff-fenced"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591138}
-{"event": "command_add", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591140}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591152}
-{"event": "message_send", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "diff-fenced", "prompt_tokens": 18172, "completion_tokens": 134, "total_tokens": 18306, "cost": 0.024055000000000003, "total_cost": 0.048317500000000006}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591160}
-{"event": "exit", "properties": {"reason": "Completed main CLI coder.run"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591177}
-{"event": "message_send", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "diff-fenced", "prompt_tokens": 16478, "completion_tokens": 218, "total_tokens": 16696, "cost": 0.022777500000000003, "total_cost": 0.022777500000000003}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591178}
-{"event": "command_reset", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591254}
-{"event": "command_add", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591256}
-{"event": "command_ask", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591262}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591266}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591324}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591327}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591327}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591328}
-{"event": "message_send", "properties": {"main_model": "gemini/gemini-2.5-pro", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gemini/gemini-2.5-pro", "edit_format": "ask", "prompt_tokens": 6356, "completion_tokens": 143, "total_tokens": 6499, "cost": 0.009375000000000001, "total_cost": 0.0321525}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591336}
-{"event": "message_send", "properties": {"main_model": "openai/REDACTED", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "openai/REDACTED", "edit_format": "whole", "prompt_tokens": 1919, "completion_tokens": 51, "total_tokens": 1970, "cost": 0, "total_cost": 0.0}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591337}
-{"event": "exit", "properties": {"reason": "Completed --message"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591337}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591351}
-{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591352}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591352}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591352}
-{"event": "exit", "properties": {"reason": "Completed main CLI coder.run"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591358}
-{"event": "message_send", "properties": {"main_model": "o3", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "gpt-4.1", "edit_format": "diff", "prompt_tokens": 3372, "completion_tokens": 543, "total_tokens": 3915, "cost": 0.011088, "total_cost": 0.011088}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591367}
-{"event": "exit", "properties": {"reason": "Completed --message"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591367}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591422}
-{"event": "model warning", "properties": {"main_model": "openai/REDACTED", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "openai/REDACTED"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591423}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591430}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591430}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591430}
-{"event": "exit", "properties": {"reason": "Completed --message"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591432}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591443}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591443}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591443}
-{"event": "cli session", "properties": {"main_model": "openai/REDACTED", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "openai/REDACTED", "edit_format": "whole"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591443}
-{"event": "exit", "properties": {"reason": "Completed main CLI coder.run"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591460}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591464}
-{"event": "repo", "properties": {"num_files": 630}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591465}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591465}
-{"event": "cli session", "properties": {"main_model": "None", "weak_model": "gemini/gemini-2.5-flash", "editor_model": "None", "edit_format": "whole"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591465}
-{"event": "message_send_starting", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591467}
-{"event": "exit", "properties": {"reason": "Completed main CLI coder.run"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591500}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591509}
-{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
-{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1754591513}
@@ -998,3 +906,95 @@
{"event": "repo", "properties": {"num_files": 633}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1759167000}
{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1759167000}
{"event": "exit", "properties": {"reason": "Completed lint/test/commit"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1759167001}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666075}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666076}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666077}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666078}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "no-repo", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666079}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666108}
+{"event": "repo", "properties": {"num_files": 635}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "exit", "properties": {"reason": "Exit flag set"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "model warning", "properties": {"main_model": "None", "weak_model": "None", "editor_model": "None"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "repo", "properties": {"num_files": 635}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "auto_commits", "properties": {"enabled": true}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "exit", "properties": {"reason": "Unknown edit format"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666109}
+{"event": "launched", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666132}
+{"event": "gui session", "properties": {}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666132}
+{"event": "exit", "properties": {"reason": "GUI session ended"}, "user_id": "c42c4e6b-f054-44d7-ae1f-6726cc41da88", "time": 1763666132}
diff --git a/aider/website/docs/config/adv-model-settings.md b/aider/website/docs/config/adv-model-settings.md
index 591e207b6c1..6baa137a5d9 100644
--- a/aider/website/docs/config/adv-model-settings.md
+++ b/aider/website/docs/config/adv-model-settings.md
@@ -378,6 +378,50 @@ cog.out("```\n")
accepts_settings:
- reasoning_effort
+- name: azure/gpt-5-pro
+ edit_format: diff
+ weak_model_name: azure/gpt-5-mini
+ use_repo_map: true
+ examples_as_sys_msg: true
+ streaming: false
+ editor_model_name: azure/gpt-5
+ editor_edit_format: editor-diff
+ system_prompt_prefix: 'Formatting re-enabled. '
+ accepts_settings:
+ - reasoning_effort
+
+- name: azure/gpt-5.1
+ edit_format: diff
+ weak_model_name: azure/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: azure/gpt-5.1-2025-11-13
+ edit_format: diff
+ weak_model_name: azure/gpt-5-nano-2025-08-07
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: azure/gpt-5.1-chat
+ edit_format: diff
+ weak_model_name: azure/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: azure/gpt-5.1-chat-latest
+ edit_format: diff
+ weak_model_name: azure/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
- name: azure/o1
edit_format: diff
weak_model_name: azure/gpt-4o-mini
@@ -553,6 +597,20 @@ cog.out("```\n")
accepts_settings:
- thinking_tokens
+- name: bedrock/global.anthropic.claude-sonnet-4-5-20250929-v1:0
+ edit_format: diff
+ weak_model_name: bedrock/anthropic.claude-3-5-haiku-20241022-v1:0
+ use_repo_map: true
+ extra_params:
+ extra_headers:
+ anthropic-beta: prompt-caching-2024-07-31,pdfs-2024-09-25,output-128k-2025-02-19
+ max_tokens: 64000
+ cache_control: true
+ editor_model_name: bedrock/global.anthropic.claude-sonnet-4-5-20250929-v1:0
+ editor_edit_format: editor-diff
+ accepts_settings:
+ - thinking_tokens
+
- name: bedrock/us.anthropic.claude-3-7-sonnet-20250219-v1:0
edit_format: diff
weak_model_name: bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0
@@ -1027,6 +1085,15 @@ cog.out("```\n")
accepts_settings:
- thinking_tokens
+- name: gemini/gemini-3-pro-preview
+ edit_format: diff-fenced
+ weak_model_name: gemini/gemini-2.5-flash
+ use_repo_map: true
+ overeager: true
+ use_temperature: false
+ accepts_settings:
+ - thinking_tokens
+
- name: gemini/gemini-exp-1206
edit_format: diff
use_repo_map: true
@@ -1193,6 +1260,14 @@ cog.out("```\n")
accepts_settings:
- reasoning_effort
+- name: gpt-5-codex
+ edit_format: diff
+ weak_model_name: gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
- name: gpt-5-mini
edit_format: diff
weak_model_name: gpt-5-nano
@@ -1225,6 +1300,59 @@ cog.out("```\n")
accepts_settings:
- reasoning_effort
+- name: gpt-5-pro
+ edit_format: diff
+ weak_model_name: gpt-5-mini
+ use_repo_map: true
+ examples_as_sys_msg: true
+ streaming: false
+ editor_model_name: gpt-5
+ editor_edit_format: editor-diff
+ system_prompt_prefix: 'Formatting re-enabled. '
+ accepts_settings:
+ - reasoning_effort
+
+- name: gpt-5.1
+ edit_format: diff
+ weak_model_name: gpt-5-nano
+ use_repo_map: true
+ overeager: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: gpt-5.1-2025-11-13
+ edit_format: diff
+ weak_model_name: gpt-5-nano-2025-08-07
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: gpt-5.1-chat
+ edit_format: diff
+ weak_model_name: gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: gpt-5.1-chat-latest
+ edit_format: diff
+ weak_model_name: gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: gpt-5.1-codex
+ edit_format: diff
+ weak_model_name: gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
- name: groq/llama3-70b-8192
edit_format: diff
weak_model_name: groq/llama3-8b-8192
@@ -1434,6 +1562,50 @@ cog.out("```\n")
accepts_settings:
- reasoning_effort
+- name: openai/gpt-5-pro
+ edit_format: diff
+ weak_model_name: openai/gpt-5-mini
+ use_repo_map: true
+ examples_as_sys_msg: true
+ streaming: false
+ editor_model_name: openai/gpt-5
+ editor_edit_format: editor-diff
+ system_prompt_prefix: 'Formatting re-enabled. '
+ accepts_settings:
+ - reasoning_effort
+
+- name: openai/gpt-5.1
+ edit_format: diff
+ weak_model_name: openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openai/gpt-5.1-2025-11-13
+ edit_format: diff
+ weak_model_name: openai/gpt-5-nano-2025-08-07
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openai/gpt-5.1-chat
+ edit_format: diff
+ weak_model_name: openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openai/gpt-5.1-chat-latest
+ edit_format: diff
+ weak_model_name: openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
- name: openai/o1
edit_format: diff
weak_model_name: openai/gpt-4o-mini
@@ -1758,6 +1930,14 @@ cog.out("```\n")
accepts_settings:
- thinking_tokens
+- name: openrouter/google/gemini-3-pro-preview
+ edit_format: diff-fenced
+ weak_model_name: openrouter/google/gemini-2.5-flash
+ use_repo_map: true
+ overeager: true
+ accepts_settings:
+ - thinking_tokens
+
- name: openrouter/google/gemma-3-27b-it
use_system_prompt: false
@@ -1861,6 +2041,50 @@ cog.out("```\n")
accepts_settings:
- reasoning_effort
+- name: openrouter/openai/gpt-5-pro
+ edit_format: diff
+ weak_model_name: openrouter/openai/gpt-5-mini
+ use_repo_map: true
+ examples_as_sys_msg: true
+ streaming: false
+ editor_model_name: openrouter/openai/gpt-5
+ editor_edit_format: editor-diff
+ system_prompt_prefix: 'Formatting re-enabled. '
+ accepts_settings:
+ - reasoning_effort
+
+- name: openrouter/openai/gpt-5.1
+ edit_format: diff
+ weak_model_name: openrouter/openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openrouter/openai/gpt-5.1-2025-11-13
+ edit_format: diff
+ weak_model_name: openrouter/openai/gpt-5-nano-2025-08-07
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openrouter/openai/gpt-5.1-chat
+ edit_format: diff
+ weak_model_name: openrouter/openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
+- name: openrouter/openai/gpt-5.1-chat-latest
+ edit_format: diff
+ weak_model_name: openrouter/openai/gpt-5-nano
+ use_repo_map: true
+ use_temperature: false
+ accepts_settings:
+ - reasoning_effort
+
- name: openrouter/openai/o1
edit_format: diff
weak_model_name: openrouter/openai/gpt-4o-mini
@@ -2232,6 +2456,15 @@ cog.out("```\n")
accepts_settings:
- thinking_tokens
+- name: vertex_ai/gemini-3-pro-preview
+ edit_format: diff-fenced
+ weak_model_name: vertex_ai/gemini-2.5-flash
+ use_repo_map: true
+ overeager: true
+ editor_model_name: vertex_ai/gemini-2.5-flash
+ accepts_settings:
+ - thinking_tokens
+
- name: xai/grok-3-beta
edit_format: diff
use_repo_map: true
diff --git a/aider/website/docs/config/model-aliases.md b/aider/website/docs/config/model-aliases.md
index 7dbc824fed1..c27b34da002 100644
--- a/aider/website/docs/config/model-aliases.md
+++ b/aider/website/docs/config/model-aliases.md
@@ -81,8 +81,9 @@ for alias, model in sorted(MODEL_ALIASES.items()):
- `deepseek`: deepseek/deepseek-chat
- `flash`: gemini/gemini-2.5-flash
- `flash-lite`: gemini/gemini-2.5-flash-lite
-- `gemini`: gemini/gemini-2.5-pro
+- `gemini`: gemini/gemini-3-pro-preview
- `gemini-2.5-pro`: gemini/gemini-2.5-pro
+- `gemini-3-pro-preview`: gemini/gemini-3-pro-preview
- `gemini-exp`: gemini/gemini-2.5-pro-exp-03-25
- `grok3`: xai/grok-3-beta
- `haiku`: claude-3-5-haiku-20241022
diff --git a/aider/website/docs/faq.md b/aider/website/docs/faq.md
index d861e76591d..73970e981ed 100644
--- a/aider/website/docs/faq.md
+++ b/aider/website/docs/faq.md
@@ -264,19 +264,13 @@ tr:hover { background-color: #f5f5f5; }
| Model Name | Total Tokens | Percent |
|---|---|---|
-| gemini/gemini-2.5-pro | 281,824 | 38.5% |
-| gpt-5 | 211,072 | 28.9% |
-| None | 168,988 | 23.1% |
-| o3-pro | 36,620 | 5.0% |
-| gemini/gemini-2.5-flash-lite | 15,470 | 2.1% |
-| gemini/gemini-2.5-flash-lite-preview-06-17 | 11,371 | 1.6% |
-| o3 | 3,915 | 0.5% |
-| openai/REDACTED | 1,970 | 0.3% |
+| gemini/gemini-2.5-pro | 222,047 | 33.4% |
+| gpt-5 | 211,072 | 31.7% |
+| None | 168,988 | 25.4% |
+| o3-pro | 36,620 | 5.5% |
+| gemini/gemini-2.5-flash-lite | 15,470 | 2.3% |
+| gemini/gemini-2.5-flash-lite-preview-06-17 | 11,371 | 1.7% |
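As context for the `adv-model-settings.md` additions above: these generated entries follow the same schema that users can override locally via a `.aider.model.settings.yml` file, as described in that doc. A minimal sketch of such an override (the model name and values here are illustrative, not part of this change):

```yaml
# .aider.model.settings.yml — per-model overrides merged on top of aider's defaults.
# Only the keys you specify are overridden; unspecified keys keep their defaults.
- name: openai/gpt-5.1
  edit_format: diff            # which edit format aider asks the model to emit
  weak_model_name: openai/gpt-5-nano  # cheaper model used for commit messages etc.
  use_repo_map: true           # include the repo map in the chat context
  use_temperature: false       # omit the temperature param (reasoning models reject it)
  accepts_settings:
    - reasoning_effort         # allow --reasoning-effort to be passed through
```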