
fix(acp): don't fail session creation when model listing is unavailable#7484

Merged
katzdave merged 1 commit into block:main from idosavion:idosavion/goose-model-validation
Feb 26, 2026

Conversation

@idosavion (Contributor) commented Feb 24, 2026

Summary

Fixes #7427

build_model_state treats model listing as a hard gate on session creation — if the GET to the models endpoint fails, on_new_session returns an error and the session never starts. This breaks proxy-based and deployment-specific providers that only implement the chat completions endpoint (no /models route).

Now build_model_state catches the error, logs a warning, and returns an empty model list. The session starts normally; model selection in the UI just won't be available. This matches providers like Azure and xAI that already return empty model lists through the default fetch_supported_models impl.
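
The pattern described above can be sketched as follows. The names `build_model_state`, `SessionModelState`, and `fetch_recommended_models` come from the PR itself, but the signatures, fields, and error type here are illustrative stand-ins, not the actual goose API (the real code logs through `tracing`, not stderr):

```rust
// Illustrative model state returned to the session; field names are assumptions.
#[derive(Debug)]
struct SessionModelState {
    current_model: String,
    available_models: Vec<String>,
}

// Stand-in for the provider call; the real signature differs. The `fail` flag
// simulates a provider with no GET /models route (e.g. a 404 from a proxy).
fn fetch_recommended_models(fail: bool) -> Result<Vec<String>, String> {
    if fail {
        Err("Request failed: Resource not found (404)".to_string())
    } else {
        Ok(vec!["model-a".to_string(), "model-b".to_string()])
    }
}

// Before the fix this was `let models = fetch_recommended_models()?;` inside a
// function returning Result, making the fetch a hard gate on session creation.
// Now the error is caught, a warning is logged, and the list falls back to empty.
fn build_model_state(current_model: &str, fail: bool) -> SessionModelState {
    let available_models = match fetch_recommended_models(fail) {
        Ok(models) => models,
        Err(e) => {
            eprintln!("WARN: failed to fetch models, model selection will be unavailable: {e}");
            Vec::new()
        }
    };
    SessionModelState {
        current_model: current_model.to_string(),
        available_models,
    }
}

fn main() {
    // Session creation proceeds either way; only the model switcher is empty.
    let state = build_model_state("my-deployment-model", true);
    println!("{} ({} selectable)", state.current_model, state.available_models.len());
}
```

The key design point is the return-type change: the function no longer returns a `Result`, so callers cannot accidentally reinstate the hard gate with `?`.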

Type of Change

  • Bug fix
  • Tests

AI Assistance

  • This PR was created or reviewed with AI assistance

Testing

  • test_build_model_state::fetch_error_falls_back_to_current_model_only — previously asserted Err(_), now asserts the function returns the current model with an empty available-models list
  • cargo fmt, cargo clippy -D warnings clean

Related Issues

Follow-up to #7112 (model selection support). That PR added build_model_state but made it fatal, so any provider without a models endpoint can't start sessions.

Copilot AI review requested due to automatic review settings February 24, 2026 18:19
Copilot AI left a comment

Pull request overview

This PR fixes a bug where ACP session creation fails silently when using proxy-based providers that don't implement the OpenAI-compatible /models endpoint. The fix changes build_model_state to gracefully degrade by logging a warning and returning an empty model list instead of propagating the error, allowing sessions to start normally with model selection unavailable.

Changes:

  • Modified build_model_state to handle fetch_recommended_models errors gracefully instead of propagating them
  • Updated function signature to return SessionModelState directly instead of Result<SessionModelState, sacp::Error>
  • Updated test case from "fetch error propagates" to "fetch error falls back to current model only" with correct assertions

@idosavion idosavion force-pushed the idosavion/goose-model-validation branch from f09689a to 0374d72 Compare February 24, 2026 18:39
@idosavion idosavion marked this pull request as draft February 24, 2026 18:40
@idosavion idosavion marked this pull request as ready for review February 24, 2026 18:54
Copilot AI review requested due to automatic review settings February 24, 2026 18:54
Copilot AI left a comment

Pull request overview

Copilot reviewed 1 out of 1 changed files in this pull request and generated no new comments.

Proxy-based providers that only implement the chat completions POST
endpoint (and not a GET /models endpoint) caused session creation to
fail because build_model_state propagated the fetch error via `?`.

Model listing is optional UI metadata for the model-switcher. It should
not be a hard gate on session creation. This aligns with providers like
Azure and xAI that already return empty model lists successfully.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Signed-off-by: Ido Savion <ido@diversion.dev>
@idosavion idosavion force-pushed the idosavion/goose-model-validation branch from 0374d72 to 91f354e Compare February 24, 2026 19:03
@idosavion (Contributor, Author)

For sanity, I built the custom binary and verified that the warning is emitted as expected and the session starts:

{
  "level": "WARN",
  "message": "failed to fetch models, model selection will be unavailable",
  "error": "Request failed: Resource not found (404): ",
  "target": "goose_acp::server"
}

@idosavion (Contributor, Author)

@katzdave - thanks for the quick review!
Anything else I need to do in order to get this merged and deployed as part of the next patch/minor version?

@katzdave (Collaborator)

@katzdave - thanks for the quick review! Anything else I need to do in order to get this merged and deployed as part of the next patch/minor version?

Nope sorry let's merge!

@katzdave katzdave added this pull request to the merge queue Feb 26, 2026
Merged via the queue into block:main with commit b7bb6b4 Feb 26, 2026
20 checks passed


Development

Successfully merging this pull request may close these issues.

Cannot use Azure OpenAI deployment endpoints (no /models endpoint)

3 participants