[WIP] Fix failing GitHub Actions workflow test #15573
Conversation
Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Pull request overview
This PR fixes the failing test suite by correcting the LLM gateway support flags for the Codex and Claude engines. The tests expect Codex to support the LLM gateway while Claude does not, but the implementation had these flags reversed.
Changes:
- Codex engine now correctly reports LLM gateway support (`supportsLLMGateway: true`), enabling api-proxy integration
- Claude engine now correctly reports no LLM gateway support (`supportsLLMGateway: false`), skipping api-proxy setup
- All related tests updated to match the corrected engine capabilities
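The corrected flag assignments can be sketched as follows. This is a minimal stand-in, not the actual `pkg/workflow` code: the `engineCapabilities` struct and `capsFor` helper are assumptions modelled on the field names visible in the diff excerpts.

```go
package main

import "fmt"

// engineCapabilities is a hypothetical stand-in for the capability flags
// set in pkg/workflow/codex_engine.go and pkg/workflow/claude_engine.go.
type engineCapabilities struct {
	supportsFirewall   bool
	supportsLLMGateway bool
}

// capsFor returns the corrected flags after this PR: Codex routes LLM
// traffic through the api-proxy gateway; Claude does not.
func capsFor(engine string) engineCapabilities {
	switch engine {
	case "codex":
		return engineCapabilities{supportsFirewall: true, supportsLLMGateway: true}
	case "claude":
		return engineCapabilities{supportsFirewall: true, supportsLLMGateway: false}
	}
	return engineCapabilities{}
}

func main() {
	fmt.Println("codex gateway:", capsFor("codex").supportsLLMGateway)
	fmt.Println("claude gateway:", capsFor("claude").supportsLLMGateway)
}
```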
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| pkg/workflow/codex_engine.go | Sets supportsLLMGateway: true to enable api-proxy sidecar for LLM API proxying through AWF firewall |
| pkg/workflow/claude_engine.go | Sets supportsLLMGateway: false and aligns comment formatting |
| pkg/workflow/enable_api_proxy_test.go | Updates test expectations: Claude should NOT include --enable-api-proxy flag, Codex should include it |
| pkg/workflow/docker_api_proxy_test.go | Updates test expectations: Claude should NOT include api-proxy image, Codex should include it |
    supportsWebFetch:   true,  // Claude has built-in WebFetch support
    supportsWebSearch:  true,  // Claude has built-in WebSearch support
    supportsFirewall:   true,  // Claude supports network firewalling via AWF
    supportsLLMGateway: false, // Claude does not support LLM gateway
The comment in pkg/workflow/docker.go (lines 109-111) describes the api-proxy container as supporting both "Port 10000: OpenAI API proxy (for Codex)" and "Port 10001: Anthropic API proxy (for Claude)". However, with this change, only Codex will use the LLM gateway (supportsLLMGateway: true), while Claude will not (supportsLLMGateway: false). This means the api-proxy container will only be deployed when using Codex, making the Port 10001 reference potentially misleading. Consider either: (1) updating the comment to clarify that Port 10001 is for future use, (2) updating the comment to remove the Claude reference, or (3) if both engines should support LLM gateway, reconsidering whether Claude should also have supportsLLMGateway: true.
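The per-provider port split that the `docker.go` comment describes can be illustrated with a small sketch. The port numbers (10000, 10001) come from the comment quoted above; the `proxyPort` function itself is hypothetical, not part of the repository.

```go
package main

import "fmt"

// proxyPort maps an LLM provider to its port inside the api-proxy
// container, per the comment in pkg/workflow/docker.go. After this PR
// only Codex deploys the gateway, so only the OpenAI port is exercised.
func proxyPort(provider string) int {
	switch provider {
	case "openai":
		return 10000 // OpenAI API proxy, used by Codex
	case "anthropic":
		return 10001 // Anthropic API proxy, currently unused (Claude opts out)
	}
	return 0
}

func main() {
	fmt.Println("openai ->", proxyPort("openai"))
	fmt.Println("anthropic ->", proxyPort("anthropic"))
}
```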
Suggested change:

    - supportsLLMGateway: false, // Claude does not support LLM gateway
    + supportsLLMGateway: true,  // Claude supports LLM gateway
Fix LLM Gateway Support Flags for Codex and Claude Engines
Problem Analysis
The GitHub Actions test suite is failing with two test failures, in `TestSupportsLLMGateway` and `TestValidateStrictFirewall_LLMGatewaySupport`.

Failing tests:
- `TestSupportsLLMGateway/Codex_engine_supports_LLM_gateway` - expects Codex to support LLM gateway (true)
- `TestSupportsLLMGateway/Claude_engine_does_not_support_LLM_gateway` - expects Claude to NOT support LLM gateway (false)
- `TestValidateStrictFirewall_LLMGatewaySupport/codex_engine_with_LLM_gateway_rejects_sandbox.agent:_false_in_strict_mode` - expects Codex to have LLM gateway support

Current state:
- `pkg/workflow/codex_engine.go:44` - `supportsLLMGateway: false` ❌
- `pkg/workflow/claude_engine.go:33` - `supportsLLMGateway: true` ❌

Expected state (from tests):
- Codex: `supportsLLMGateway: true`
- Claude: `supportsLLMGateway: false`
Changes Made
- `pkg/workflow/codex_engine.go`: set `supportsLLMGateway: true`
- `pkg/workflow/claude_engine.go`: set `supportsLLMGateway: false`
- Updated `enable_api_proxy_test.go` to match new expectations
- Updated `docker_api_proxy_test.go` to match new expectations
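A minimal table-driven sketch of what `TestSupportsLLMGateway` asserts, assuming a simple engine-name lookup in place of the real engine types in `pkg/workflow`:

```go
package main

import "fmt"

// supportsLLMGateway is a hypothetical lookup standing in for the real
// engine capability flags in pkg/workflow.
func supportsLLMGateway(engine string) bool {
	return map[string]bool{
		"codex":  true,  // enables the api-proxy sidecar
		"claude": false, // skips api-proxy setup
	}[engine]
}

func main() {
	// Table-driven cases mirroring the test expectations in this PR.
	cases := []struct {
		engine string
		want   bool
	}{
		{"codex", true},
		{"claude", false},
	}
	for _, tc := range cases {
		got := supportsLLMGateway(tc.engine)
		fmt.Printf("%s: got=%v want=%v pass=%v\n", tc.engine, got, tc.want, got == tc.want)
	}
}
```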