Conversation
**Greptile Summary**
Adds a compact routing-summary function to the provider attribution module and wires it into the embedded-runner debug log at prompt start. Two private classification helpers compute a readable label string; tests cover four routing shapes (custom, local, native, and OpenRouter).

Confidence Score: 5/5

Safe to merge; all findings are minor style/P2 concerns with no correctness or data-integrity issues. The change is a pure logging/diagnostic addition; no runtime behavior is altered. The three P2 comments cover a potentially misleading proxy-like label for known native vendors, an ambiguous log-format convention, and missing test coverage for two edge-case route classes. None block correctness or production safety. No files require special attention.
Reviews (1): Last reviewed commit: "Agents: log proxy route summary"
Pull request overview
Adds a compact, human-readable routing summary for provider requests (native/local/custom/proxy-like), and includes that summary in the embedded runner’s prompt-start debug logging to improve observability of how requests are being routed.
Changes:
- Introduce `describeProviderRequestRoutingSummary()` in `provider-attribution.ts`.
- Log the routing summary at embedded prompt start in `pi-embedded-runner`.
- Add unit tests validating routing summaries for common endpoint patterns (proxy-like, local, native OpenAI, OpenRouter).
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| src/agents/provider-attribution.ts | Adds routing summary helper + internal route/policy descriptor helpers. |
| src/agents/provider-attribution.test.ts | Adds focused assertions for routing summary output. |
| src/agents/pi-embedded-runner/run/attempt.ts | Adds routing summary to the existing prompt-start debug log. |
```ts
function describeProviderRequestRouteClass(
  policy: ProviderRequestPolicyResolution,
): "default" | "native" | "proxy-like" | "local" | "invalid" {
  if (policy.endpointClass === "default") {
    return "default";
  }
  if (policy.endpointClass === "invalid") {
    return "invalid";
  }
  if (policy.endpointClass === "local") {
    return "local";
  }
  if (policy.usesExplicitProxyLikeEndpoint) {
    return "proxy-like";
  }
  return "native";
}
```
`describeProviderRequestRouteClass()` uses `usesExplicitProxyLikeEndpoint` to decide proxy-like vs native. However `usesExplicitProxyLikeEndpoint` is currently `true` for any non-default `baseUrl` that isn't a known OpenAI endpoint (see `resolveProviderRequestPolicy`: `usesConfiguredBaseUrl && !usesKnownNativeOpenAIEndpoint`), which includes known native vendor endpoints like `mistral-public`, `xai-native`, `google-*`, etc. This will mislabel many native routes as proxy-like in the new routing summary. Consider basing proxy-like on `endpointClass` (e.g. `custom` and `openrouter`) and treating all other non-default/non-local/non-invalid endpoint classes as native.
```ts
it("summarizes proxy-like, local, and native routing compactly", () => {
  expect(
    describeProviderRequestRoutingSummary({
      provider: "openai",
      api: "openai-responses",
      baseUrl: "https://proxy.example.com/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe("provider=openai api=openai-responses endpoint=custom route=proxy-like policy=none");

  expect(
    describeProviderRequestRoutingSummary({
      provider: "qwen",
      api: "openai-responses",
      baseUrl: "http://localhost:1234/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe("provider=qwen api=openai-responses endpoint=local route=local policy=none");

  expect(
    describeProviderRequestRoutingSummary({
      provider: "openai",
      api: "openai-responses",
      baseUrl: "https://api.openai.com/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe(
    "provider=openai api=openai-responses endpoint=openai-public route=native policy=hidden",
  );

  expect(
    describeProviderRequestRoutingSummary({
      provider: "openrouter",
      api: "openai-responses",
      baseUrl: "https://openrouter.ai/api/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe(
    "provider=openrouter api=openai-responses endpoint=openrouter route=proxy-like policy=documented",
  );
});
```
The new helper has a unit test for OpenAI/OpenRouter/local/custom, but it doesn't cover other known native endpoint classes (e.g. `mistral-public`, `xai-native`, `google-generative-ai`, etc.). Given the current route classification logic, adding at least one assertion for a non-OpenAI native endpoint would prevent regressions where native endpoints are summarized as proxy-like.
```ts
const routingSummary = describeProviderRequestRoutingSummary({
  provider: params.provider,
  api: params.model.api,
  baseUrl: params.model.baseUrl,
  capability: "llm",
  transport: "stream",
});
log.debug(
  `embedded run prompt start: runId=${params.runId} sessionId=${params.sessionId} ` +
    `routing=${routingSummary}`,
);
```
`describeProviderRequestRoutingSummary()` does policy resolution work on every prompt start even when debug logs are disabled. Since this path runs per turn and the file already uses `log.isEnabled("debug")` for heavier debug-only computation, consider guarding the `routingSummary` computation and the debug log with the same check to avoid unnecessary overhead.
```suggestion
if (log.isEnabled("debug")) {
  const routingSummary = describeProviderRequestRoutingSummary({
    provider: params.provider,
    api: params.model.api,
    baseUrl: params.model.baseUrl,
    capability: "llm",
    transport: "stream",
  });
  log.debug(
    `embedded run prompt start: runId=${params.runId} sessionId=${params.sessionId} ` +
      `routing=${routingSummary}`,
  );
}
```
```ts
function describeProviderRequestRouteClass(
  policy: ProviderRequestPolicyResolution,
): "default" | "native" | "proxy-like" | "local" | "invalid" {
  if (policy.endpointClass === "default") {
    return "default";
  }
  if (policy.endpointClass === "invalid") {
    return "invalid";
  }
  if (policy.endpointClass === "local") {
    return "local";
  }
  if (policy.usesExplicitProxyLikeEndpoint) {
    return "proxy-like";
  }
  return "native";
}
```
**"proxy-like" label applied to non-OpenAI native vendors**
`usesExplicitProxyLikeEndpoint` is `true` for any configured base URL that isn't one of the three OpenAI-family native endpoints (`openai-public`, `openai-codex`, `azure-openai`). This means known native vendor hosts — Groq (`api.groq.com`), Anthropic (`api.anthropic.com`), xAI, Mistral, etc. — all surface as `route=proxy-like` in the new log line, even though the `endpoint=groq-native` / `endpoint=anthropic-public` field correctly identifies them as native.
The mismatch can trip up log readers: `endpoint=groq-native route=proxy-like` looks contradictory. Since this is baked into the existing `usesExplicitProxyLikeEndpoint` semantics, the fix belongs in `describeProviderRequestRouteClass` — either use the broader `isKnownNativeEndpoint` signal, or add a comment explaining why only OpenAI-family endpoints count as "native" in this routing context.
Addressed in 7412638cde. The route classifier now treats only `custom` and `openrouter` endpoints as proxy-like; known vendor-native endpoint classes now summarize as `route=native`.
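For readers following the thread, the resolved rule can be sketched like this (an illustrative sketch only: the name `describeRouteClass` and the flat `endpointClass: string` signature are simplifications, not the commit's exact code):

```typescript
type RouteClass = "default" | "native" | "proxy-like" | "local" | "invalid";

// Sketch of the post-fix rule: only aggregator-style endpoint classes
// ("custom", "openrouter") are proxy-like; every other known endpoint
// class (groq-native, anthropic-public, mistral-public, ...) is native.
function describeRouteClass(endpointClass: string): RouteClass {
  if (
    endpointClass === "default" ||
    endpointClass === "invalid" ||
    endpointClass === "local"
  ) {
    return endpointClass; // pass-through classes
  }
  if (endpointClass === "custom" || endpointClass === "openrouter") {
    return "proxy-like";
  }
  return "native";
}
```

Under this rule `endpoint=groq-native` pairs with `route=native`, removing the contradiction flagged above.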
```ts
log.debug(
  `embedded run prompt start: runId=${params.runId} sessionId=${params.sessionId} ` +
    `routing=${routingSummary}`,
);
```
**Log format nests multi-key string under a single `routing=` prefix**
The current output looks like `routing=provider=openai api=openai-responses ...`, which is ambiguous for log aggregators (Loki, Datadog, grep) that split on the first `=`. Inlining the fields directly at the top level is the more machine-friendly convention:
```suggestion
log.debug(
  `embedded run prompt start: runId=${params.runId} sessionId=${params.sessionId} ` +
    routingSummary,
);
```
Addressed in 7412638cde. The prompt-start log now appends the routing summary fields directly instead of nesting them behind `routing=`.
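To see why the nested form was awkward, here is a toy logfmt-style splitter of the kind the comment alludes to (a sketch only; Loki and Datadog use their own parsers):

```typescript
// Split a line on spaces, then split each token on the FIRST "=".
function parseLogfmt(line: string): Record<string, string> {
  const fields: Record<string, string> = {};
  for (const token of line.split(" ")) {
    const i = token.indexOf("=");
    if (i > 0) {
      fields[token.slice(0, i)] = token.slice(i + 1);
    }
  }
  return fields;
}

// Nested form: "routing" captures only the first sub-field.
const nested = parseLogfmt("runId=r1 routing=provider=openai api=openai-responses");
console.log(nested.routing); // "provider=openai" (truncated, misleading)

// Inlined form: every field parses cleanly at the top level.
const inlined = parseLogfmt("runId=r1 provider=openai api=openai-responses");
console.log(inlined.provider); // "openai"
```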
```ts
it("summarizes proxy-like, local, and native routing compactly", () => {
  expect(
    describeProviderRequestRoutingSummary({
      provider: "openai",
      api: "openai-responses",
      baseUrl: "https://proxy.example.com/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe("provider=openai api=openai-responses endpoint=custom route=proxy-like policy=none");

  expect(
    describeProviderRequestRoutingSummary({
      provider: "qwen",
      api: "openai-responses",
      baseUrl: "http://localhost:1234/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe("provider=qwen api=openai-responses endpoint=local route=local policy=none");

  expect(
    describeProviderRequestRoutingSummary({
      provider: "openai",
      api: "openai-responses",
      baseUrl: "https://api.openai.com/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe(
    "provider=openai api=openai-responses endpoint=openai-public route=native policy=hidden",
  );

  expect(
    describeProviderRequestRoutingSummary({
      provider: "openrouter",
      api: "openai-responses",
      baseUrl: "https://openrouter.ai/api/v1",
      transport: "stream",
      capability: "llm",
    }),
  ).toBe(
    "provider=openrouter api=openai-responses endpoint=openrouter route=proxy-like policy=documented",
  );
});
```
**Missing coverage for "default" and "invalid" route classes**
The new test covers `custom` (proxy-like), `local`, `openai-public` (native), and `openrouter` (proxy-like), but the `"default"` (no `baseUrl`) and `"invalid"` (malformed `baseUrl`) branches of `describeProviderRequestRouteClass` are unexercised. Two extra assertions would close the gap:
```ts
// default — no baseUrl
expect(
  describeProviderRequestRoutingSummary({ provider: "openai", api: "openai-responses" }),
).toBe("provider=openai api=openai-responses endpoint=default route=default policy=none");

// invalid — unparseable baseUrl
expect(
  describeProviderRequestRoutingSummary({
    provider: "openai",
    api: "openai-responses",
    baseUrl: "javascript:alert(1)",
  }),
).toBe("provider=openai api=openai-responses endpoint=invalid route=invalid policy=none");
```
Addressed in 7412638cde. Added assertions for the `default` and `invalid` route classes, plus a known-native Groq case so the route label stays aligned with the endpoint class.
🔒 Aisle Security Analysis
We found 1 potential security issue(s) in this PR:
1. 🔵 Log forging (CRLF injection) via unsanitized provider/api values in embedded-agent debug logs

Description
Vulnerable code:

```ts
return [
  `provider=${provider}`,
  `api=${api}`,
  ...
].join(" ");
```

Recommendation
Sanitize/escape control characters before including these values in log messages, or (preferably) log them as structured fields so the logger handles escaping. Example sanitization helper:

```ts
function sanitizeForLog(value: string): string {
  // Replace ASCII control chars (0x00-0x1F, 0x7F) with safe escapes
  return value.replace(/[\x00-\x1F\x7F]/g, (ch) => {
    switch (ch) {
      case "\n": return "\\n";
      case "\r": return "\\r";
      case "\t": return "\\t";
      default: return `\\x${ch.charCodeAt(0).toString(16).padStart(2, "0")}`;
    }
  });
}

const api = sanitizeForLog(normalizeOptionalLowercaseString(input.api) ?? "unknown");
const provider = sanitizeForLog(policy.provider ?? "unknown");
```

Or, if the logger supports it, switch to structured logging:

```ts
log.debug(
  { runId, sessionId, provider, api, endpoint: policy.endpointClass, route: routeClass, policy: routingPolicy },
  "embedded run prompt start",
);
```

Analyzed PR: #64754 at commit
Last updated on: 2026-04-11T11:38:25Z
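As a quick sanity check of the helper suggested in the recommendation above (body reproduced from it), a CRLF-injection attempt stays on a single log line:

```typescript
function sanitizeForLog(value: string): string {
  // Replace ASCII control chars (0x00-0x1F, 0x7F) with safe escapes
  return value.replace(/[\x00-\x1F\x7F]/g, (ch) => {
    switch (ch) {
      case "\n": return "\\n";
      case "\r": return "\\r";
      case "\t": return "\\t";
      default: return `\\x${ch.charCodeAt(0).toString(16).padStart(2, "0")}`;
    }
  });
}

// A hostile provider value attempting to forge a second log line:
const forged = "openai\r\nlevel=error msg=forged";
console.log(sanitizeForLog(forged)); // openai\r\nlevel=error msg=forged (escapes are literal text)
```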
c9731a5 to 3a668e9
Merged via squash.
Summary
Testing
- pnpm test src/agents/provider-attribution.test.ts
- pnpm test src/agents/pi-embedded-runner/run/attempt.test.ts