Description
`default_headers` on `FoundryChatClient` and `FoundryAgent` is accepted and stored on the instance, but never reaches outbound HTTP requests. The underlying `AsyncOpenAI` client is built by `AIProjectClient.get_openai_client()`, which is called without forwarding the headers. The documented parameter is effectively dead.
Discovered while trying to add a custom header to every outbound call. Tracking down why the header never arrived cost several hours because the parameter looks wired up.
Code Sample
No network or real credentials needed — I inspected the underlying `AsyncOpenAI` client's `_custom_headers` directly, which is what actually rides on every outbound request.
```python
"""
Repro: FoundryChatClient(default_headers=...) is silently ignored,
while OpenAIChatClient(default_headers=...) works correctly.

    pip install agent-framework-foundry agent-framework-openai azure-identity
    python repro.py
"""
from agent_framework.foundry import FoundryChatClient
from agent_framework_openai import OpenAIChatClient
from azure.identity import DefaultAzureCredential

CUSTOM_HEADERS = {"x-custom-header": "repro-value"}

# --- FoundryChatClient: header is DROPPED ---
foundry = FoundryChatClient(
    project_endpoint="https://example.services.ai.azure.com",
    model="gpt-4o",
    credential=DefaultAzureCredential(),
    default_headers=CUSTOM_HEADERS,
)
foundry_outbound = dict(foundry.client._custom_headers)
print("FoundryChatClient outbound headers:", foundry_outbound)
assert "x-custom-header" not in foundry_outbound, "header present -- bug appears fixed"
print(" -> x-custom-header MISSING (bug)\n")

# --- OpenAIChatClient (kwarg path): header is kept ---
openai_client = OpenAIChatClient(
    model="gpt-4o",
    base_url="https://example.openai.azure.com/openai/v1",
    api_key="placeholder",
    default_headers=CUSTOM_HEADERS,
)
openai_outbound = dict(openai_client.client._custom_headers)
print("OpenAIChatClient outbound headers:", openai_outbound)
assert "x-custom-header" in openai_outbound, "header missing from the kwarg path"
print(" -> x-custom-header PRESENT (correct)")
```
Expected output

```
FoundryChatClient outbound headers: {}
 -> x-custom-header MISSING (bug)

OpenAIChatClient outbound headers: {'x-custom-header': 'repro-value'}
 -> x-custom-header PRESENT (correct)
```
Expected vs actual
| Action | Expected | Actual |
| --- | --- | --- |
| `FoundryChatClient(default_headers={"X": "1"})` → request | request carries `X: 1` | header dropped |
| `FoundryAgent(default_headers={"X": "1"})` → request | request carries `X: 1` | header dropped |
| `OpenAIChatClient(default_headers={"X": "1"})` with kwargs (no pre-built client) | request carries `X: 1` | ✅ works |
| `OpenAIChatClient(async_client=..., default_headers={"X": "1"})` (pre-built client) | request carries `X: 1` | header dropped (same short-circuit) |
Error Messages / Stack Traces
None — the headers are dropped silently; no error or warning is raised.
Package Versions
- agent-framework-foundry 1.0.0 and 1.1.0 (all released versions)
- agent-framework-openai 1.0.0 and 1.1.0 (same short-circuit in `_shared.py`)
- Transitives: openai==2.30.0, azure-ai-projects==2.0.1
Python Version
3.13
Additional Context
Root cause
Three sites, same shape of bug.
1. `agent_framework_foundry/_chat_client.py` (`RawFoundryChatClient.__init__`, around line 202 in 1.0.0 / line 207 in 1.1.0)

```python
super().__init__(
    model=resolved_model,
    async_client=project_client.get_openai_client(),  # <- default_headers NOT forwarded
    default_headers=default_headers,  # <- stored on self.default_headers, never used
    ...
)
```

`AIProjectClient.get_openai_client()` already accepts `default_headers` in `**kwargs` and forwards them into `AsyncOpenAI(default_headers=...)` — but it's called with no arguments here.
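The forwarding gap can be shown without the real SDKs. The sketch below uses hypothetical stand-ins (`FakeAsyncOpenAI` and `FakeProjectClient` are inventions for illustration, mimicking the reported behavior of `AsyncOpenAI` and `AIProjectClient`): the factory faithfully forwards `**kwargs`, so headers survive exactly when the caller passes them through.

```python
class FakeAsyncOpenAI:
    """Stand-in for AsyncOpenAI: constructor headers ride on every request."""
    def __init__(self, default_headers=None):
        self._custom_headers = dict(default_headers or {})

class FakeProjectClient:
    """Stand-in for AIProjectClient: forwards **kwargs into the client."""
    def get_openai_client(self, **kwargs):
        return FakeAsyncOpenAI(**kwargs)

project = FakeProjectClient()

# Buggy call shape: nothing forwarded, headers never reach the client
dropped = project.get_openai_client()
print(dropped._custom_headers)

# Fixed call shape: headers forwarded through the boundary
kept = project.get_openai_client(default_headers={"x-custom-header": "v"})
print(kept._custom_headers)
```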
2. `agent_framework_foundry/_agent.py` (`_RawFoundryAgentClient.__init__`, around line 199)

```python
async_client = self.project_client.get_openai_client()  # same gap
super().__init__(
    async_client=async_client,
    default_headers=default_headers,  # same no-op storage
    ...
)
```
3. `agent_framework_openai/_shared.py` (pre-built `async_client` short-circuit)

```python
if client:
    return openai_settings, client, False  # headers in `merged_headers` are never applied to the client
```

When a caller supplies `async_client=...`, `default_headers` from the kwargs is collected into `merged_headers` and then silently thrown away. This is what makes the Foundry path lose the headers even after `super().__init__()`.
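The short-circuit shape can be reproduced in isolation. This is a toy sketch (names like `ToyClient` and `resolve_client` are stand-ins, not the real `_shared.py` code) showing how headers collected into `merged_headers` vanish on the pre-built-client path while the kwarg path keeps them:

```python
class ToyClient:
    """Stand-in for AsyncOpenAI: headers live in _custom_headers."""
    def __init__(self, default_headers=None):
        self._custom_headers = dict(default_headers or {})

def resolve_client(client=None, default_headers=None):
    # Mirrors the reported short-circuit: headers are collected...
    merged_headers = dict(default_headers or {})
    if client:
        # ...then dropped: the pre-built client is returned untouched
        return client
    # kwarg path: a fresh client gets the merged headers
    return ToyClient(default_headers=merged_headers)

prebuilt = ToyClient()
same = resolve_client(client=prebuilt, default_headers={"X": "1"})
print(same._custom_headers)   # headers lost on the pre-built path

fresh = resolve_client(default_headers={"X": "1"})
print(fresh._custom_headers)  # kwarg path keeps them
```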
Proposed fix
`_chat_client.py` — forward headers into `get_openai_client`:

```python
async_client = project_client.get_openai_client(
    default_headers=dict(default_headers) if default_headers else None,
)
super().__init__(
    model=resolved_model,
    async_client=async_client,
    instruction_role=instruction_role,
    compaction_strategy=compaction_strategy,
    tokenizer=tokenizer,
    additional_properties=additional_properties,
)
```
Same change needed in _agent.py.
`_shared.py` — when a caller supplies both `async_client=...` and `default_headers=...`, either merge the headers into the existing client's `_custom_headers` or raise/warn that the combination is unsupported. Silently dropping them is the worst option.
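One possible shape for that merge-or-warn behavior, as a sketch only: `apply_default_headers` is a hypothetical helper name, and the duck-typed check on `_custom_headers` is an assumption about the client internals, not the real `_shared.py` implementation.

```python
import warnings

def apply_default_headers(client, merged_headers):
    """Apply collected headers to a pre-built client instead of dropping them.

    Merges into the client's _custom_headers dict when one exists; otherwise
    warns loudly rather than failing silently.
    """
    if not merged_headers:
        return client
    existing = getattr(client, "_custom_headers", None)
    if isinstance(existing, dict):
        existing.update(merged_headers)
    else:
        warnings.warn(
            "default_headers was supplied alongside a pre-built async_client "
            "but could not be applied; construct the client with the headers "
            "instead.",
            UserWarning,
        )
    return client

class ToyClient:
    """Minimal stand-in with the expected header dict."""
    def __init__(self):
        self._custom_headers = {}

patched = apply_default_headers(ToyClient(), {"x-custom-header": "v"})
print(patched._custom_headers)
```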
Workaround for users in the meantime
The options dict supports per-call `extra_headers`, which the framework forwards correctly to `client.responses.parse(extra_headers=...)`:

```python
options = {
    "temperature": 0.2,
    "extra_headers": {"x-custom-header": "value"},
}
response = await agent.run(messages, options=options)
```
Not a permanent fix — it requires callers to remember to pass the headers on every call — but it works today without SDK changes.
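To reduce the chance of forgetting, a caller can centralize the per-call injection in a small helper. This is entirely user-side and hypothetical (`ALWAYS_HEADERS` and `with_headers` are names made up for the sketch):

```python
ALWAYS_HEADERS = {"x-custom-header": "value"}

def with_headers(options=None):
    """Return a copy of `options` with the always-on headers merged in.

    Per-call extra_headers still win on key conflicts.
    """
    opts = dict(options or {})
    opts["extra_headers"] = {**ALWAYS_HEADERS, **opts.get("extra_headers", {})}
    return opts

opts = with_headers({"temperature": 0.2})
print(opts)
```

Every call then becomes `agent.run(messages, options=with_headers(...))`, so the header travels on each request without being repeated at every call site.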
Scope / risk of the fix
Low. AIProjectClient.get_openai_client(default_headers=...) is already the supported API (azure-ai-projects). The fix just forwards an existing parameter through one extra boundary. No behavior change for callers who don't pass default_headers.
Related
- `header_provider` hooks exist for MCP tool clients, but the concept was never extended to the chat client itself.