Python: make FoundryChatClient an async context manager#5464
he-yufeng wants to merge 2 commits into microsoft:main from
Conversation
… tool loop

Fixes microsoft#5394.

When `background=True` is combined with local function tools, `FunctionInvocationLayer` calls `_inner_get_response(options=mutable_options)` repeatedly with the same dict reference across loop iterations. Once the first poll retrieves a completed background response, `continuation_token` stays in `mutable_options`, so every subsequent iteration takes the `continuation_token is not None` branch and `GET`s the same completed response instead of `POST`ing the tool results. The loop exits after `max_iterations` with empty text, and the model never sees any tool output.

After the retrieve, if the returned `ChatResponse.continuation_token` is `None` (the background response is no longer in progress), pop `continuation_token` and `background` from the shared options dict in place. The next loop iteration then falls through to the normal `responses.create`/`parse` path and posts the tool results. The diagnosis and a verified runtime monkeypatch are in the issue; this is the same fix moved in-tree.
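The stale-dict behavior described above can be reproduced with a minimal, self-contained simulation (all names here are hypothetical stand-ins for the real `FunctionInvocationLayer` loop; the polling result is faked as already completed):

```python
# Simulates the tool loop sharing one mutable options dict across iterations.
def tool_loop(options: dict, max_iterations: int = 5) -> list[str]:
    trace = []
    for _ in range(max_iterations):
        if options.get("continuation_token") is not None:
            trace.append("GET")  # poll the background response
            returned_token = None  # simulated: the response has completed
            if returned_token is None:
                # The fix: mutate the shared dict in place so the next
                # iteration falls through to the normal create/parse path.
                options.pop("continuation_token", None)
                options.pop("background", None)
        else:
            trace.append("POST")  # tool results finally reach the model
            break
    return trace

print(tool_loop({"background": True, "continuation_token": "abc"}))
# → ["GET", "POST"]; without the two pop() calls it would be ["GET"] * 5.
```

Because the dict is shared by reference, popping the keys in place is what makes the change visible to the next iteration; reassigning a new dict locally would not.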
Fixes microsoft#5428.

`built_in_chat_clients` and similar samples wrap the client in `async with client:`, which works for `OpenAIChatClient` and the Azure variants but fails on `FoundryChatClient` with `TypeError: object does not support the asynchronous context manager protocol`. The chat client holds an `AIProjectClient` internally (self-created when the caller passes `project_endpoint` + `credential`), so it also needs a lifecycle hook; otherwise the project client leaks.

Mirror the pattern already used in `FoundryChatAgent`:

- Track `_should_close_client` in `__init__`, flipped on only when we construct the project client ourselves.
- Add `async def close()` that awaits `project_client.close()` only when owned.
- Implement `__aenter__` (no-op, returns self) and `__aexit__` (awaits `close`).

`FoundryChatClient` inherits both via the existing class hierarchy, so the sample starts working without any call-site changes.
@he-yufeng please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.
eavanvalkenburg left a comment
So, I need to think about this one: currently no chat client has `__aenter__`, so this would introduce it. I'm not opposed, but I need to think about this generally. I will close this for now (I removed the `async with` from the sample you referred to).
this is from a different PR, please remove here
Fixes #5428.
The `built_in_chat_clients` sample (and any other code path that does `async with client:`) wraps the chat client in an async context manager. This works for `OpenAIChatClient` and the Azure variants but fails on `FoundryChatClient` with:
```
TypeError: 'FoundryChatClient' object does not support the asynchronous context manager protocol (missed __aexit__ method)
```
`FoundryChatClient` owns an `AIProjectClient` internally — self-created when the caller passes `project_endpoint` + `credential`, otherwise injected — so beyond the missing protocol methods, leaving without closing the self-created `project_client` also leaks it.
Fix
Mirror the pattern already used in `FoundryChatAgent` (`_agent.py`):
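A hedged sketch of that lifecycle pattern, with the class and attribute names taken from this description but the project client replaced by a stand-in (the real types live in `_agent.py` and azure-ai-projects):

```python
import asyncio

class FakeProjectClient:
    """Stand-in for AIProjectClient, just enough to observe close()."""
    def __init__(self):
        self.closed = False

    async def close(self):
        self.closed = True

class FoundryChatClientSketch:
    def __init__(self, project_client=None):
        if project_client is None:
            # We construct the project client ourselves, so we own its lifecycle.
            project_client = FakeProjectClient()
            self._should_close_client = True
        else:
            # Injected client: the caller owns the lifecycle.
            self._should_close_client = False
        self.project_client = project_client

    async def close(self):
        if self._should_close_client:
            await self.project_client.close()

    async def __aenter__(self):
        return self  # no-op on enter

    async def __aexit__(self, exc_type, exc, tb):
        await self.close()

async def _demo():
    client = FoundryChatClientSketch()
    async with client:
        pass
    print(client.project_client.closed)  # owned client gets closed

asyncio.run(_demo())
```

The `_should_close_client` flag is what keeps `async with` safe for injected clients: exiting the context only closes a project client the chat client created itself.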
`FoundryChatClient` inherits these via the existing class hierarchy, so the sample starts working without any call-site changes and without touching the public API.
Test plan