Description
When using the .NET Copilot SDK with a BYOK (Bring Your Own Key) Azure OpenAI provider, setting ReasoningEffort on SessionConfig for gpt-5.4-mini throws:
System.IO.IOException: Communication error with Copilot CLI: Request session.create failed with message: Model 'gpt-5.4-mini' does not support reasoning effort configuration. Use models.list to check which models support reasoning effort.
Expected Behavior
gpt-5.4-mini is a reasoning model and supports reasoning effort configuration via the Azure OpenAI Responses API. The SDK should either:
- Pass ReasoningEffort through to the provider API without validating against its own model registry, OR
- Include gpt-5.4-mini (and other GPT-5.x models) in its reasoning model allowlist
When using BYOK providers, the SDK should not gate ReasoningEffort based on its own internal model catalog — the provider endpoint knows what its deployed models support.
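For reference, the provider endpoint accepts reasoning effort directly via the Responses API, so the SDK-side gate is the only thing blocking it. A sketch of the direct call (endpoint and token are illustrative placeholders from the reproduction below):

```shell
# Sketch: the Azure OpenAI Responses API accepts reasoning effort directly.
# Base URL and token are placeholders; adjust for your deployment.
curl "https://my-aoai.cognitiveservices.azure.com/openai/v1/responses" \
  -H "Authorization: Bearer $AZURE_OPENAI_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-5.4-mini",
        "input": "ping",
        "reasoning": { "effort": "high" }
      }'
```

This request succeeds against the same deployment that the SDK rejects client-side.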
Reproduction
var sessionConfig = new SessionConfig
{
    Model = "gpt-5.4-mini",
    ReasoningEffort = "high",
    Provider = new ProviderConfig
    {
        Type = "openai",
        BaseUrl = "https://my-aoai.cognitiveservices.azure.com/openai/v1/",
        BearerToken = "...",
    },
};

// Throws: "Model 'gpt-5.4-mini' does not support reasoning effort configuration"
await client.CreateSessionAsync(sessionConfig);
Environment
- SDK versions tested: 0.1.32 and 0.2.1 (NuGet: GitHub.Copilot.SDK) — fails on both
- Runtime: .NET 8.0
- Provider: Azure OpenAI (BYOK) with WireApi = "responses"
Workaround
We currently catch the exception and retry without ReasoningEffort. This works, but it degrades model behavior, since reasoning effort can then never be configured for BYOK deployments.
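The workaround looks roughly like this (a sketch, not a recommendation: `CopilotClient`, the `Session` return type, and the exception-message match are assumptions based on the behavior above; `SessionConfig` and `CreateSessionAsync` are from the reproduction):

```csharp
// Sketch of the current workaround: if the SDK rejects ReasoningEffort
// for this model, drop it and retry session creation.
// Assumes the SDK types from the reproduction above; the exact exception
// message filter is based on the error text observed in this report.
async Task<Session> CreateSessionWithFallbackAsync(CopilotClient client, SessionConfig config)
{
    try
    {
        return await client.CreateSessionAsync(config);
    }
    catch (System.IO.IOException ex)
        when (ex.Message.Contains("does not support reasoning effort"))
    {
        // Degraded path: reasoning effort is silently lost for BYOK deployments.
        config.ReasoningEffort = null;
        return await client.CreateSessionAsync(config);
    }
}
```

Matching on the exception message is fragile, which is part of why a proper fix in the SDK's model gating would be preferable.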