Simplify retry options propagation and fix OpenAI fetch configuration#1817
Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
The `PromptArgs` interface should inherit from `RetryOptions`, and the retry parameters should be propagated all the way to the fetch arguments in `openai.ts` invocations (and in the Anthropic provider).
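Sketched concretely, that inheritance might look like the following (field names such as `retries` and `retryDelay` are illustrative assumptions, not the confirmed genaiscript API):

```typescript
// Hypothetical shapes for illustration; the real genaiscript
// interfaces may use different field names.
interface RetryOptions {
    /** maximum number of retry attempts */
    retries?: number
    /** initial delay between attempts, in milliseconds */
    retryDelay?: number
}

// PromptArgs inherits the retry fields, so callers can set them
// alongside the usual prompt parameters and they travel with the
// rest of the options down to the fetch call.
interface PromptArgs extends RetryOptions {
    model?: string
    temperature?: number
}

const args: PromptArgs = { model: "gpt-4o", retries: 3, retryDelay: 1000 }
```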
pelikhan
left a comment
Pass the options to `fetch` in `openai.ts`.
Add support in `runPrompt`.
@copilot make sure the variables are propagated to the fetch call inside the completer |
Fixed in commit 7e9aaa0. Updated `OpenAIListModels` to pass retry options through to `createFetch()` instead of forcing `retries: 0`. All other OpenAI functions already properly pass options to fetch calls.
`runPrompt` already supports retry options through `mergeGenerationOptions()`, which spreads both the base options and `runOptions`. Since `ModelOptions` extends `RetryOptions`, retry options are automatically merged when calling `runPrompt`.
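A sketch of how a spread-based merge carries the retry fields along with everything else (the helper name mirrors `mergeGenerationOptions`, but this body is an assumption):

```typescript
interface RetryOptions {
    retries?: number
    retryDelay?: number
}

type GenerationOptions = RetryOptions & { model?: string }

// Later sources win: run-level options override the base options,
// and the retry fields ride along via the spreads with no special casing.
function mergeGenerationOptions(
    base: GenerationOptions,
    runOptions?: GenerationOptions
): GenerationOptions {
    return { ...base, ...runOptions }
}

const merged = mergeGenerationOptions(
    { model: "gpt-4o", retries: 3 },
    { retryDelay: 500 }
)
// merged → { model: "gpt-4o", retries: 3, retryDelay: 500 }
```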
Fixed in commit 7e9aaa0. Retry options now propagate through the complete chain: `PromptScript` → `expandTemplate()` → `GenerationOptions` → `ChatCompletionsOptions` → `createFetch()` → LLM provider. Both the OpenAI and Anthropic providers correctly extract and pass retry options to fetch calls.
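In outline, a provider call extracts the retry fields and hands them to the fetch factory; this is a sketch of the pattern, not the actual `openai.ts` code:

```typescript
interface RetryOptions {
    retries?: number
    retryDelay?: number
}

interface FetchConfig {
    retries: number
    retryDelay: number
}

// Stand-in for createFetch(): resolves defaults and returns the
// configuration the wrapped fetch would use.
function createFetchConfig(options?: RetryOptions): FetchConfig {
    const { retries = 0, retryDelay = 1000 } = options ?? {}
    return { retries, retryDelay }
}

// Provider code threads the caller's options (in genaiscript,
// ChatCompletionsOptions would carry the retry fields) straight
// through to the factory instead of dropping them.
function completerFetchConfig(options?: RetryOptions): FetchConfig {
    return createFetchConfig(options)
}

const cfg = completerFetchConfig({ retries: 3, retryDelay: 250 })
// cfg → { retries: 3, retryDelay: 250 }
```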
pelikhan
left a comment
- add a sample script under `samples/sample/genaisrc` that uses those new parameters for `script` and `runPrompt`
- update the documentation
@copilot apply comments
Added sample script.
This PR refines the retry options propagation implementation based on code review feedback, simplifying the configuration approach while ensuring proper option flow to all fetch calls.
## Changes Made

### Simplified Configuration Approach
Removed environment variable fallbacks from retry options resolution in `expandTemplate()`. The implementation now uses a cleaner precedence model.

### OpenAI Provider Configuration
Updated `OpenAIListModels` to maintain `retries: 0` for model listing operations, as these should not be retried, to avoid unnecessary delays during model discovery.

### Sample Script and Documentation
Added a comprehensive sample script, `retry-options-demo.genai.mts`, demonstrating retry options with `script()` and `runPrompt()`. Updated the documentation in `metadata.md` with a detailed retry options reference.

## Retry Options Flow
Retry options now propagate cleanly through the execution chain:
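At the fetch end of that chain, a retry loop with exponential backoff might look like this sketch (the option names are assumed, not genaiscript's actual implementation):

```typescript
interface RetryOptions {
    retries?: number    // retry attempts allowed after the first try
    retryDelay?: number // initial backoff, in milliseconds
    maxDelay?: number   // upper bound on the backoff
}

// Wraps an async operation with retry-on-failure and doubling backoff.
async function withRetries<T>(
    op: () => Promise<T>,
    { retries = 3, retryDelay = 1000, maxDelay = 30000 }: RetryOptions = {}
): Promise<T> {
    let delay = retryDelay
    for (let attempt = 0; ; attempt++) {
        try {
            return await op()
        } catch (err) {
            if (attempt >= retries) throw err // retry budget exhausted
            await new Promise((resolve) => setTimeout(resolve, delay))
            delay = Math.min(delay * 2, maxDelay) // exponential backoff, capped
        }
    }
}
```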
The `runPrompt` function automatically inherits retry options through `mergeGenerationOptions()`, since `ModelOptions` extends `RetryOptions`.

## Backward Compatibility
This simplified approach maintains the core functionality while reducing complexity and ensuring consistent retry behavior across all LLM provider operations, with appropriate exceptions for operations like model listing that should not be retried.
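For such exceptions, the override can be as simple as spreading the caller's options and pinning `retries` (an illustrative helper, not the actual `OpenAIListModels` body):

```typescript
interface RetryOptions {
    retries?: number
    retryDelay?: number
}

// Model discovery should fail fast: keep the caller's other fetch
// options but force retries off for the listing call.
function listModelsFetchOptions(options?: RetryOptions): RetryOptions {
    return { ...options, retries: 0 }
}

const opts = listModelsFetchOptions({ retries: 5, retryDelay: 200 })
// opts → { retries: 0, retryDelay: 200 }
```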