
Simplify retry options propagation and fix OpenAI fetch configuration #1817

Merged
pelikhan merged 8 commits into dev from copilot/fix-6a80be4d-e0a9-41af-8500-ec38578525ff
Aug 2, 2025

Conversation


Copilot AI commented Aug 2, 2025

This PR refines the retry options propagation implementation based on code review feedback, simplifying the configuration approach while ensuring proper option flow to all fetch calls.

Changes Made

Simplified Configuration Approach

Removed environment variable fallbacks from retry options resolution in expandTemplate(). The implementation now uses a cleaner precedence model:

```typescript
// Before: Complex env.vars lookup with multiple naming conventions
const retries =
    options.retries ??
    normalizeInt(env.vars["retries"]) ??
    normalizeInt(env.vars["retry_delay"]) ??
    template.retries;

// After: Simple and clear precedence
const retries = options.retries ?? template.retries;
```

OpenAI Provider Configuration

Updated OpenAIListModels to maintain retries: 0 for model listing operations, as these should not be retried to avoid unnecessary delays during model discovery:

```typescript
const fetch = await createFetch({ retries: 0, ...(options || {}) });
```
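One subtlety of the snippet above is spread order: in an object literal, properties written later win, so a caller-supplied `retries` in `options` would override the literal `retries: 0` placed before the spread. The following is a minimal standalone sketch of that semantics; the `RetryOptions` shape and `resolveRetries` helper are illustrative stand-ins, not the project's actual code:

```typescript
// Illustrative stand-in for the retry-related fields passed to createFetch().
interface RetryOptions {
    retries?: number;
}

// Mirrors the { retries: 0, ...(options || {}) } pattern: the spread comes
// last, so a caller-supplied `retries` overrides the literal 0.
function resolveRetries(options?: RetryOptions): number | undefined {
    const merged = { retries: 0, ...(options || {}) };
    return merged.retries;
}

console.log(resolveRetries());               // 0 (no override supplied)
console.log(resolveRetries({ retries: 3 })); // 3 (caller-supplied value wins)
```

To force `retries: 0` regardless of the caller, the literal would have to be written after the spread instead.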

Sample Script and Documentation

Added a comprehensive sample script retry-options-demo.genai.mts demonstrating:

  • Script-level retry configuration with all available options
  • Runtime retry option overrides using runPrompt()
  • Practical use cases and explanations

Updated documentation in metadata.md with detailed retry options reference including:

  • Complete parameter descriptions and defaults
  • Usage examples for both script-level and runtime configuration
  • Explanation of retry scenarios (rate limiting, server errors, network failures)
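The retry scenarios listed above (rate limiting, server errors, network failures) follow a common fetch-wrapper pattern. The sketch below illustrates that pattern only; the function name, signature, and backoff policy are assumptions for illustration, not GenAIScript's actual `createFetch()` implementation:

```typescript
// Illustrative retry wrapper, not GenAIScript's createFetch() itself.
// Retries 429 and 5xx responses and network errors with exponential backoff.
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

async function fetchWithRetry(
    url: string,
    init: RequestInit = {},
    retries = 3,
    delayMs = 1000,
    fetchFn: FetchLike = fetch,
): Promise<Response> {
    for (let attempt = 0; ; attempt++) {
        try {
            const res = await fetchFn(url, init);
            // Only rate limits (429) and server errors (5xx) are retryable.
            if (res.status !== 429 && res.status < 500) return res;
            if (attempt >= retries) return res;
        } catch (err) {
            // Network failure: rethrow once the attempt budget is exhausted.
            if (attempt >= retries) throw err;
        }
        // Exponential backoff: the delay doubles on each attempt.
        await new Promise((r) => setTimeout(r, delayMs * 2 ** attempt));
    }
}
```

The `fetchFn` parameter is injected here only to make the sketch testable without a live endpoint.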

Retry Options Flow

Retry options now propagate cleanly through the execution chain:

PromptScript.retries → expandTemplate() → GenerationOptions → ChatCompletionsOptions → createFetch() → LLM Provider

The runPrompt function automatically inherits retry options through mergeGenerationOptions(), since ModelOptions extends RetryOptions.
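The inheritance described above can be sketched as plain interface extension plus a spread-based merge. The field names beyond `retries` and the `mergeOptions` helper are simplified stand-ins for the real `ModelOptions` and `mergeGenerationOptions()`:

```typescript
// Simplified stand-ins for the real types; `retryDelay` and the other
// fields are illustrative, not the library's exact member names.
interface RetryOptions {
    retries?: number;
    retryDelay?: number;
}

// Because ModelOptions extends RetryOptions, any API accepting ModelOptions
// carries the retry fields along automatically.
interface ModelOptions extends RetryOptions {
    model?: string;
    temperature?: number;
}

// A spread-based merge in the spirit of mergeGenerationOptions():
// runtime options override the base, and retry fields ride along for free.
function mergeOptions(base: ModelOptions, runOptions?: ModelOptions): ModelOptions {
    return { ...base, ...(runOptions || {}) };
}

const merged = mergeOptions(
    { model: "gpt-4o", retries: 3, retryDelay: 2000 },
    { temperature: 0.2, retries: 5 },
);
// merged.retries === 5, merged.retryDelay === 2000, merged.model === "gpt-4o"
```

This is why no dedicated plumbing is needed in `runPrompt`: the retry fields are just ordinary optional properties swept along by the merge.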

Backward Compatibility

  • All existing retry functionality remains intact
  • Template-level retry configuration continues to work as before
  • Runtime option overrides still function correctly
  • No breaking changes to the public API

This simplified approach maintains the core functionality while reducing complexity and ensuring consistent retry behavior across all LLM provider operations, with appropriate exceptions for operations like model listing that should not be retried.



Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>
Copilot AI changed the title from "[WIP] The PromptArgs interface should inherit from RetryOptions and the retry parameters should be propagated all the way to the fetch arguments in openai.ts invocations (and anthropic)" to "Propagate retry options from PromptArgs to fetch calls in LLM providers" Aug 2, 2025
Copilot AI requested a review from pelikhan August 2, 2025 11:28

@pelikhan pelikhan left a comment


Pass the options in fetch in openai.ts

Comment thread packages/core/src/expander.ts Outdated

pelikhan commented Aug 2, 2025

Add support in runPrompt


pelikhan commented Aug 2, 2025

@copilot make sure the variables are propagated to the fetch call inside the completer

…h options pass-through

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>

Copilot AI commented Aug 2, 2025

Pass the options in fetch in openai.ts

Fixed in commit 7e9aaa0. Updated OpenAIListModels to pass retry options through to createFetch() instead of forcing retries: 0. All other OpenAI functions already properly pass options to fetch calls.


Copilot AI commented Aug 2, 2025

Add support in runPrompt

runPrompt already supports retry options through mergeGenerationOptions() which spreads both base options and runOptions. Since ModelOptions extends RetryOptions, retry options are automatically merged when calling runPrompt.


Copilot AI commented Aug 2, 2025

@copilot make sure the variables are propagated to the fetch call inside the completer

Fixed in commit 7e9aaa0. Retry options now propagate through the complete chain: PromptScript → expandTemplate() → GenerationOptions → ChatCompletionsOptions → createFetch() → LLM Provider. Both OpenAI and Anthropic providers correctly extract and pass retry options to fetch calls.

Comment thread packages/core/src/openai.ts Outdated
Copilot AI changed the title from "Propagate retry options from PromptArgs to fetch calls in LLM providers" to "Simplify retry options propagation and fix OpenAI fetch configuration" Aug 2, 2025
Copilot AI requested a review from pelikhan August 2, 2025 13:26
Comment thread packages/core/test/retry-options-propagation.test.ts Outdated

@pelikhan pelikhan left a comment


  • add sample script under samples/sample/genaisrc that uses those new parameters for script and runPrompt
  • update the documentation


pelikhan commented Aug 2, 2025

@copilot apply comments

… test file, add sample and documentation

Co-authored-by: pelikhan <4175913+pelikhan@users.noreply.github.com>

Copilot AI commented Aug 2, 2025

@copilot apply comments

Applied all comments in commit 87b5384:

  • Kept retries: 0 for OpenAIListModels
  • Deleted test file
  • Added sample script with retry options demo
  • Updated documentation with retry configuration details


Copilot AI commented Aug 2, 2025

  • add sample script under samples/sample/genaisrc that uses those new parameters for script and runPrompt
  • update the documentation

Added sample script samples/sample/genaisrc/retry-options-demo.genai.mts demonstrating retry parameters for both script() and runPrompt(), and updated documentation in docs/src/content/docs/reference/scripts/metadata.md with comprehensive retry options guidance in commit 87b5384.

Copilot AI requested a review from pelikhan August 2, 2025 15:09
@pelikhan pelikhan marked this pull request as ready for review August 2, 2025 16:29
@pelikhan pelikhan merged commit 0ff18d9 into dev Aug 2, 2025
11 checks passed
@pelikhan pelikhan deleted the copilot/fix-6a80be4d-e0a9-41af-8500-ec38578525ff branch August 4, 2025 08:18
