
Conversation

@ihower (Contributor) commented Nov 18, 2025

Add prompt_cache_retention to ModelSettings and pass it to the API to enable optional 24h extended prompt caching. This feature is supported by both the Responses API and the Chat Completions API.

See https://platform.openai.com/docs/guides/prompt-caching#prompt-cache-retention
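
For illustration, a minimal usage sketch with the Agents SDK (the agent wiring below is illustrative; the new prompt_cache_retention field and the "24h" value follow this PR and the linked prompt-caching guide):

```python
# Minimal sketch: enabling 24h extended prompt caching via ModelSettings.
# prompt_cache_retention is the field added in this PR; the agent/runner
# setup around it is just an example.
from agents import Agent, ModelSettings, Runner

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model_settings=ModelSettings(prompt_cache_retention="24h"),
)

result = Runner.run_sync(agent, "Hello!")
print(result.final_output)
```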

@seratch added the enhancement (New feature or request) and feature:core labels on Nov 18, 2025
@seratch added this to the 0.6.x milestone on Nov 18, 2025
@seratch (Member) left a comment:

Thank you!

@seratch merged commit 48164ec into openai:main on Nov 18, 2025
9 checks passed
