Problem Statement
The current `CacheConfig(strategy="auto")` in `BedrockModel` only injects a single cache point on the last user message (the "moving tail"). There is no stable/anchor cache point placed on the first user message to cover the system prompt + initial user prompt + tool descriptions.
This means that whenever the moving tail gets invalidated (e.g., due to context pruning, summarisation, or conversation reset), the entire prompt cache is lost and must be rebuilt from scratch. There is no fallback cache prefix to absorb the cost of the stable portions of the prompt.
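To illustrate the current behavior, here is a simplified sketch of a Converse-API-style message list after "auto" caching has run. The message shapes are an assumption for illustration, not the SDK's internal types; only the `cachePoint` content-block format follows the Bedrock Converse API.

```python
# Simplified sketch: after "auto" caching, only the LAST user message
# carries a cachePoint content block (the "moving tail").
messages = [
    {"role": "user", "content": [{"text": "initial prompt"}]},
    {"role": "assistant", "content": [{"text": "assistant reply"}]},
    {"role": "user", "content": [
        {"text": "latest prompt"},
        # Moving tail: pruning or summarisation above this point
        # invalidates the entire cached prefix.
        {"cachePoint": {"type": "default"}},
    ]},
]
```

If context management rewrites anything before that single cache point, there is no earlier checkpoint to fall back on, so the whole prefix is re-written to the cache.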
Proposed Solution
Modify `_inject_cache_point()` to inject two cache points instead of one:

- Stable prefix: A cache point appended to the first user message's content. This anchors the cache over the system prompt + first user message + tool descriptions: content that rarely changes across turns.
- Moving tail: A cache point appended to the last user message's content (current behavior). This advances with the conversation to maximise cache hit rate on recent context.
Use Case
- Long-running agents with context management: Agents that use context pruning or summarisation to stay within token limits currently lose their entire prompt cache when the moving tail is invalidated. A stable prefix ensures the system prompt + tool definitions remain cached, significantly reducing cost and latency after pruning events.
- Cost optimization for tool-heavy agents: Agents with large tool configurations (many tools with detailed schemas) pay the full cache write cost on every pruning cycle. A stable prefix covering system prompt + tools avoids re-caching this static content repeatedly.
Alternative Solutions
No response
Additional Context
No response