feat(requests-result): 🔥 Add ttl to updateRequestsCache #477
Add a `ttl` parameter to the `updateRequestsCache` function, allowing users to set an expiration time for each cache record when updating multiple keys. If a `ttl` value isn't provided, the cache record doesn't expire.
✅ Closes: 471
PR Checklist
Please check if your PR fulfills the following requirements:
PR Type
What kind of change does this PR introduce?
What is the current behavior?
Currently, the `updateRequestsCache` function updates the cache state for a given set of keys to a specified value. However, it doesn't allow setting an expiration time for these cache records.
Issue Number: 471
What is the new behavior?
With this update, users can pass a `ttl` parameter when calling `updateRequestsCache` to set an expiration time for each cache record. Once the given time has elapsed, the cache record for the key reverts to `'none'`. If a `ttl` value isn't provided, the cache record persists and doesn't expire automatically.
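The behavior above can be sketched as follows. This is a minimal, self-contained illustration rather than the library's actual implementation: the cache value names (`'none' | 'partial' | 'full'`), the helper signatures, and the explicit `now` parameter (used here to make expiry deterministic instead of relying on timers) are assumptions for demonstration.

```typescript
// Hypothetical sketch of a ttl-aware requests cache. Each record stores an
// optional expiry timestamp; an absent `expiresAt` means the record never
// expires, and reads treat an expired record as 'none'.
type CacheValue = 'none' | 'partial' | 'full';

interface CacheRecord {
  value: CacheValue;
  expiresAt?: number; // absent => never expires
}

const cache = new Map<string, CacheRecord>();

// Update several keys at once, optionally attaching a ttl (in ms).
function updateRequestsCache(
  keys: string[],
  value: CacheValue,
  ttl?: number,
  now: number = Date.now()
): void {
  for (const key of keys) {
    cache.set(key, {
      value,
      // Only set an expiry when a ttl was provided.
      ...(ttl !== undefined ? { expiresAt: now + ttl } : {}),
    });
  }
}

// Reading a record checks the expiry: an expired record reverts to 'none'.
function getRequestCache(key: string, now: number = Date.now()): CacheValue {
  const record = cache.get(key);
  if (!record) return 'none';
  if (record.expiresAt !== undefined && now >= record.expiresAt) return 'none';
  return record.value;
}
```

For example, `updateRequestsCache(['users'], 'full', 1000)` marks the `users` record as `'full'` for one second, after which reads see `'none'` again, while `updateRequestsCache(['posts'], 'full')` marks `posts` indefinitely.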
Does this PR introduce a breaking change?
Other information
The `ttl` parameter gives users finer control over cache management in their applications. It is particularly useful when the validity of cached data changes over time, such as data that should be refetched after a fixed interval.