Fix interquery cache concurrency bug #5361 (fixes #5359)

Conversation
```go
if oldValueOk {
	c.usage -= oldValue.SizeInBytes()
}
```
It would be better if we tried to lock the action from the caller rather than do it as described here. Although this helps in representing the correct cache size, we'd be unnecessarily adding items to the list.
@ashutosh-narkar I added a test and comment per your request.

I replaced the fix with a one-liner.
```diff
@@ -138,6 +138,9 @@ func (c *cache) unsafeInsert(k ast.Value, v InterQueryCacheValue) (dropped int)
 		}
 	}

+	// By deleting the old value, if it exists, we ensure the usage variable stays correct
```
Can we also explain the scenario where we need to take this action? E.g., the concurrent request case.
This fix simply prevents code like this from breaking the internal state of the cache:

```go
cache.Insert(ast.StringTerm("foo6").Value, cacheValue6)
cache.Insert(ast.StringTerm("foo6").Value, cacheValue6)
```

I don't see the need to document very explicitly who might call `cache.Insert` in such a way or why they might do it. At least I wouldn't know how to write that documentation. Feel free to update the comment or give me a suggestion on how to write it.
```go
if dropped != 0 {
	t.Fatal("Expected dropped to be zero")
}
```
Can we add a test case like this to show the concurrent use case?
I added a unit test with concurrent inserts.
@asleire can you please squash your commits and sign them as well?
Done :)
Signed-off-by: Aleksander <Alekken@live.no>
Thanks a lot!
This fixes the issue where concurrent requests with identical cache keys cause the interquery cache's `usage` size counter to become invalid, as discussed here.