feat(indexer): Improve cache memory efficiency #260
Conversation
Force-pushed from 64e5a79 to df8fbac
Codecov Report

```diff
@@            Coverage Diff             @@
##             main     #260      +/-   ##
==========================================
- Coverage   63.04%   62.18%   -0.86%
==========================================
  Files          34       34
  Lines        3810     3819       +9
==========================================
- Hits         2402     2375      -27
- Misses       1207     1246      +39
+ Partials      201      198       -3
```

Continue to review the full report at Codecov.
Force-pushed from 431fd93 to d59f37c
While I am marking this ready for review, I'm not sure how much of a performance difference this will make.
Force-pushed from d59f37c to 930f82f
Instead of using a complicated cache implementation, just use a handful of sync.Maps along with a "remove eldest" replacement policy.
Force-pushed from dfccdc6 to 342ea05
```go
Receipts   []*model.Receipt
Block      *model.Block
Receipts   []*model.Receipt
UniqueTxes []*model.Transaction
```
Oh, yeah, this is correct; I shouldn't have changed this.
Instead of using a complicated cache implementation, just use a handful
of sync.Maps along with a "remove eldest" replacement policy. Since our
critical-path access patterns are overwhelmingly for temporally recent data,
the new policy also makes it impossible for the hot dataset to be evicted.

Implements #252