
Integer overflow when the number of elements exceeds 2^25 #111

Closed
detohaz opened this issue Apr 19, 2019 · 11 comments

3 participants
@detohaz commented Apr 19, 2019

Take a look at the method cache2k-api\Hash2.java:Hash2<K,V>->calcMaxFill(); you will notice an integer overflow side effect there. When the number of elements exceeds 2^24 and HeapCache.TUNABLE.hashLoadPercent has its default value (64), doubling the hash table size to 2^25 causes segmentMaxFill to become negative (and 0 at the next doubling), which eventually causes a NegativeArraySizeException, because of integer overflow.
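For illustration, here is a minimal sketch of the overflow (the constant names mirror the report; this is not cache2k's actual source):

```java
public class OverflowDemo {

    static final int LOCK_SEGMENTS = 64;      // value from the report
    static final int HASH_LOAD_PERCENT = 64;  // default hashLoadPercent

    // Sketch of the buggy expression: the int multiplication overflows
    // before the divisions can bring the value back into range.
    static int calcMaxFill(int tableSize) {
        return tableSize * HASH_LOAD_PERCENT / 100 / LOCK_SEGMENTS;
    }

    public static void main(String[] args) {
        // 2^25 * 64 == 2^31, which wraps to Integer.MIN_VALUE in 32-bit int arithmetic
        System.out.println(calcMaxFill(1 << 25)); // negative
        // 2^26 * 64 == 2^32, which wraps to 0
        System.out.println(calcMaxFill(1 << 26)); // 0
    }
}
```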

I've rebuilt your library with an adjusted calcMaxFill method:
"Math.round((float)entries.length * HeapCache.TUNABLE.hashLoadPercent / 100 / LOCK_SEGMENTS);" and it works like a charm.
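The suggested expression widens to float before multiplying, so the product no longer wraps. A sketch of that fix next to an equivalent long-based variant (the names are illustrative; neither is claimed to be cache2k's actual patch):

```java
public class MaxFillFix {

    static final int LOCK_SEGMENTS = 64;
    static final int HASH_LOAD_PERCENT = 64;

    // Reporter's suggestion: perform the multiplication in float, then round.
    static int calcMaxFillFloat(int tableSize) {
        return Math.round((float) tableSize * HASH_LOAD_PERCENT / 100 / LOCK_SEGMENTS);
    }

    // Equivalent alternative: widen to long so the intermediate product stays exact.
    static int calcMaxFillLong(int tableSize) {
        return (int) ((long) tableSize * HASH_LOAD_PERCENT / 100 / LOCK_SEGMENTS);
    }

    public static void main(String[] args) {
        System.out.println(calcMaxFillFloat(1 << 25)); // 335544, no overflow
        System.out.println(calcMaxFillLong(1 << 25));  // 335544
    }
}
```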

Your friend,
H.B Xuyen

@cruftex cruftex self-assigned this Apr 19, 2019

@cruftex cruftex added the bug label Apr 19, 2019

@cruftex (Member) commented Apr 19, 2019

@detohaz: Many thanks for the bug report!
I probably never tested with that many entries. What are the heap size and cache size of your application?

@detohaz (Author) commented Apr 19, 2019

I'm sorry that I didn't pay much attention to the heap size and cache size, because I only cared about where the problem comes from (your expression for calculating segmentMaxFill) and how to fix it. It may be redundant, but I would like to attach an explanation here for hashLoadPercent = 64 and LOCK_SEGMENTS = 64 (my case).
[screenshot: "Capture", showing the calculation]
When segmentMaxFill becomes negative or zero, the 'entries' array is doubled whenever a new element is added, and the array size rockets to a negative value (1,073,741,824 -> -2,147,483,648) because of integer overflow.
In my case the number of elements is around 20 million, but the exception occurs when 10,674,603 elements have been added.
I hope this is helpful for you.
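The runaway doubling can be sketched like this (a simulation of the described behavior, not cache2k code): once maxFill is negative, the fill check passes on every insert, the table keeps doubling, and the size finally wraps negative, at which point the array allocation throws.

```java
public class ExpandDemo {

    // Doubles the table size as long as it stays positive; the final
    // doubling of 2^30 wraps to Integer.MIN_VALUE.
    static int doubleUntilOverflow(int size) {
        while (size > 0) {
            size <<= 1;
        }
        return size;
    }

    public static void main(String[] args) {
        int size = doubleUntilOverflow(1 << 25);
        System.out.println(size); // -2147483648
        try {
            Object[] entries = new Object[size]; // what the resize would attempt
        } catch (NegativeArraySizeException e) {
            System.out.println("NegativeArraySizeException, as reported");
        }
    }
}
```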

@cruftex (Member) commented Apr 19, 2019

I'm sorry that I didn't pay much attention to the heap size and cache size, because I only cared about where the problem comes from (your expression for calculating segmentMaxFill) and how to fix it.

I appreciate that.

Please mind that you are reporting a problem, but it's unclear in what usage scenario it happens and how severe it is. To decide how to proceed, and for the information of other users, it is helpful to explain how to reproduce the problem.

Depending on your usage, environment and configuration the hash table sizes can look very different.

@cruftex (Member) commented Apr 19, 2019

When the number of elements exceeds 2^24 and HeapCache.TUNABLE.hashLoadPercent has its default value (64)

So I assume the worst case would be that the problem triggers at 2^24 * 64% ≈ 10.7M entries.
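That estimate works out as follows (a quick check with the default load factor):

```java
public class ThresholdCheck {
    public static void main(String[] args) {
        long tableSize = 1L << 24;            // 16,777,216 slots
        long maxFill = tableSize * 64 / 100;  // fill level that triggers the doubling to 2^25
        System.out.println(maxFill);          // 10737418, roughly 10.7 million entries
    }
}
```

The failure reported at 10,674,603 entries lands just below this figure, plausibly because the fill level is tracked per lock segment rather than globally.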

@detohaz (Author) commented Apr 19, 2019

I'm sorry for the lack of information, but could you please clarify which cache size and heap size you need: the ones from the JVM global flags, or those of the running instance?

@cruftex (Member) commented Apr 19, 2019

I'm sorry for the lack of information, but could you please clarify which cache size and heap size you need: the ones from the JVM global flags, or those of the running instance?

The actual instance sizes when the problem hits would be interesting.

Meanwhile I produced a test and a fix. Maybe you want to check:
3c9ccff

The test just calls the hash table expansion directly and does not add entries to the cache, to minimize the test runtime and footprint.

@detohaz (Author) commented Apr 19, 2019

The total memory of the running instance at that time is around 11.4 GB, with a heap size of 11,506 MB (just for cache2k storage; the JVM MaxHeapSize is 27,305 MB).
The total number of elements in the cache at exception time is 10,674,603.

Is that what you mean? Please correct me if I'm wrong.

@cruftex (Member) commented Apr 20, 2019

@detohaz: Perfect, thanks for the information!
That is really bad luck: I did test with 10 million entries and thought probably no one would have a bigger cache ;) Sorry for the inconvenience caused by this bug.

@pompeywebb commented Jun 7, 2019

When will the fix for this bug be released?

@cruftex (Member) commented Jun 7, 2019

@pompeywebb, thanks for the "ping". I just released version 1.2.2.Final to Maven Central, which includes the fix.

@cruftex cruftex closed this Jun 7, 2019

@pompeywebb commented Jun 7, 2019

@cruftex many thanks!
