
Memory leak #126

Open
LennartRoeder opened this issue Nov 4, 2020 · 3 comments

Comments

@LennartRoeder

Hi,
I ran into some serious memory issues. I could increase memory up to a point, but with the 90M entries I try to write, performance drops once I reach the memory limit, with intensive GC activity. I solved it by simply switching to https://github.com/fusesource/leveldbjni.
Here is an example with a container limited to 900 MB RAM. The same happens with 4, 8, or 12 GB RAM.
[screenshots: levelDB, levelDB2]
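
For anyone landing here with the same symptom, a minimal sketch of the switch described above, assuming nothing beyond what the report states: both the pure-Java port and fusesource/leveldbjni expose the org.iq80.leveldb API, so the change amounts to swapping the factory. The "articles-db" path and the placeholder value are illustrative, not from the report.

```java
import org.fusesource.leveldbjni.JniDBFactory;   // native binding the reporter switched to
// import org.iq80.leveldb.impl.Iq80DBFactory;   // pure-Java factory in use before the switch
import org.iq80.leveldb.DB;
import org.iq80.leveldb.Options;

import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class OpenDb {
    public static void main(String[] args) throws IOException {
        Options options = new Options().createIfMissing(true);
        // Before the switch: DB db = Iq80DBFactory.factory.open(new File("articles-db"), options);
        DB db = JniDBFactory.factory.open(new File("articles-db"), options); // same interface, native backend
        try {
            db.put("WS_SHP_ARTICLE_kubis-10000002-1".getBytes(StandardCharsets.UTF_8),
                   "[]".getBytes(StandardCharsets.UTF_8)); // empty JSONArray as a placeholder value
        } finally {
            db.close();
        }
    }
}
```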

@pcmind
Contributor

pcmind commented Nov 4, 2020

Could you provide example code that replicates your scenario? What kind of object is using all your memory? Which object is retaining them?

Be aware that memory used by C++ code won't appear as heap usage, nor will it be constrained by the JVM max heap size.
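
To make that concrete, a minimal sketch assuming the JNI backend: its block cache and write buffer live outside the JVM heap and are sized through Options rather than through -Xmx, so a container limit needs headroom for them on top of the heap. The sizes below are arbitrary placeholders.

```java
import org.iq80.leveldb.Options;

public class OffHeapBounds {
    public static void main(String[] args) {
        // With a native backend, the block cache and the memtable (write buffer)
        // are allocated outside the JVM heap; they are bounded here, not by -Xmx.
        Options options = new Options()
                .createIfMissing(true)
                .cacheSize(64L * 1024 * 1024)       // block cache, assumed 64 MB
                .writeBufferSize(8 * 1024 * 1024);  // memtable flushed at ~8 MB (assumed)
        System.out.println("LevelDB options configured; pass them to the factory's open()");
    }
}
```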

@LennartRoeder
Author

@pcmind I am aware that the C++ memory does not affect the JVM, but my Kubernetes metrics show no leakage there, and the app does not crash or slow down due to lack of memory.

My keys look like this: WS_SHP_ARTICLE_kubis-10000002-1
The payload is an org.json.JSONArray with org.json.JSONObjects inside.

My small example has 20 million keys and 30 GB of data (the test in the screenshots was done with 2M entries and 1 GB RAM).
I use a HashMap and insert the object under each key, and once I know my JSONArray is complete, I persist it, so I don't have to deal with IO on lookups. I tested batch and non-batch operations and also tried modifying the Options of the LevelDB connection.
The only thing I did not try is converting the JSONArray to binary before inserting.
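
A hedged sketch of that write pattern, with illustrative names (the "articles-db" path, the sample key, and the sample payload are assumptions, not from the report): the JSONArray for each key is accumulated in a HashMap and persisted once, in a single WriteBatch, when it is complete.

```java
import org.iq80.leveldb.DB;
import org.iq80.leveldb.Options;
import org.iq80.leveldb.WriteBatch;
import org.iq80.leveldb.impl.Iq80DBFactory;
import org.json.JSONArray;
import org.json.JSONObject;

import java.io.File;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class ArticleWriter {
    public static void main(String[] args) throws Exception {
        // Accumulate the full JSONArray per key in memory first, so lookups never hit disk mid-build.
        Map<String, JSONArray> pending = new HashMap<>();
        pending.computeIfAbsent("WS_SHP_ARTICLE_kubis-10000002-1", k -> new JSONArray())
               .put(new JSONObject().put("sku", "10000002").put("qty", 1)); // illustrative payload

        Options options = new Options().createIfMissing(true);
        DB db = Iq80DBFactory.factory.open(new File("articles-db"), options);
        try (WriteBatch batch = db.createWriteBatch()) {
            // Persist each completed array once. The JSON is serialized to UTF-8 bytes here;
            // the untried "convert to binary" step would replace toString() with a more compact encoding.
            for (Map.Entry<String, JSONArray> e : pending.entrySet()) {
                batch.put(e.getKey().getBytes(StandardCharsets.UTF_8),
                          e.getValue().toString().getBytes(StandardCharsets.UTF_8));
            }
            db.write(batch);
        } finally {
            db.close();
        }
    }
}
```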

As for example code: I am way behind schedule and will be leaving my team soon, so I need to get things done and can't spend any time supplying it. Plus, by switching the lib, I solved my issue.
However, I wanted to leave you, and anybody who faces the same issue, my solution, so this can be addressed.

@dreams-money

This issue seems to be present in the Ergo blockchain's use of leveldb. It can be reproduced if anyone's interested: ergoplatform/ergo#1390
