Performance on large datasets #27
Tony:
How does BuntDB perform on data too large to fit into RAM? Would I get good performance with a dataset of over 5 GB (a MongoDB dump) on a server with 500 MB or 1 GB of RAM?

I'm currently evaluating BuntDB against BoltDB as a replacement for a system that currently runs on MongoDB. I intend to use Bleve for indexing and search as well. Can I query with an array of keys, or do I get each key from the array individually?

Josh:
Hi Tony,

Assuming the server has virtual memory, it will start paging to disk once all the RAM is used up. This can cause the Go app to run very slowly, between 10-100x slower than reading directly from RAM, depending on disk speed. If the server does not have virtual memory, the Go program will panic with an out-of-memory error. I don't recommend doing this. If memory is a concern, use an on-disk database like BoltDB or LevelDB. For BuntDB, to be safe, the server should have at least 1/4 more memory than the working in-memory dataset. I suggest loading the data into a test BuntDB program and seeing how much memory it uses.

You'll need to query each key individually.

Tony:
Thanks Josh. You helped me clear my mind about what to use, and about when BuntDB is a good choice.

Josh:
You're welcome, and best of luck.
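The "query each key individually" pattern can be sketched as a simple loop. This is a minimal sketch: the map-backed `store` type and its `get` method are hypothetical stand-ins for a `*buntdb.DB` so the snippet stays self-contained; with real BuntDB the loop body would call `tx.Get(key)` inside a single `db.View` transaction.

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound mirrors the idea of buntdb.ErrNotFound for this sketch.
var errNotFound = errors.New("not found")

// store is a hypothetical in-memory stand-in for a *buntdb.DB.
type store map[string]string

func (s store) get(key string) (string, error) {
	v, ok := s[key]
	if !ok {
		return "", errNotFound
	}
	return v, nil
}

// getMany fetches each key individually, skipping keys that are missing.
// With real BuntDB, this loop would live inside one db.View transaction
// and each lookup would be tx.Get(key).
func getMany(s store, keys []string) map[string]string {
	out := make(map[string]string, len(keys))
	for _, k := range keys {
		v, err := s.get(k)
		if err != nil {
			continue // key not present; skip it
		}
		out[k] = v
	}
	return out
}

func main() {
	s := store{"user:1": "tony", "user:2": "josh"}
	found := getMany(s, []string{"user:1", "user:2", "user:3"})
	fmt.Println(len(found)) // prints 2 (user:3 is missing)
}
```

Batching the lookups into one read transaction keeps the per-key cost low, since each `tx.Get` is an in-memory tree lookup rather than a disk read.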
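Josh's suggestion to load the data into a test program and measure its memory use can be sketched as below. The plain map is a hypothetical stand-in for a BuntDB instance; the transferable part is the measuring technique, taking `runtime.ReadMemStats` readings before and after loading, with a GC in between so the delta reflects live data rather than garbage.

```go
package main

import (
	"fmt"
	"runtime"
)

// heapDelta reports roughly how many heap bytes the data built by load()
// occupies. It forces a GC before each reading so transient allocations
// don't inflate the number.
func heapDelta(load func() interface{}) uint64 {
	var before, after runtime.MemStats
	runtime.GC()
	runtime.ReadMemStats(&before)
	data := load()
	runtime.GC()
	runtime.ReadMemStats(&after)
	runtime.KeepAlive(data) // keep data live through the second reading
	return after.HeapAlloc - before.HeapAlloc
}

func main() {
	const n = 100000
	delta := heapDelta(func() interface{} {
		// Hypothetical stand-in for loading keys into a BuntDB instance.
		m := make(map[string]string, n)
		for i := 0; i < n; i++ {
			m[fmt.Sprintf("key:%d", i)] = fmt.Sprintf("value-%d", i)
		}
		return m
	})
	fmt.Printf("loaded %d keys, heap grew by ~%d KiB\n", n, delta/1024)
}
```

Running the same measurement with the real dataset loaded into BuntDB gives the working-set figure to compare against the server's RAM, plus the 1/4 headroom Josh recommends.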