Bad_alloc error on RPi #5164
Comments
Maybe we should update OptimizeForSmallDb(), but if we want to limit memory usage, I think it's a good idea to set the following:
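(The specific settings from this comment were not preserved in the extract. The following is a hypothetical sketch of the kind of options typically used to cap RocksDB's memory footprint, not the maintainer's actual suggestion; all values are illustrative.)

```cpp
#include "rocksdb/cache.h"
#include "rocksdb/options.h"
#include "rocksdb/table.h"

// Hypothetical example: cap the big memory consumers explicitly.
rocksdb::Options options;

// Bound block-cache memory with a small LRU cache (8 MB here).
rocksdb::BlockBasedTableOptions table_options;
table_options.block_cache = rocksdb::NewLRUCache(8 << 20);
options.table_factory.reset(
    rocksdb::NewBlockBasedTableFactory(table_options));

// Bound memtable memory: small write buffers, and few of them.
options.write_buffer_size = 2 << 20;   // 2 MB per memtable
options.max_write_buffer_number = 2;

// Bound table-reader memory by limiting open files.
options.max_open_files = 100;
```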
I hope that makes things better. To debug where the memory is used, https://github.com/facebook/rocksdb/blob/master/include/rocksdb/utilities/memory_util.h#L20-L48 is something you can try out. It doesn't cover everything, but it covers a good portion.
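For reference, a minimal sketch of calling the linked utility (`db` is assumed to be an already-open `rocksdb::DB*`; any block caches your application holds would go into `cache_set` to be counted as well):

```cpp
#include <cinttypes>
#include <cstdio>
#include <map>
#include <unordered_set>
#include <vector>

#include "rocksdb/db.h"
#include "rocksdb/utilities/memory_util.h"

// Print approximate memory usage broken down by type for one open DB.
void ReportMemoryUsage(rocksdb::DB* db) {
  std::unordered_set<const rocksdb::Cache*> cache_set;
  std::map<rocksdb::MemoryUtil::UsageType, uint64_t> usage;
  rocksdb::Status s = rocksdb::MemoryUtil::GetApproximateMemoryUsageByType(
      {db}, cache_set, &usage);
  if (!s.ok()) return;
  std::printf("memtables (total):     %" PRIu64 "\n",
              usage[rocksdb::MemoryUtil::kMemTableTotal]);
  std::printf("memtables (unflushed): %" PRIu64 "\n",
              usage[rocksdb::MemoryUtil::kMemTableUnFlushed]);
  std::printf("table readers:         %" PRIu64 "\n",
              usage[rocksdb::MemoryUtil::kTableReadersTotal]);
  std::printf("block cache:           %" PRIu64 "\n",
              usage[rocksdb::MemoryUtil::kCacheTotal]);
}
```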
My previous comment had a typo; I've already edited it in place. It should be:
I'm also trying to change the behavior of OptimizeForSmallDb() for future releases.
Thanks for your fast response. I will try out your suggested parameters today and run for a few hours; I'll post back once the test is done.
I tried out the recommended table options, but they didn't work out; I still got the bad_alloc error in my log. Regarding "To debug where the memory is used, https://github.com/facebook/rocksdb/blob/master/include/rocksdb/utilities/memory_util.h#L20-L48 is something you can try out": do you mean I should update my code to call these methods? Why not use the statistics dump? What should I look at first in the statistics dumps?
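(As background for the statistics question: RocksDB can collect statistics and periodically dump them to the info LOG. A minimal sketch using standard options, not taken from this thread:)

```cpp
#include "rocksdb/options.h"
#include "rocksdb/statistics.h"

// Enable statistics collection and periodic dumps to the info LOG.
rocksdb::Options options;
options.statistics = rocksdb::CreateDBStatistics();
options.stats_dump_period_sec = 600;  // dump every 10 minutes (the default)
```

Memory-related figures can also be queried directly via DB::GetProperty() with properties such as "rocksdb.cur-size-all-mem-tables" and "rocksdb.estimate-table-readers-mem".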
@philipktlin a memory dump will be even better!
I am new to the RPi; do you know how to do a memory dump on an RPi?
Expected behavior
Expected to run without exceptions.
Actual behavior
Got a std::bad_alloc error.
Configuration
// Optimization functions (rocksdb/options/options.cc, Line 468 in 313e877):
```cpp
DBOptions* DBOptions::OptimizeForSmallDb() {
  max_file_opening_threads = 1;
  max_open_files = 5000;
  return this;
}

ColumnFamilyOptions* ColumnFamilyOptions::OptimizeForSmallDb() {
  write_buffer_size = 2 << 20;                          // 2 MB memtables
  target_file_size_base = 2 * 1048576;                  // 2 MB SST files
  max_bytes_for_level_base = 10 * 1048576;              // 10 MB base level
  soft_pending_compaction_bytes_limit = 256 * 1048576;  // 256 MB
  hard_pending_compaction_bytes_limit = 1073741824ul;   // 1 GB
  return this;
}
```
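(For illustration, a hypothetical sketch of applying this preset and then overriding individual fields; the path and override values are made up:)

```cpp
#include "rocksdb/db.h"
#include "rocksdb/options.h"

rocksdb::Options options;
options.create_if_missing = true;
options.OptimizeForSmallDb();  // applies the defaults shown above
options.max_open_files = 100;  // example override: each open file keeps
                               // index/filter blocks pinned in memory
rocksdb::DB* db = nullptr;
rocksdb::Status s = rocksdb::DB::Open(options, "/tmp/small_db", &db);
```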
Questions