Description
What version of Go are you using (go version)?
$ go version
go version go1.14 linux/amd64
What version of Badger are you using?
v1.6.0
Does this issue reproduce with the latest master?
As far as I know, yes.
What are the hardware specifications of the machine (RAM, OS, Disk)?
32 GB RAM
AMD Ryzen 9 3900X, 12-core / 24-thread
1 TB Samsung SSD
What did you do?
Used the default options to populate a table with about 1,000 key/value pairs, where each value is roughly 30 MB.
The badger database directory is 101GB according to du. There are 84 .vlog files.
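For reference, the population step looked roughly like the following sketch (the directory, key format, and random values are illustrative, not the real program):

package main

import (
	"crypto/rand"
	"fmt"
	"log"

	"github.com/dgraph-io/badger"
)

func main() {
	// Directory is a placeholder; the real program uses its own path.
	db, err := badger.Open(badger.DefaultOptions("/tmp/badger-repro"))
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	val := make([]byte, 30<<20) // roughly 30 MB per value
	for i := 0; i < 1000; i++ {
		if _, err := rand.Read(val); err != nil {
			log.Fatal(err)
		}
		key := []byte(fmt.Sprintf("key-%04d", i)) // key naming is illustrative
		if err := db.Update(func(txn *badger.Txn) error {
			return txn.Set(key, val)
		}); err != nil {
			log.Fatal(err)
		}
	}
}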
When I start my server up, it quickly consumes 10 GB of RAM and dies due to OOM. dmesg output:
[654397.093709] Out of memory: Killed process 15281 (taskserver) total-vm:20565228kB, anon-rss:12610116kB, file-rss:0kB, shmem-rss:0kB
What did you expect to see?
I would expect the database to provide a simple option to limit memory usage to an approximate cap.
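Something along these lines is what I had in mind (WithMemoryCap is purely hypothetical and not an existing Badger option):

// WithMemoryCap is hypothetical -- Badger does not provide this today.
db, err := badger.Open(badger.DefaultOptions(dbPath).WithMemoryCap(2 << 30)) // cap at ~2 GB
if err != nil {
	log.Fatal(err)
}
defer db.Close()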
What did you see instead?
- The recommended mechanism of tweaking a many-dimensional parameter space is confusing and hasn't worked for me.
- The memory-related parameters are not explained in much detail. For example, the docstring for options.MemoryMap doesn't indicate roughly how expensive MemoryMap is compared to FileIO.
- I haven't managed to reduce memory usage with the following parameters:
import (
	"github.com/dgraph-io/badger"
	"github.com/dgraph-io/badger/options"
)

// opts loads tables and value log entries via FileIO instead of mmap and keeps one memtable.
func opts(dbPath string) badger.Options {
	return badger.DefaultOptions(dbPath).
		WithValueLogLoadingMode(options.FileIO).
		WithTableLoadingMode(options.FileIO).
		WithNumMemtables(1)
}

I can create an example program if the issue is of interest.
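In case it is useful, this is roughly how the options above are wired in when opening the store and reading a value back (a sketch; the path and key name are placeholders):

// Open the store with the tuned options and read one of the large values.
db, err := badger.Open(opts("/path/to/db"))
if err != nil {
	log.Fatal(err)
}
defer db.Close()

err = db.View(func(txn *badger.Txn) error {
	item, err := txn.Get([]byte("key-0001"))
	if err != nil {
		return err
	}
	return item.Value(func(val []byte) error {
		log.Printf("read %d bytes", len(val))
		return nil
	})
})
if err != nil {
	log.Fatal(err)
}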