
Provide simple option for limiting total memory usage #1268

@gonzojive

Description


What version of Go are you using (go version)?

$ go version
go version go1.14 linux/amd64

What version of Badger are you using?

v1.6.0

Does this issue reproduce with the latest master?

As far as I know, yes.

What are the hardware specifications of the machine (RAM, OS, Disk)?

32 GB RAM
AMD Ryzen 9 3900X 12-core, 24-Thread
1 TB Samsung SSD

What did you do?

Used the default options to populate a table with about 1000 key/value pairs, where each value is roughly 30 MB.

The Badger database directory is 101 GB according to du, and contains 84 .vlog files.

When I start my server up, it quickly consumes 10 GB of RAM and is killed by the OOM killer. dmesg output:

[654397.093709] Out of memory: Killed process 15281 (taskserver) total-vm:20565228kB, anon-rss:12610116kB, file-rss:0kB, shmem-rss:0kB
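For reference, memory growth before the OOM killer fires can be tracked from inside the process with the Go standard library alone. This is a minimal diagnostic sketch, not part of Badger; the interval and names are my own:

```go
package main

import (
	"fmt"
	"runtime"
	"time"
)

// bytesToMiB converts a byte count to whole mebibytes.
func bytesToMiB(b uint64) uint64 { return b / (1 << 20) }

// logMemStats prints heap statistics every interval until stop is closed.
func logMemStats(interval time.Duration, stop <-chan struct{}) {
	t := time.NewTicker(interval)
	defer t.Stop()
	var m runtime.MemStats
	for {
		select {
		case <-stop:
			return
		case <-t.C:
			runtime.ReadMemStats(&m)
			fmt.Printf("heap=%d MiB, sys=%d MiB\n",
				bytesToMiB(m.HeapAlloc), bytesToMiB(m.Sys))
		}
	}
}

func main() {
	stop := make(chan struct{})
	go logMemStats(100*time.Millisecond, stop)
	time.Sleep(250 * time.Millisecond) // stand-in for real server work
	close(stop)
}
```

Watching these numbers while Badger opens the value log makes it easier to tell whether the growth comes from mmap-ed files (visible in Sys) or from heap allocations (visible in HeapAlloc).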

What did you expect to see?

I would expect the database to provide a simple option to limit memory usage to an approximate cap.
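To illustrate the kind of option I have in mind (hypothetical sketch only; WithMaxMemoryUsage does not exist in Badger):

```go
// Hypothetical API, not real Badger code: cap the total
// memory footprint at roughly 2 GiB and let Badger pick
// internal parameters to stay under it.
db, err := badger.Open(
	badger.DefaultOptions(dbPath).
		WithMaxMemoryUsage(2 << 30)) // hypothetical option
```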

What did you see instead?

  1. The recommended mechanism of tweaking a many-dimensional parameter space is confusing and hasn't worked for me.

  2. The memory-related parameters are not explained in much detail. For example, the docstring for options.MemoryMap doesn't indicate roughly how expensive MemoryMap is versus FileIO.

  3. I haven't managed to reduce memory usage with the following parameters:

import (
	badger "github.com/dgraph-io/badger"
	"github.com/dgraph-io/badger/options"
)

func opts(dbPath string) badger.Options {
	return badger.DefaultOptions(dbPath).
		WithValueLogLoadingMode(options.FileIO). // avoid mmap-ing value log files
		WithTableLoadingMode(options.FileIO).    // avoid mmap-ing SSTables
		WithNumMemtables(1)                      // keep only one memtable in RAM
}

I can create an example program if the issue is of interest.
