This repository has been archived by the owner on Apr 2, 2020. It is now read-only.

hashTable allocating tons of memory #21

Open
manishrjain opened this issue May 17, 2017 · 1 comment

@manishrjain

$ go tool pprof populate /tmp/profile764028996/mem.pprof
Entering interactive mode (type "help" for commands)
(pprof) list lz4.Encode
Total: 3.90GB
ROUTINE ======================== github.com/bkaradzic/go-lz4.Encode in /home/ubuntu/go/src/github.com/bkaradzic/go-lz4/writer.go
    3.70GB     3.70GB (flat, cum) 94.89% of Total
         .          .    107:   if len(src) >= MaxInputSize {
         .          .    108:           return nil, ErrTooLarge
         .          .    109:   }
         .          .    110:
         .          .    111:   if n := CompressBound(len(src)); len(dst) < n {
    8.52MB     8.52MB    112:           dst = make([]byte, n)
         .          .    113:   }
         .          .    114:
    3.69GB     3.69GB    115:   e := encoder{src: src, dst: dst, hashTable: make([]uint32, hashTableSize)}
         .          .    116:
         .          .    117:   binary.LittleEndian.PutUint32(dst, uint32(len(src)))
         .          .    118:   e.dpos = 4
         .          .    119:
         .          .    120:   var (

This line is causing Badger to OOM when it loads data very fast. Ideally, the same hashTable would be reused across calls; that can be done with sync.Pool (a sketch of the idea follows). Happy to send a PR if that'd help.
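
A minimal sketch of the idea, not go-lz4's actual API: hashTablePool, getHashTable, and putHashTable are hypothetical helpers, and the hashTableSize value below is assumed for illustration. The point is only that steady-state encoding stops allocating a fresh table on every call.

package lz4pool

import "sync"

// hashTableSize mirrors the encoder's constant; the exact value here is an
// assumption for this sketch.
const hashTableSize = 1 << 16

// hashTablePool hands out reusable hash tables instead of allocating a new
// slice on every Encode call.
var hashTablePool = sync.Pool{
	New: func() interface{} {
		return make([]uint32, hashTableSize)
	},
}

// getHashTable borrows a table from the pool and clears it, since entries
// left over from a previous input would otherwise be read as valid offsets.
func getHashTable() []uint32 {
	ht := hashTablePool.Get().([]uint32)
	for i := range ht {
		ht[i] = 0
	}
	return ht
}

// putHashTable returns a table to the pool once Encode is done with it.
func putHashTable(ht []uint32) {
	hashTablePool.Put(ht)
}

With something like this in place, the make([]uint32, hashTableSize) on line 115 would become a getHashTable()/putHashTable pair (e.g. with a deferred put); clearing on reuse matters because the encoder interprets the table's entries as source offsets.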

@manishrjain changed the title from "hashTable consuming shit ton of memory" to "hashTable allocating tons of memory" on May 17, 2017
@dgryski
Collaborator

dgryski commented May 17, 2017

I agree a sync.Pool will help here. Please send a PR.
