
Memory leak in get_item_vector() ? #124

Closed
Jeffrey04 opened this issue Dec 30, 2015 · 7 comments
@Jeffrey04

I don't really know how to diagnose a memory leak, but I do notice that memory usage keeps increasing when calling get_item_vector repeatedly.

#!/usr/bin/env python3

from annoy import AnnoyIndex
from random import random
from numpy import array

index = AnnoyIndex(500)
collection_count = 1000000

for i in range(collection_count):
    if (i + 1) % 10000 == 0:
        print('Inserting vector #{}'.format(i + 1))

    index.add_item(i, array([random() for _ in range(500)]))

print('building index')
index.build(1)


for i in range(collection_count):
    print('{}: {}'.format(i, len(index.get_item_vector(i))))

However, the memory problem goes away if I call gc.collect() after each call to get_item_vector. Does that help?

My installation of annoy

$ pip show annoy

---
Name: annoy
Version: 1.6.2
Location: /home/jeffrey04/machine-learning/lib/python3.4/site-packages
Requires:

Using python 3.4 on Ubuntu 14.04 64bit

$ python --version
Python 3.4.3
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 14.04.3 LTS
Release:        14.04
Codename:       trusty
$ uname -a
Linux gideon 4.2.0-19-generic #23~14.04.1-Ubuntu SMP Thu Nov 12 12:33:30 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
@erikbern
Collaborator

does it eventually run out of memory and crash? i'm guessing it's just on the python side that the interpreter allocates memory. maybe it doesn't collect it until needed

@Jeffrey04
Author

@erikbern I did not really wait until it crashed in my actual application, as I was running it in a shared server. But the memory usage was maxed out and the server started swapping heavily.

@erikbern
Collaborator

I think I've been able to repro it using this:

import random
from unittest import TestCase

from annoy import AnnoyIndex

class MemoryLeakTest(TestCase):
    def test_get_item_vector(self):
        f = 10
        i = AnnoyIndex(f, 'euclidean')
        i.add_item(0, [random.gauss(0, 1) for x in range(f)])
        for j in range(100 * 1000 * 1000):
            i.get_item_vector(0)
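For a repro loop like the one above, a cheap cross-check is to poll the process's resident set size between batches of calls with the standard library's resource module (Unix only); churn() below is a hypothetical stand-in for the i.get_item_vector(0) call so the sketch runs without annoy installed:

```python
import resource

def rss_kb():
    # Maximum resident set size so far; on Linux ru_maxrss is in KB.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

def churn():
    # Hypothetical stand-in for i.get_item_vector(0): allocates a
    # fresh list that should be freed once it goes out of scope.
    return [0.0] * 10

baseline = rss_kb()
for step in range(5):
    for _ in range(200_000):
        churn()
    print('after batch', step, 'RSS grew by', rss_kb() - baseline, 'KB')
```

With a leak, the reported growth climbs with every batch; with the fix it should level off.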

@erikbern
Collaborator

great if you can take a look again

@Jeffrey04
Author

I will check again next week, thanks for the quick response

and HAPPY NEW YEAR #offtopic

@Jeffrey04
Author

oh, it is fixed, thanks (and sorry for taking this long to confirm)

@erikbern
Collaborator

np (just published 1.7.0 to pypi with this fix in it)
