
MemoryError #2

Open
teknogeek opened this issue Nov 8, 2015 · 5 comments

Comments

@teknogeek

I seem to be getting a MemoryError when trying to generate a story, and I'm not quite sure how to fix it.

Traceback (most recent call last):
  File "D:/Users/teknogeek/Documents/Code/NeuralStoryteller/test.py", line 3, in <module>
    z = generate.load_all()
  File "D:\Users\teknogeek\Documents\Code\NeuralStoryteller\generate.py", line 105, in load_all
    stv = skipthoughts.load_model(path_to_skmodels, path_to_sktables)
  File "D:\Users\teknogeek\Documents\Code\NeuralStoryteller\skipthoughts.py", line 63, in load_model
    utable, btable = load_tables(path_to_tables)
  File "D:\Users\teknogeek\Documents\Code\NeuralStoryteller\skipthoughts.py", line 81, in load_tables
    utable = numpy.load(path_to_tables + 'utable.npy')
  File "D:\Python27\lib\site-packages\numpy\lib\npyio.py", line 406, in load
    pickle_kwargs=pickle_kwargs)
  File "D:\Python27\lib\site-packages\numpy\lib\format.py", line 638, in read_array
    array = pickle.load(fp, **pickle_kwargs)
MemoryError

The code I am using that triggers this is:

import generate, os

z = generate.load_all()  # loads all models, including the large skip-thoughts tables
generate.story(z, os.path.join("images", "ex1.jpg"))  # generate a story for one image

Any help?

@ryankiros
Owner

How much memory do you have available?

This model requires loading two large .npy files (utable and btable), both of which are > 2GB. They consist of almost 1 million word embeddings (each of which is a 620-dimensional, single-precision float vector) that are part of the skip-thoughts model.
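
A rough back-of-the-envelope check of that figure (the ~1M vocabulary size is approximate):

# ~1M words x 620 dims x 4 bytes per float32 value, per table
print(1000000 * 620 * 4 / 1e9)   # ~2.48 GB each, before any pickle overhead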

Given that these embeddings are only used for MS COCO training sentences, it's possible I can create a reduced vocabulary version, where only ~30K words are necessary instead of 1M. I'll look into this.
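
A minimal sketch of that idea (the function name is hypothetical; it assumes the table loads as a 2-D array whose rows line up with the words of the skip-thoughts dictionary file):

import numpy

def shrink_table(table_path, words, vocab, out_path):
    # Load the full multi-GB table once, keep only rows whose word
    # is in the reduced vocabulary (vocab should be a set), and save.
    table = numpy.load(table_path)
    keep = [i for i, w in enumerate(words) if w in vocab]
    numpy.save(out_path, table[keep])
    return [words[i] for i in keep]   # the reduced word list, in the same order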

@teknogeek
Author

I will test this now. Currently I am starting the test with 8 GB of RAM free.

@teknogeek
Author

It still failed with a MemoryError after only getting to 9.4 GB used of 16 GB.

@ryankiros
Owner

I haven't done anything different yet. I'll update once I have, probably next week (I've got a paper deadline this week).

@curryli

curryli commented Dec 28, 2015

Hi ryankiros, we have also hit the memory error:

In [2]: z = generate.load_all()
/data/romance.npz
Loading skip-thoughts...
/usr/lib/python2.7/site-packages/theano/scan_module/scan.py:1019: Warning: In the strict mode, all neccessary shared variables must be passed as a part of non_sequences
'must be passed as a part of non_sequences', Warning)
Loading decoder...
Problem occurred during compilation with the command line below:
/usr/bin/g++ -shared -g -O3 -fno-math-errno -Wno-unused-label -Wno-unused-variable -Wno-write-strings -Wl,-rpath,/usr/lib64 -march=nocona -mcx16 -msahf -mno-movbe -maes -mno-pclmul -mpopcnt -mno-abm -mno-lwp -mno-fma -mno-fma4 -mno-xop -mno-bmi -mno-bmi2 -mno-tbm -mno-avx -mno-avx2 -msse4.2 -msse4.1 -mno-lzcnt -mno-rtm -mno-hle -mno-rdrnd -mno-f16c -mno-fsgsbase -mno-rdseed -mno-prfchw -mno-adx -mfxsr -mno-xsave -mno-xsaveopt --param l1-cache-size=32 --param l1-cache-line-size=64 --param l2-cache-size=20480 -mtune=nocona -D NPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION -m64 -fPIC -I/usr/lib64/python2.7/site-packages/numpy/core/include -I/usr/include/python2.7 -I/usr/lib/python2.7/site-packages/theano/gof -fvisibility=hidden -o /root/.theano/compiledir_Linux-3.10-el7.x86_64-x86_64-with-centos-7.0.1406-Core-x86_64-2.7.5-64/tmpSPHNG7/de61f4c333d06430fc7dc014bf04eb25.so /root/.theano/compiledir_Linux-3.10-el7.x86_64-x86_64-with-centos-7.0.1406-Core-x86_64-2.7.5-64/tmpSPHNG7/mod.cpp -L/usr/lib64 -L/usr/lib64 -lpython2.7 -lopenblas
ERROR (theano.gof.cmodule): [Errno 12] Cannot allocate memory

We are running on a machine with 8 GB, but it failed.
I'm wondering: how much memory do we need?

Thanks.
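
For what it's worth, a quick way to check how much RAM is actually free right before calling load_all() (psutil is an assumption here, not something this repo uses):

import psutil

# Available physical memory in GB; the two tables alone need roughly 5 GB.
print(psutil.virtual_memory().available / 1e9)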
