
The program is not memory optimized. #3

Open

JafferWilson opened this issue Sep 13, 2017 · 21 comments

Comments

@JafferWilson

Hello,
I am trying to run your repository on systems with 16 GB, 32 GB, 40 GB, and 120 GB of RAM. I do not understand why the pre-processing takes so much memory. Only at 120 GB did I first see an explicit memory error; every other time the process was simply killed.

Kindly let me know what configuration you used to run this process, and please add the details of your system, as that will help me run the repository.

@JafferWilson
Author

I increased the RAM to 480 GB, and the pre-processing still gets killed.
Is it possible for you to make the pre-processed data available in the repository?

@JafferWilson
Author

JafferWilson commented Sep 21, 2017

Can you please answer my queries? It would really help. Waiting for your reply.

@fievelk

fievelk commented Oct 17, 2017

I confirm the issue.
@JafferWilson did you find a way to make it run?

@JafferWilson
Author

@fievelk Yes. I ran the code exactly the way it is shown in the README file.

@fievelk

fievelk commented Oct 18, 2017

@JafferWilson Sorry, I did not formulate my question correctly. Running the code using the instructions in the README still produces these memory issues and the process gets killed.
Did you manage to fix the problem somehow?

@JafferWilson
Author

@fievelk Well, no... I do not understand why the process takes so much memory. I have described my experiments in this issue and am still empty-handed.

@kapardine

kapardine commented Oct 31, 2017

@JafferWilson
Please use the following code in place of the `load_bin_vec(fname, vocab)` function given here. This should resolve the issue.

```python
import numpy as np
import theano

def load_bin_vec(fname, vocab):
    """
    Loads 300x1 word vecs from Google (Mikolov) word2vec,
    keeping only the vectors for words that appear in `vocab`.
    """
    word_vecs = {}
    with open(fname, "rb") as f:
        header = f.readline()
        vocab_size, layer1_size = map(int, header.split())
        binary_len = np.dtype(theano.config.floatX).itemsize * layer1_size
        for line in xrange(vocab_size):
            # Read one word character by character, up to the separating space.
            word = []
            while True:
                ch = f.read(1)
                if ch == ' ':
                    word = ''.join(word)
                    break
                if ch != '\n':
                    word.append(ch)
            if word in vocab:
                word_vecs[word] = np.fromstring(f.read(binary_len), dtype=theano.config.floatX)
            else:
                # Skip vectors for out-of-vocabulary words instead of storing them.
                f.read(binary_len)
    return word_vecs
```
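For anyone adapting this on Python 3, here is a small self-contained sketch of the same streaming idea. The file name, toy words, and values are purely illustrative, not from this repository: it writes a tiny word2vec-style binary file and reads it back vector by vector, so memory stays proportional to the vocabulary you keep rather than the full embedding matrix.

```python
import numpy as np

def write_tiny_bin(fname, vectors):
    """Write {word: list_of_floats} in the word2vec binary layout."""
    with open(fname, "wb") as f:
        dim = len(next(iter(vectors.values())))
        f.write(("%d %d\n" % (len(vectors), dim)).encode("utf-8"))
        for word, vec in vectors.items():
            f.write(word.encode("utf-8") + b" ")
            f.write(np.asarray(vec, dtype=np.float32).tobytes())

def load_bin_vec_small(fname, vocab):
    """Stream the file, keeping only vectors for words in `vocab`."""
    word_vecs = {}
    with open(fname, "rb") as f:
        vocab_size, dim = map(int, f.readline().split())
        binary_len = np.dtype(np.float32).itemsize * dim
        for _ in range(vocab_size):
            # Read one word character by character, up to the separating space.
            chars = []
            while True:
                ch = f.read(1)
                if ch == b" ":
                    break
                if ch != b"\n":
                    chars.append(ch)
            word = b"".join(chars).decode("utf-8")
            data = f.read(binary_len)  # always consume the vector bytes
            if word in vocab:
                word_vecs[word] = np.frombuffer(data, dtype=np.float32)
    return word_vecs

if __name__ == "__main__":
    write_tiny_bin("tiny_w2v.bin", {"cat": [1, 2, 3], "dog": [4, 5, 6]})
    kept = load_bin_vec_small("tiny_w2v.bin", {"cat"})
    print(sorted(kept))  # only "cat" is kept in memory
```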

@naikzinal

Dear @JafferWilson, were you able to solve the problem by using this code?

@JafferWilson
Author

@naikzinal Sure, I will. I just have other problems to solve first. I will try it as soon as I am free.

@naikzinal

"name 'load_bin_vec' is not defined"
I get that error after changing the code. Can you please help me?
Thank you

@kapardine

kapardine commented Nov 7, 2017 via email

@naikzinal
Copy link

dear,@chaisme i solved my naming error but, i still have a memory issue. and i don't get any attachment from you. if you are able to run code then can you please send me your process_data.py file. if you can please send me. and what system requirement is needed for run this code?
thank you

@kapardine

kapardine commented Nov 7, 2017

Here is the file, attached in txt format. Please convert it to a Python script. No new system requirements are needed beyond the ones already mentioned in the README.
process_data.txt

@naikzinal

Dear @chaisme, thank you for the reply. I will try it as soon as I can. Here is my email id, naikzinal69@gmail.com; you can mail me at that id.
Thank you

@JafferWilson
Author

@naikzinal Why do you want it on your email, when you can always download it from here? Or you can download it now and re-upload it on your side.

@kapardine

@naikzinal @JafferWilson I have uploaded the txt file in the comment above. Use it as a Python script.

@naikzinal

Dear @JafferWilson, I had actually changed the code but still had the memory issue; that is why I asked for the file. Now I can run my code.

@roysoumya

It initially showed the process killed, but it ran perfectly using the code from @chaisme. Thank you very much.

@jennaniven

Hi there,

I am trying to run this app and I seem to get stuck at the training phase:

```
python conv_net_train.py -static -word2vec 2
loading data... data loaded!
model architecture: CNN-static
using: word2vec vectors
[('image shape', 153, 300), ('filter shape', [(200, 1, 1, 300), (200, 1, 2, 300), (200, 1, 3, 300)]), ('hidden_units', [200, 200, 2]), ('dropout', [0.5, 0.5, 0.5]), ('batch_size', 50), ('non_static', False), ('learn_decay', 0.95), ('conv_non_linear', 'relu'), ('non_static', False), ('sqr_norm_lim', 9), ('shuffle_batch', True)]
... training
```

When I interrupt the kernel I get:

```
Traceback (most recent call last):
  File "conv_net_train.py", line 476, in <module>
    activations=[Sigmoid])
  File "conv_net_train.py", line 221, in train_conv_net
    cost_epoch = train_model(minibatch_index)
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/theano/compile/function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/theano/scan_module/scan_op.py", line 963, in rval
    r = p(n, [x[0] for x in i], o)
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/theano/scan_module/scan_op.py", line 952, in p
    self, node)
  File "theano/scan_module/scan_perform.pyx", line 397, in theano.scan_module.scan_perform.perform (/Users/jennan/.theano/compiledir_Darwin-16.7.0-x86_64-i386-64bit-i386-2.7.15-64/scan_perform/mod.cpp:4490)
  File "/anaconda3/envs/py27/lib/python2.7/site-packages/theano/scan_module/scan_op.py", line 961, in rval
    def rval(p=p, i=node_input_storage, o=node_output_storage, n=node,
KeyboardInterrupt
```

Any help would be greatly appreciated!!

@vivekraghu17

```
File "conv_net_train.py", line 147, in train_conv_net
    train_set_x = datasets[0][rand_perm]
MemoryError
```

Can someone please help? I need the solution as soon as possible.
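The MemoryError above is raised by `train_set_x = datasets[0][rand_perm]`, which materializes a complete shuffled copy of the training matrix in a single allocation. One possible workaround, sketched below on toy NumPy data (the names and shapes are illustrative, not the repository's actual training loop), is to keep only the cheap permutation index array and copy one minibatch at a time:

```python
import numpy as np

# Toy stand-in: `data` plays the role of datasets[0] (rows = training examples).
rng = np.random.RandomState(0)
data = rng.rand(10, 4).astype(np.float32)
batch_size = 5

# Cheap: rand_perm is just an index array; no copy of `data` is made here.
rand_perm = rng.permutation(data.shape[0])

# Copy one minibatch at a time instead of the whole shuffled matrix.
n_batches = data.shape[0] // batch_size
for i in range(n_batches):
    idx = rand_perm[i * batch_size:(i + 1) * batch_size]
    batch = data[idx]  # allocates batch_size rows only
    # ... feed `batch` to the training function (train_model in the traceback) ...
```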

@CyraxSector

> I am trying to run this app and I seem to get stuck at the training phase: [...]
> Any help would be greatly appreciated!!

Same here. If anyone has a recommendation regarding this, it would be highly appreciated.
