loading annotations into memory...
0:00:00.743513
creating index...
index created!
Loading and preparing results...
DONE (t=0.07s)
creating index...
index created!
tokenization...
Traceback (most recent call last):
File "/home/chenzhanghui/.pycharm_helpers/pydev/pydevd.py", line 1741, in <module>
main()
File "/home/chenzhanghui/.pycharm_helpers/pydev/pydevd.py", line 1735, in main
globals = debugger.run(setup['file'], None, None, is_module)
File "/home/chenzhanghui/.pycharm_helpers/pydev/pydevd.py", line 1135, in run
pydev_imports.execfile(file, globals, locals) # execute the script
File "/home/chenzhanghui/.pycharm_helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
exec(compile(contents+"\n", file, 'exec'), glob, loc)
File "/home/chenzhanghui/code/showEditAndTell/editnet.py", line 828, in <module>
word_map = word_map)
File "/home/chenzhanghui/code/showEditAndTell/editnet.py", line 727, in evaluate
cocoEval.evaluate()
File "coco-caption/pycocoevalcap/eval.py", line 36, in evaluate
gts = tokenizer.tokenize(gts)
File "coco-caption/pycocoevalcap/tokenizer/ptbtokenizer.py", line 54, in tokenize
stdout=subprocess.PIPE)
File "/home/chenzhanghui/anaconda3/envs/py36/lib/python3.6/subprocess.py", line 729, in __init__
restore_signals, start_new_session)
File "/home/chenzhanghui/anaconda3/envs/py36/lib/python3.6/subprocess.py", line 1295, in _execute_child
restore_signals, start_new_session, preexec_fn)
File "/home/chenzhanghui/.pycharm_helpers/pydev/_pydev_bundle/pydev_monkey.py", line 424, in new_fork_exec
return getattr(_posixsubprocess, original_name)(args, *other_args)
OSError: [Errno 12] Cannot allocate memory
When running editnet.py, this error occurs. It mostly seems to be caused by this line of code in the PTBTokenizer class:
p_tokenizer = subprocess.Popen(cmd, cwd=path_to_jar_dirname,
                               stdout=subprocess.PIPE)
But I don't know how to solve it. Can you help me with this?
When I run dcnet.py, it works normally!
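For context on why this fails: a Popen object that is never waited on keeps its pipe buffers and process-table entry alive, and a loop that tokenizes many times this way accumulates resources until fork() fails with [Errno 12] Cannot allocate memory. A minimal sketch of the safe pattern, using a placeholder command in place of the Stanford tokenizer jar (which is an assumption, not the real pycocoevalcap command):

```python
import subprocess
import sys

def tokenize_safely(text):
    """Run a helper process and guarantee it is cleaned up afterwards.
    The command here is a hypothetical stand-in (a lowercasing echo),
    not the real PTB tokenizer jar."""
    cmd = [sys.executable, "-c", "import sys; print(sys.stdin.read().lower())"]
    # Using Popen as a context manager closes the pipes and waits for
    # the child, so no zombie processes or pipe buffers accumulate
    # across repeated calls.
    with subprocess.Popen(cmd, stdin=subprocess.PIPE,
                          stdout=subprocess.PIPE, text=True) as p:
        out, _ = p.communicate(text)
    return out.strip()

print(tokenize_safely("A Man Riding A Horse"))  # a man riding a horse
```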
Hi. The METEOR metric creates a new subprocess each time it's called and does not destroy the previous one, so memory usage grows with every METEOR call. Although this is rare, it can still happen. To fix it, add self.meteor_p.kill() after line 44 in meteor.py. Hope this helps. Feel free to re-open this issue if that doesn't work.
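The fix above amounts to a simple pattern: before spawning a new scorer subprocess, kill and reap the previous one. The class below is a hypothetical stand-in for the Meteor scorer (the real one launches a Java jar and keeps richer state), sketched only to show the pattern:

```python
import subprocess
import sys

class MeteorLike:
    """Hypothetical stand-in for the Meteor scorer: it launches a helper
    subprocess per call, mimicking the leak described above."""

    def __init__(self):
        self.meteor_p = None

    def compute_score(self, data):
        # The fix: destroy the previous subprocess (if any) before
        # spawning a new one, so processes do not accumulate.
        if self.meteor_p is not None:
            self.meteor_p.kill()   # no-op if the child already exited
            self.meteor_p.wait()   # reap the child to avoid a zombie

        # Stand-in command; the real scorer runs meteor-1.5.jar instead.
        self.meteor_p = subprocess.Popen(
            [sys.executable, "-c", "print(input())"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            text=True,
        )
        out, _ = self.meteor_p.communicate(data)
        return out.strip()

scorer = MeteorLike()
for _ in range(3):
    result = scorer.compute_score("0.42")
print(result)  # 0.42
```

Calling kill() on an already-exited child is harmless (Popen.send_signal does nothing once the return code is set), so the guard is safe to run on every call.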