
Encountered "out of memory" when running script_rpn_bf_pedestrian_VGG16_caltech_demo #33

Open
TaoZheng opened this issue Jan 8, 2017 · 1 comment

TaoZheng commented Jan 8, 2017

Hi,
Thank you for sharing your work.
When I run script_rpn_bf_pedestrian_VGG16_caltech_demo to see the detection results, I encounter the following problem. (Note: the build failed to use cuDNN, so I modified Makefile.config to disable cuDNN before compiling.)
I0108 18:23:22.370462 4073 net.cpp:90] Creating Layer roi_pool4_3_flat
I0108 18:23:22.370471 4073 net.cpp:420] roi_pool4_3_flat <- roi_pool4_3
I0108 18:23:22.370479 4073 net.cpp:378] roi_pool4_3_flat -> roi_pool4_3_flat
I0108 18:23:22.370491 4073 net.cpp:120] Setting up roi_pool4_3_flat
I0108 18:23:22.370501 4073 net.cpp:127] Top shape: 1 25088 (25088)
I0108 18:23:22.370507 4073 layer_factory.hpp:74] Creating layer concat_feat
I0108 18:23:22.370517 4073 net.cpp:90] Creating Layer concat_feat
I0108 18:23:22.370523 4073 net.cpp:420] concat_feat <- roi_pool3_flat
I0108 18:23:22.370532 4073 net.cpp:420] concat_feat <- roi_pool4_3_flat
I0108 18:23:22.370542 4073 net.cpp:378] concat_feat -> concat_feat
I0108 18:23:22.370550 4073 net.cpp:120] Setting up concat_feat
I0108 18:23:22.370561 4073 net.cpp:127] Top shape: 1 50176 (50176)
I0108 18:23:22.370568 4073 net.cpp:194] concat_feat does not need backward computation.
I0108 18:23:22.370576 4073 net.cpp:194] roi_pool4_3_flat does not need backward computation.
I0108 18:23:22.370584 4073 net.cpp:194] roi_pool3_flat does not need backward computation.
I0108 18:23:22.370590 4073 net.cpp:194] roi_pool4_3 does not need backward computation.
I0108 18:23:22.370597 4073 net.cpp:194] roi_pool3 does not need backward computation.
I0108 18:23:22.370605 4073 net.cpp:194] rois_input_2_split does not need backward computation.
I0108 18:23:22.370612 4073 net.cpp:235] This network produces output concat_feat
I0108 18:23:22.370625 4073 net.cpp:492] Collecting Learning Rate and Weight Decay.
I0108 18:23:22.370632 4073 net.cpp:247] Network initialization done.
I0108 18:23:22.370640 4073 net.cpp:248] Memory required for data: 602152
F0108 18:23:27.069090 4073 syncedmem.cpp:51] Check failed: error == cudaSuccess (2 vs. 0) out of memory
*** Check failure stack trace: ***
Killed

Is this caused by disabling cuDNN? How should I fix this problem? Any suggestions would be greatly appreciated. Thank you very much.
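
For reference, the cuDNN-disable step mentioned above normally amounts to leaving the cuDNN switch commented out in Caffe's Makefile.config before running make; a minimal excerpt is shown below (the exact file layout may differ in this fork's Caffe branch):

```
# cuDNN acceleration switch (leave commented to build without cuDNN).
# USE_CUDNN := 1
```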

@zhangliliang (Owner) commented:

The VGG model needs a lot of GPU memory, so you might need a Titan X or K40 when you disable cuDNN.
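
For anyone hitting the same error, a generic way to check how much GPU memory is free from MATLAB before launching the demo (a plain Parallel Computing Toolbox check, not part of this repository) is:

```matlab
% Query the current GPU; the VGG16 RPN+BF demo needs several GB free.
% If AvailableMemory is low, close other GPU processes or use a
% larger-memory card such as a Titan X or K40.
d = gpuDevice();
fprintf('GPU: %s\n', d.Name);
fprintf('Total memory:     %.2f GB\n', d.TotalMemory / 2^30);
fprintf('Available memory: %.2f GB\n', d.AvailableMemory / 2^30);
```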
