When I run ./tools/rotation_demo.py, I get the following error:
Loaded network /RRPN/data/faster_rcnn_models/vgg16_faster_rcnn.caffemodel
Memory need is 426752000
Memory need is 426752000
Memory need is 106752000
Memory need is 106752000
Memory need is 213504000
Memory need is 213504000
Memory need is 53376000
Memory need is 53376000
Memory need is 106752000
F1010 11:27:03.680461 8528 syncedmem.cpp:57] Check failed: error == cudaSuccess (2 vs. 0) out of memory
*** Check failure stack trace: ***
Aborted (core dumped)
I have tried every available solution I could find, but none of them resolved this. I am compiling with cuDNN 7 and CUDA 9.0; I downgraded both, and the problem persisted. I am using a GT 710 with 2 GB of memory. Can anyone help me here? I am not sure whether this is a bug or genuinely a hardware limitation, so before I go and buy a new GPU I would appreciate your help.
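As a rough sanity check on the hardware-limitation question, the "Memory need" lines in the log can simply be summed. This is only a sketch: it assumes those figures are byte counts for blob allocations (which matches Caffe's `syncedmem` behavior) and ignores additional workspace that cuDNN requests on top of them.

```python
# Sum the "Memory need" allocations Caffe logged before the crash.
# Values are copied verbatim from the log above (assumed to be bytes).
allocations = [
    426752000, 426752000,
    106752000, 106752000,
    213504000, 213504000,
    53376000, 53376000,
    106752000,
]

total_bytes = sum(allocations)
print("Logged allocations: %.2f GiB" % (total_bytes / 2.0**30))  # ~1.59 GiB

# A GT 710 advertises 2 GB, but the display/driver typically reserves
# a few hundred MB, so the usable amount is well under 2 GiB.
card_bytes = 2 * 2**30
print("Card capacity:      %.2f GiB" % (card_bytes / 2.0**30))
```

The logged blobs alone come to roughly 1.59 GiB before any cuDNN workspace or activation peaks, so running out of memory on a 2 GB card is plausible even without a bug.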
@mjq11302010044
@idefix92