Check failed: error == cudaSuccess (18 vs. 0) invalid texture reference #34

Closed
dongb5 opened this issue Jan 11, 2017 · 2 comments

dongb5 commented Jan 11, 2017

I0111 12:25:53.896850 4324 net.cpp:235] This network produces output proposal_cls_prob
I0111 12:25:53.896872 4324 net.cpp:492] Collecting Learning Rate and Weight Decay.
I0111 12:25:53.896881 4324 net.cpp:247] Network initialization done.
I0111 12:25:53.896886 4324 net.cpp:248] Memory required for data: 162768800
I0111 12:25:53.986716 4324 net.cpp:746] Copying source layer conv1_1
I0111 12:25:53.986763 4324 net.cpp:746] Copying source layer relu1_1
I0111 12:25:53.986780 4324 net.cpp:746] Copying source layer conv1_2
I0111 12:25:53.986812 4324 net.cpp:746] Copying source layer relu1_2
I0111 12:25:53.986820 4324 net.cpp:746] Copying source layer pool1
I0111 12:25:53.986826 4324 net.cpp:746] Copying source layer conv2_1
I0111 12:25:53.986872 4324 net.cpp:746] Copying source layer relu2_1
I0111 12:25:53.986881 4324 net.cpp:746] Copying source layer conv2_2
I0111 12:25:53.986966 4324 net.cpp:746] Copying source layer relu2_2
I0111 12:25:53.986974 4324 net.cpp:746] Copying source layer pool2
I0111 12:25:53.986979 4324 net.cpp:746] Copying source layer conv3_1
I0111 12:25:53.987136 4324 net.cpp:746] Copying source layer relu3_1
I0111 12:25:53.987143 4324 net.cpp:746] Copying source layer conv3_2
I0111 12:25:53.987453 4324 net.cpp:746] Copying source layer relu3_2
I0111 12:25:53.987463 4324 net.cpp:746] Copying source layer conv3_3
I0111 12:25:53.987776 4324 net.cpp:746] Copying source layer relu3_3
I0111 12:25:53.987784 4324 net.cpp:746] Copying source layer pool3
I0111 12:25:53.987789 4324 net.cpp:746] Copying source layer conv4_1
I0111 12:25:53.988402 4324 net.cpp:746] Copying source layer relu4_1
I0111 12:25:53.988414 4324 net.cpp:746] Copying source layer conv4_2
I0111 12:25:53.989629 4324 net.cpp:746] Copying source layer relu4_2
I0111 12:25:53.989645 4324 net.cpp:746] Copying source layer conv4_3
I0111 12:25:53.990911 4324 net.cpp:746] Copying source layer relu4_3
I0111 12:25:53.990929 4324 net.cpp:746] Copying source layer pool4
I0111 12:25:53.990934 4324 net.cpp:746] Copying source layer conv5_1
I0111 12:25:53.992154 4324 net.cpp:746] Copying source layer relu5_1
I0111 12:25:53.992173 4324 net.cpp:746] Copying source layer conv5_2
I0111 12:25:53.993454 4324 net.cpp:746] Copying source layer relu5_2
I0111 12:25:53.993474 4324 net.cpp:746] Copying source layer conv5_3
I0111 12:25:53.994707 4324 net.cpp:746] Copying source layer relu5_3
I0111 12:25:53.994727 4324 net.cpp:746] Copying source layer conv_proposal1
I0111 12:25:53.995934 4324 net.cpp:746] Copying source layer relu_proposal1
I0111 12:25:53.995954 4324 net.cpp:746] Copying source layer conv_proposal1_relu_proposal1_0_split
I0111 12:25:53.995959 4324 net.cpp:746] Copying source layer proposal_cls_score
I0111 12:25:53.995973 4324 net.cpp:746] Copying source layer proposal_bbox_pred
I0111 12:25:53.995995 4324 net.cpp:746] Copying source layer proposal_cls_score_reshape
I0111 12:25:53.996001 4324 net.cpp:743] Ignoring source layer proposal_cls_score_reshape_proposal_cls_score_reshape_0_split
I0111 12:25:53.996007 4324 net.cpp:743] Ignoring source layer labels_reshape
I0111 12:25:53.996012 4324 net.cpp:743] Ignoring source layer labels_reshape_labels_reshape_0_split
I0111 12:25:53.996017 4324 net.cpp:743] Ignoring source layer labels_weights_reshape
I0111 12:25:53.996022 4324 net.cpp:743] Ignoring source layer loss
I0111 12:25:53.996027 4324 net.cpp:743] Ignoring source layer accuarcy
I0111 12:25:53.996031 4324 net.cpp:743] Ignoring source layer loss_bbox
I0111 12:25:53.997133 4324 net.cpp:42] Initializing net from parameters:
name: "VGG_ILSVRC_16"
input: "data1"
input: "data2"
input: "rois"
input_dim: 1
input_dim: 512
input_dim: 50
input_dim: 50
input_dim: 1
input_dim: 512
input_dim: 50
input_dim: 50
input_dim: 1
input_dim: 5
input_dim: 1
input_dim: 1
state {
phase: TEST
}
layer {
name: "roi_pool3"
type: "ROIPooling"
bottom: "data1"
bottom: "rois"
top: "roi_pool3"
roi_pooling_param {
pooled_h: 7
pooled_w: 7
spatial_scale: 0.25
}
}
layer {
name: "roi_pool4_3"
type: "ROIPooling"
bottom: "data2"
bottom: "rois"
top: "roi_pool4_3"
roi_pooling_param {
pooled_h: 7
pooled_w: 7
spatial_scale: 0.25
}
}
layer {
name: "roi_pool3_flat"
type: "Flatten"
bottom: "roi_pool3"
top: "roi_pool3_flat"
}
layer {
name: "roi_pool4_3_flat"
type: "Flatten"
bottom: "roi_pool4_3"
top: "roi_pool4_3_flat"
}
layer {
name: "concat_feat"
type: "Concat"
bottom: "roi_pool3_flat"
bottom: "roi_pool4_3_flat"
top: "concat_feat"
}
I0111 12:25:53.997277 4324 net.cpp:380] Input 0 -> data1
I0111 12:25:53.997287 4324 net.cpp:380] Input 1 -> data2
I0111 12:25:53.997294 4324 net.cpp:380] Input 2 -> rois
I0111 12:25:53.997303 4324 layer_factory.hpp:74] Creating layer rois_input_2_split
I0111 12:25:53.997311 4324 net.cpp:90] Creating Layer rois_input_2_split
I0111 12:25:53.997316 4324 net.cpp:420] rois_input_2_split <- rois
I0111 12:25:53.997323 4324 net.cpp:378] rois_input_2_split -> rois_input_2_split_0
I0111 12:25:53.997334 4324 net.cpp:378] rois_input_2_split -> rois_input_2_split_1
I0111 12:25:53.997344 4324 net.cpp:120] Setting up rois_input_2_split
I0111 12:25:53.997352 4324 net.cpp:127] Top shape: 1 5 1 1 (5)
I0111 12:25:53.997359 4324 net.cpp:127] Top shape: 1 5 1 1 (5)
I0111 12:25:53.997364 4324 layer_factory.hpp:74] Creating layer roi_pool3
I0111 12:25:53.997371 4324 net.cpp:90] Creating Layer roi_pool3
I0111 12:25:53.997377 4324 net.cpp:420] roi_pool3 <- data1
I0111 12:25:53.997383 4324 net.cpp:420] roi_pool3 <- rois_input_2_split_0
I0111 12:25:53.997390 4324 net.cpp:378] roi_pool3 -> roi_pool3
I0111 12:25:53.997397 4324 net.cpp:120] Setting up roi_pool3
I0111 12:25:53.997406 4324 roi_pooling_layer.cpp:44] Spatial scale: 0.25
I0111 12:25:53.997417 4324 net.cpp:127] Top shape: 1 512 7 7 (25088)
I0111 12:25:53.997423 4324 layer_factory.hpp:74] Creating layer roi_pool4_3
I0111 12:25:53.997429 4324 net.cpp:90] Creating Layer roi_pool4_3
I0111 12:25:53.997434 4324 net.cpp:420] roi_pool4_3 <- data2
I0111 12:25:53.997440 4324 net.cpp:420] roi_pool4_3 <- rois_input_2_split_1
I0111 12:25:53.997447 4324 net.cpp:378] roi_pool4_3 -> roi_pool4_3
I0111 12:25:53.997453 4324 net.cpp:120] Setting up roi_pool4_3
I0111 12:25:53.997459 4324 roi_pooling_layer.cpp:44] Spatial scale: 0.25
I0111 12:25:53.997467 4324 net.cpp:127] Top shape: 1 512 7 7 (25088)
I0111 12:25:53.997473 4324 layer_factory.hpp:74] Creating layer roi_pool3_flat
I0111 12:25:53.997480 4324 net.cpp:90] Creating Layer roi_pool3_flat
I0111 12:25:53.997486 4324 net.cpp:420] roi_pool3_flat <- roi_pool3
I0111 12:25:53.997493 4324 net.cpp:378] roi_pool3_flat -> roi_pool3_flat
I0111 12:25:53.997499 4324 net.cpp:120] Setting up roi_pool3_flat
I0111 12:25:53.997509 4324 net.cpp:127] Top shape: 1 25088 (25088)
I0111 12:25:53.997514 4324 layer_factory.hpp:74] Creating layer roi_pool4_3_flat
I0111 12:25:53.997521 4324 net.cpp:90] Creating Layer roi_pool4_3_flat
I0111 12:25:53.997527 4324 net.cpp:420] roi_pool4_3_flat <- roi_pool4_3
I0111 12:25:53.997534 4324 net.cpp:378] roi_pool4_3_flat -> roi_pool4_3_flat
I0111 12:25:53.997541 4324 net.cpp:120] Setting up roi_pool4_3_flat
I0111 12:25:53.997550 4324 net.cpp:127] Top shape: 1 25088 (25088)
I0111 12:25:53.997555 4324 layer_factory.hpp:74] Creating layer concat_feat
I0111 12:25:53.997562 4324 net.cpp:90] Creating Layer concat_feat
I0111 12:25:53.997568 4324 net.cpp:420] concat_feat <- roi_pool3_flat
I0111 12:25:53.997573 4324 net.cpp:420] concat_feat <- roi_pool4_3_flat
I0111 12:25:53.997581 4324 net.cpp:378] concat_feat -> concat_feat
I0111 12:25:53.997587 4324 net.cpp:120] Setting up concat_feat
I0111 12:25:53.997596 4324 net.cpp:127] Top shape: 1 50176 (50176)
I0111 12:25:53.997601 4324 net.cpp:194] concat_feat does not need backward computation.
I0111 12:25:53.997606 4324 net.cpp:194] roi_pool4_3_flat does not need backward computation.
I0111 12:25:53.997612 4324 net.cpp:194] roi_pool3_flat does not need backward computation.
I0111 12:25:53.997617 4324 net.cpp:194] roi_pool4_3 does not need backward computation.
I0111 12:25:53.997623 4324 net.cpp:194] roi_pool3 does not need backward computation.
I0111 12:25:53.997629 4324 net.cpp:194] rois_input_2_split does not need backward computation.
I0111 12:25:53.997635 4324 net.cpp:235] This network produces output concat_feat
I0111 12:25:53.997644 4324 net.cpp:492] Collecting Learning Rate and Weight Decay.
I0111 12:25:53.997650 4324 net.cpp:247] Network initialization done.
I0111 12:25:53.997654 4324 net.cpp:248] Memory required for data: 602152
F0111 12:25:55.269489 4324 relu_layer.cu:27] Check failed: error == cudaSuccess (18 vs. 0) invalid texture reference
*** Check failure stack trace: ***
Killed

When I tried to run 'script_rpn_bf_pedestrian_VGG16_caltech_demo.m', I got the error above. I compiled the MS version of Caffe and copied the mex file to the path. I am using a Titan X GPU with MATLAB 2013b. Any idea what causes this error? Thanks.
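For reference, the `(18 vs. 0)` in the check failure comes from comparing the `cudaError_t` returned by the runtime against `cudaSuccess` (0); in the legacy CUDA runtime enum, code 18 is `cudaErrorInvalidTexture`, which matches the "invalid texture reference" message in the log. A small illustrative decoder (the table below is a hand-written subset of the legacy enum, not an NVIDIA API):

```python
# Hand-written subset of the legacy CUDA runtime error enum, useful for
# decoding glog check failures of the form "(code vs. 0)".
LEGACY_CUDA_ERRORS = {
    0: ("cudaSuccess", "no error"),
    2: ("cudaErrorMemoryAllocation", "out of memory"),
    4: ("cudaErrorLaunchFailure", "unspecified launch failure"),
    18: ("cudaErrorInvalidTexture", "invalid texture reference"),
}

def describe(code: int) -> str:
    """Return 'name: message' for a known legacy CUDA error code."""
    name, msg = LEGACY_CUDA_ERRORS.get(code, ("unknown", "unrecognized error code"))
    return f"{name}: {msg}"

print(describe(18))  # cudaErrorInvalidTexture: invalid texture reference
```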

zhangliliang (Owner)

Hi,

This code has been tested on Ubuntu 14.04 with MATLAB 2014b and CUDA 7.5 using Titan X or Tesla K40.

You might try MATLAB 2014b, or check your CUDA version.

dongb5 (Author) commented Jan 11, 2017

Thank you. I solved the problem by using another PC with MATLAB 2014b.

dongb5 closed this as completed Jan 11, 2017