
Check failure stack trace #824

Closed · trundleyrg opened this issue May 25, 2018 · 1 comment

@trundleyrg
When I run train_faster_rcnn_alt_opt.py, the following error occurs:

+ echo Logging output to experiments/logs/faster_rcnn_alt_opt_VGG16_.txt.2018-05-25_16-26-43
Logging output to experiments/logs/faster_rcnn_alt_opt_VGG16_.txt.2018-05-25_16-26-43
+ ./tools/train_faster_rcnn_alt_opt.py --gpu 0 --net_name VGG16 --weights data/imagenet_models/VGG16.v2.caffemodel --imdb voc_2007_trainval --cfg experiments/cfgs/faster_rcnn_alt_opt.yml
Called with args:
Namespace(cfg_file='experiments/cfgs/faster_rcnn_alt_opt.yml', gpu_id=0, imdb_name='voc_2007_trainval', net_name='VGG16', pretrained_model='data/imagenet_models/VGG16.v2.caffemodel', set_cfgs=None)
Stage 1 RPN, init from ImageNet model

Init model: data/imagenet_models/VGG16.v2.caffemodel
Using config:
{'DATA_DIR': 'C:\Program Files\py-faster-rcnn\data',
'DEDUP_BOXES': 0.0625,
'EPS': 1e-14,
'EXP_DIR': 'faster_rcnn_alt_opt',
'GPU_ID': 0,
'MATLAB': 'matlab',
'MODELS_DIR': 'C:\Program Files\py-faster-rcnn\models\pascal_voc',
'PIXEL_MEANS': array([[[102.9801, 115.9465, 122.7717]]]),
'RNG_SEED': 3,
'ROOT_DIR': 'C:\Program Files\py-faster-rcnn',
'TEST': {'BBOX_REG': True,
'HAS_RPN': True,
'MAX_SIZE': 1000,
'NMS': 0.3,
'PROPOSAL_METHOD': 'selective_search',
'RPN_MIN_SIZE': 16,
'RPN_NMS_THRESH': 0.7,
'RPN_POST_NMS_TOP_N': 300,
'RPN_PRE_NMS_TOP_N': 6000,
'SCALES': [600],
'SVM': False},
'TRAIN': {'ASPECT_GROUPING': True,
'BATCH_SIZE': 128,
'BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
'BBOX_NORMALIZE_MEANS': [0.0, 0.0, 0.0, 0.0],
'BBOX_NORMALIZE_STDS': [0.1, 0.1, 0.2, 0.2],
'BBOX_NORMALIZE_TARGETS': True,
'BBOX_NORMALIZE_TARGETS_PRECOMPUTED': False,
'BBOX_REG': False,
'BBOX_THRESH': 0.5,
'BG_THRESH_HI': 0.5,
'BG_THRESH_LO': 0.0,
'FG_FRACTION': 0.25,
'FG_THRESH': 0.5,
'HAS_RPN': True,
'IMS_PER_BATCH': 1,
'MAX_SIZE': 1000,
'PROPOSAL_METHOD': 'gt',
'RPN_BATCHSIZE': 256,
'RPN_BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
'RPN_CLOBBER_POSITIVES': False,
'RPN_FG_FRACTION': 0.5,
'RPN_MIN_SIZE': 16,
'RPN_NEGATIVE_OVERLAP': 0.3,
'RPN_NMS_THRESH': 0.7,
'RPN_POSITIVE_OVERLAP': 0.7,
'RPN_POSITIVE_WEIGHT': -1.0,
'RPN_POST_NMS_TOP_N': 2000,
'RPN_PRE_NMS_TOP_N': 12000,
'SCALES': [600],
'SNAPSHOT_INFIX': 'stage1',
'SNAPSHOT_ITERS': 10000,
'USE_FLIPPED': True,
'USE_PREFETCH': False},
'USE_GPU_NMS': True}
Loaded dataset voc_2007_trainval for training
Set proposal method: gt
Appending horizontally-flipped training examples...
wrote gt roidb to C:\Program Files\py-faster-rcnn\data\cache\voc_2007_trainval_gt_roidb.pkl
done
Preparing training data...
done
roidb len: 37676
Output will be saved to C:\Program Files\py-faster-rcnn\output\default\voc_2007_trainval
Filtered 0 roidb entries: 37676 -> 37676
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0525 16:26:47.115679 11996 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0525 16:27:43.645265 11996 solver.cpp:48] Initializing solver from parameters:
train_net: "models/pascal_voc/VGG16/faster_rcnn_alt_opt/stage1_rpn_train.pt"
base_lr: 0.0008
display: 20
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 20000
snapshot: 0
snapshot_prefix: "vgg16_rpn"
average_loss: 100
I0525 16:27:43.646266 11996 solver.cpp:81] Creating training net from train_net file: models/pascal_voc/VGG16/faster_rcnn_alt_opt/stage1_rpn_train.pt
I0525 16:27:43.647267 11996 net.cpp:58] Initializing net from parameters:
name: "VGG_ILSVRC_16_layers"
state {
phase: TRAIN
}
layer {
name: "input-data"
type: "Python"
top: "data"
top: "im_info"
top: "gt_boxes"
python_param {
module: "roi_data_layer.layer"
layer: "RoIDataLayer"
param_str: "'num_classes': 13"
}
}
layer {
name: "conv1_1"
type: "Convolution"
bottom: "data"
top: "conv1_1"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
name: "relu1_1"
type: "ReLU"
bottom: "conv1_1"
top: "conv1_1"
}
layer {
name: "conv1_2"
type: "Convolution"
bottom: "conv1_1"
top: "conv1_2"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
convolution_param {
num_output: 64
pad: 1
kernel_size: 3
}
}
layer {
name: "relu1_2"
type: "ReLU"
bottom: "conv1_2"
top: "conv1_2"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1_2"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv2_1"
type: "Convolution"
bottom: "pool1"
top: "conv2_1"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
name: "relu2_1"
type: "ReLU"
bottom: "conv2_1"
top: "conv2_1"
}
layer {
name: "conv2_2"
type: "Convolution"
bottom: "conv2_1"
top: "conv2_2"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
convolution_param {
num_output: 128
pad: 1
kernel_size: 3
}
}
layer {
name: "relu2_2"
type: "ReLU"
bottom: "conv2_2"
top: "conv2_2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2_2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv3_1"
type: "Convolution"
bottom: "pool2"
top: "conv3_1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_1"
type: "ReLU"
bottom: "conv3_1"
top: "conv3_1"
}
layer {
name: "conv3_2"
type: "Convolution"
bottom: "conv3_1"
top: "conv3_2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_2"
type: "ReLU"
bottom: "conv3_2"
top: "conv3_2"
}
layer {
name: "conv3_3"
type: "Convolution"
bottom: "conv3_2"
top: "conv3_3"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
}
}
layer {
name: "relu3_3"
type: "ReLU"
bottom: "conv3_3"
top: "conv3_3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3_3"
top: "pool3"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv4_1"
type: "Convolution"
bottom: "pool3"
top: "conv4_1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_1"
type: "ReLU"
bottom: "conv4_1"
top: "conv4_1"
}
layer {
name: "conv4_2"
type: "Convolution"
bottom: "conv4_1"
top: "conv4_2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_2"
type: "ReLU"
bottom: "conv4_2"
top: "conv4_2"
}
layer {
name: "conv4_3"
type: "Convolution"
bottom: "conv4_2"
top: "conv4_3"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu4_3"
type: "ReLU"
bottom: "conv4_3"
top: "conv4_3"
}
layer {
name: "pool4"
type: "Pooling"
bottom: "conv4_3"
top: "pool4"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "conv5_1"
type: "Convolution"
bottom: "pool4"
top: "conv5_1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_1"
type: "ReLU"
bottom: "conv5_1"
top: "conv5_1"
}
layer {
name: "conv5_2"
type: "Convolution"
bottom: "conv5_1"
top: "conv5_2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_2"
type: "ReLU"
bottom: "conv5_2"
top: "conv5_2"
}
layer {
name: "conv5_3"
type: "Convolution"
bottom: "conv5_2"
top: "conv5_3"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
}
}
layer {
name: "relu5_3"
type: "ReLU"
bottom: "conv5_3"
top: "conv5_3"
}
layer {
name: "rpn_conv/3x3"
type: "Convolution"
bottom: "conv5_3"
top: "rpn/output"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 512
pad: 1
kernel_size: 3
stride: 1
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "rpn_relu/3x3"
type: "ReLU"
bottom: "rpn/output"
top: "rpn/output"
}
layer {
name: "rpn_cls_score"
type: "Convolution"
bottom: "rpn/output"
top: "rpn_cls_score"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 18
pad: 0
kernel_size: 1
stride: 1
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "rpn_bbox_pred"
type: "Convolution"
bottom: "rpn/output"
top: "rpn_bbox_pred"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 36
pad: 0
kernel_size: 1
stride: 1
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
value: 0
}
}
}
layer {
name: "rpn_cls_score_reshape"
type: "Reshape"
bottom: "rpn_cls_score"
top: "rpn_cls_score_reshape"
reshape_param {
shape {
dim: 0
dim: 2
dim: -1
dim: 0
}
}
}
layer {
name: "rpn-data"
type: "Python"
bottom: "rpn_cls_score"
bottom: "gt_boxes"
bottom: "im_info"
bottom: "data"
top: "rpn_labels"
top: "rpn_bbox_targets"
top: "rpn_bbox_inside_weights"
top: "rpn_bbox_outside_weights"
python_param {
module: "rpn.anchor_target_layer"
layer: "AnchorTargetLayer"
param_str: "'feat_stride': 16"
}
}
layer {
name: "rpn_loss_cls"
type: "SoftmaxWithLoss"
bottom: "rpn_cls_score_reshape"
bottom: "rpn_labels"
top: "rpn_cls_loss"
loss_weight: 1
propagate_down: true
propagate_down: false
loss_param {
ignore_label: -1
normalize: true
}
}
layer {
name: "rpn_loss_bbox"
type: "SmoothL1Loss"
bottom: "rpn_bbox_pred"
bottom: "rpn_bbox_targets"
bottom: "rpn_bbox_inside_weights"
bottom: "rpn_bbox_outside_weights"
top: "rpn_loss_bbox"
loss_weight: 1
smooth_l1_loss_param {
sigma: 3
}
}
layer {
name: "dummy_roi_pool_conv5"
type: "DummyData"
top: "dummy_roi_pool_conv5"
dummy_data_param {
data_filler {
type: "constant"
value: 0
}
shape {
dim: 1
dim: 25088
RoiDataLayer: name_to_top: {'gt_boxes': 2, 'data': 0, 'im_info': 1}
}
}
}
layer {
name: "fc6"
type: "InnerProduct"
bottom: "dummy_roi_pool_conv5"
top: "fc6"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
inner_product_param {
num_output: 4096
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "InnerProduct"
bottom: "fc6"
top: "fc7"
param {
lr_mult: 0
decay_mult: 0
}
param {
lr_mult: 0
decay_mult: 0
}
inner_product_param {
num_output: 4096
}
}
layer {
name: "silence_fc7"
type: "Silence"
bottom: "fc7"
}
I0525 16:27:43.647267 11996 layer_factory.hpp:77] Creating layer input-data
I0525 16:27:43.690201 11996 net.cpp:100] Creating Layer input-data
I0525 16:27:43.690201 11996 net.cpp:418] input-data -> data
I0525 16:27:43.691220 11996 net.cpp:418] input-data -> im_info
I0525 16:27:43.691220 11996 net.cpp:418] input-data -> gt_boxes
I0525 16:27:43.739037 11996 net.cpp:150] Setting up input-data
I0525 16:27:43.739037 11996 net.cpp:157] Top shape: 2 3 600 1000 (3600000)
I0525 16:27:43.739037 11996 net.cpp:157] Top shape: 1 3 (3)
I0525 16:27:43.739037 11996 net.cpp:157] Top shape: 1 4 (4)
I0525 16:27:43.739037 11996 net.cpp:165] Memory required for data: 14400028
I0525 16:27:43.739037 11996 layer_factory.hpp:77] Creating layer data_input-data_0_split
I0525 16:27:43.739037 11996 net.cpp:100] Creating Layer data_input-data_0_split
I0525 16:27:43.739037 11996 net.cpp:444] data_input-data_0_split <- data
I0525 16:27:43.739037 11996 net.cpp:418] data_input-data_0_split -> data_input-data_0_split_0
I0525 16:27:43.739037 11996 net.cpp:418] data_input-data_0_split -> data_input-data_0_split_1
I0525 16:27:43.739037 11996 net.cpp:150] Setting up data_input-data_0_split
I0525 16:27:43.739037 11996 net.cpp:157] Top shape: 2 3 600 1000 (3600000)
I0525 16:27:43.739037 11996 net.cpp:157] Top shape: 2 3 600 1000 (3600000)
I0525 16:27:43.739037 11996 net.cpp:165] Memory required for data: 43200028
I0525 16:27:43.739037 11996 layer_factory.hpp:77] Creating layer conv1_1
I0525 16:27:43.739037 11996 net.cpp:100] Creating Layer conv1_1
I0525 16:27:43.739037 11996 net.cpp:444] conv1_1 <- data_input-data_0_split_0
I0525 16:27:43.739037 11996 net.cpp:418] conv1_1 -> conv1_1
I0525 16:27:44.803694 11996 net.cpp:150] Setting up conv1_1
I0525 16:27:44.803694 11996 net.cpp:157] Top shape: 2 64 600 1000 (76800000)
I0525 16:27:44.803694 11996 net.cpp:165] Memory required for data: 350400028
I0525 16:27:44.803694 11996 layer_factory.hpp:77] Creating layer relu1_1
I0525 16:27:44.803694 11996 net.cpp:100] Creating Layer relu1_1
I0525 16:27:44.803694 11996 net.cpp:444] relu1_1 <- conv1_1
I0525 16:27:44.803694 11996 net.cpp:405] relu1_1 -> conv1_1 (in-place)
I0525 16:27:44.803694 11996 net.cpp:150] Setting up relu1_1
I0525 16:27:44.803694 11996 net.cpp:157] Top shape: 2 64 600 1000 (76800000)
I0525 16:27:44.803694 11996 net.cpp:165] Memory required for data: 657600028
I0525 16:27:44.803694 11996 layer_factory.hpp:77] Creating layer conv1_2
I0525 16:27:44.803694 11996 net.cpp:100] Creating Layer conv1_2
I0525 16:27:44.803694 11996 net.cpp:444] conv1_2 <- conv1_1
I0525 16:27:44.803694 11996 net.cpp:418] conv1_2 -> conv1_2
I0525 16:27:44.808698 11996 net.cpp:150] Setting up conv1_2
I0525 16:27:44.808698 11996 net.cpp:157] Top shape: 2 64 600 1000 (76800000)
I0525 16:27:44.808698 11996 net.cpp:165] Memory required for data: 964800028
I0525 16:27:44.808698 11996 layer_factory.hpp:77] Creating layer relu1_2
I0525 16:27:44.808698 11996 net.cpp:100] Creating Layer relu1_2
I0525 16:27:44.808698 11996 net.cpp:444] relu1_2 <- conv1_2
I0525 16:27:44.808698 11996 net.cpp:405] relu1_2 -> conv1_2 (in-place)
I0525 16:27:44.809700 11996 net.cpp:150] Setting up relu1_2
I0525 16:27:44.809700 11996 net.cpp:157] Top shape: 2 64 600 1000 (76800000)
I0525 16:27:44.809700 11996 net.cpp:165] Memory required for data: 1272000028
I0525 16:27:44.809700 11996 layer_factory.hpp:77] Creating layer pool1
I0525 16:27:44.809700 11996 net.cpp:100] Creating Layer pool1
I0525 16:27:44.809700 11996 net.cpp:444] pool1 <- conv1_2
I0525 16:27:44.809700 11996 net.cpp:418] pool1 -> pool1
I0525 16:27:44.809700 11996 net.cpp:150] Setting up pool1
I0525 16:27:44.809700 11996 net.cpp:157] Top shape: 2 64 300 500 (19200000)
I0525 16:27:44.809700 11996 net.cpp:165] Memory required for data: 1348800028
I0525 16:27:44.809700 11996 layer_factory.hpp:77] Creating layer conv2_1
I0525 16:27:44.809700 11996 net.cpp:100] Creating Layer conv2_1
I0525 16:27:44.809700 11996 net.cpp:444] conv2_1 <- pool1
I0525 16:27:44.809700 11996 net.cpp:418] conv2_1 -> conv2_1
I0525 16:27:44.812702 11996 net.cpp:150] Setting up conv2_1
I0525 16:27:44.812702 11996 net.cpp:157] Top shape: 2 128 300 500 (38400000)
I0525 16:27:44.812702 11996 net.cpp:165] Memory required for data: 1502400028
I0525 16:27:44.812702 11996 layer_factory.hpp:77] Creating layer relu2_1
I0525 16:27:44.812702 11996 net.cpp:100] Creating Layer relu2_1
I0525 16:27:44.812702 11996 net.cpp:444] relu2_1 <- conv2_1
I0525 16:27:44.812702 11996 net.cpp:405] relu2_1 -> conv2_1 (in-place)
I0525 16:27:44.812702 11996 net.cpp:150] Setting up relu2_1
I0525 16:27:44.812702 11996 net.cpp:157] Top shape: 2 128 300 500 (38400000)
I0525 16:27:44.812702 11996 net.cpp:165] Memory required for data: 1656000028
I0525 16:27:44.812702 11996 layer_factory.hpp:77] Creating layer conv2_2
I0525 16:27:44.813704 11996 net.cpp:100] Creating Layer conv2_2
I0525 16:27:44.813704 11996 net.cpp:444] conv2_2 <- conv2_1
I0525 16:27:44.813726 11996 net.cpp:418] conv2_2 -> conv2_2
I0525 16:27:44.815827 11996 net.cpp:150] Setting up conv2_2
I0525 16:27:44.815827 11996 net.cpp:157] Top shape: 2 128 300 500 (38400000)
I0525 16:27:44.815827 11996 net.cpp:165] Memory required for data: 1809600028
I0525 16:27:44.815827 11996 layer_factory.hpp:77] Creating layer relu2_2
I0525 16:27:44.815827 11996 net.cpp:100] Creating Layer relu2_2
I0525 16:27:44.815827 11996 net.cpp:444] relu2_2 <- conv2_2
I0525 16:27:44.815827 11996 net.cpp:405] relu2_2 -> conv2_2 (in-place)
I0525 16:27:44.816828 11996 net.cpp:150] Setting up relu2_2
I0525 16:27:44.816828 11996 net.cpp:157] Top shape: 2 128 300 500 (38400000)
I0525 16:27:44.816828 11996 net.cpp:165] Memory required for data: 1963200028
I0525 16:27:44.816828 11996 layer_factory.hpp:77] Creating layer pool2
I0525 16:27:44.816828 11996 net.cpp:100] Creating Layer pool2
I0525 16:27:44.816828 11996 net.cpp:444] pool2 <- conv2_2
I0525 16:27:44.816828 11996 net.cpp:418] pool2 -> pool2
I0525 16:27:44.816828 11996 net.cpp:150] Setting up pool2
I0525 16:27:44.816828 11996 net.cpp:157] Top shape: 2 128 150 250 (9600000)
I0525 16:27:44.816828 11996 net.cpp:165] Memory required for data: 2001600028
I0525 16:27:44.816828 11996 layer_factory.hpp:77] Creating layer conv3_1
I0525 16:27:44.816828 11996 net.cpp:100] Creating Layer conv3_1
I0525 16:27:44.816828 11996 net.cpp:444] conv3_1 <- pool2
I0525 16:27:44.816828 11996 net.cpp:418] conv3_1 -> conv3_1
I0525 16:27:44.819831 11996 net.cpp:150] Setting up conv3_1
I0525 16:27:44.819831 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.819831 11996 net.cpp:165] Memory required for data: 2078400028
I0525 16:27:44.819831 11996 layer_factory.hpp:77] Creating layer relu3_1
I0525 16:27:44.819831 11996 net.cpp:100] Creating Layer relu3_1
I0525 16:27:44.819831 11996 net.cpp:444] relu3_1 <- conv3_1
I0525 16:27:44.820832 11996 net.cpp:405] relu3_1 -> conv3_1 (in-place)
I0525 16:27:44.820832 11996 net.cpp:150] Setting up relu3_1
I0525 16:27:44.820832 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.820832 11996 net.cpp:165] Memory required for data: 2155200028
I0525 16:27:44.820832 11996 layer_factory.hpp:77] Creating layer conv3_2
I0525 16:27:44.820832 11996 net.cpp:100] Creating Layer conv3_2
I0525 16:27:44.820832 11996 net.cpp:444] conv3_2 <- conv3_1
I0525 16:27:44.820832 11996 net.cpp:418] conv3_2 -> conv3_2
I0525 16:27:44.824959 11996 net.cpp:150] Setting up conv3_2
I0525 16:27:44.824959 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.824959 11996 net.cpp:165] Memory required for data: 2232000028
I0525 16:27:44.824959 11996 layer_factory.hpp:77] Creating layer relu3_2
I0525 16:27:44.824959 11996 net.cpp:100] Creating Layer relu3_2
I0525 16:27:44.824959 11996 net.cpp:444] relu3_2 <- conv3_2
I0525 16:27:44.824959 11996 net.cpp:405] relu3_2 -> conv3_2 (in-place)
I0525 16:27:44.825960 11996 net.cpp:150] Setting up relu3_2
I0525 16:27:44.825960 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.825960 11996 net.cpp:165] Memory required for data: 2308800028
I0525 16:27:44.825960 11996 layer_factory.hpp:77] Creating layer conv3_3
I0525 16:27:44.825960 11996 net.cpp:100] Creating Layer conv3_3
I0525 16:27:44.825960 11996 net.cpp:444] conv3_3 <- conv3_2
I0525 16:27:44.825960 11996 net.cpp:418] conv3_3 -> conv3_3
I0525 16:27:44.830085 11996 net.cpp:150] Setting up conv3_3
I0525 16:27:44.830085 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.830085 11996 net.cpp:165] Memory required for data: 2385600028
I0525 16:27:44.830085 11996 layer_factory.hpp:77] Creating layer relu3_3
I0525 16:27:44.830085 11996 net.cpp:100] Creating Layer relu3_3
I0525 16:27:44.830085 11996 net.cpp:444] relu3_3 <- conv3_3
I0525 16:27:44.830085 11996 net.cpp:405] relu3_3 -> conv3_3 (in-place)
I0525 16:27:44.831085 11996 net.cpp:150] Setting up relu3_3
I0525 16:27:44.832087 11996 net.cpp:157] Top shape: 2 256 150 250 (19200000)
I0525 16:27:44.832087 11996 net.cpp:165] Memory required for data: 2462400028
I0525 16:27:44.832087 11996 layer_factory.hpp:77] Creating layer pool3
I0525 16:27:44.832087 11996 net.cpp:100] Creating Layer pool3
I0525 16:27:44.832087 11996 net.cpp:444] pool3 <- conv3_3
I0525 16:27:44.832087 11996 net.cpp:418] pool3 -> pool3
I0525 16:27:44.832087 11996 net.cpp:150] Setting up pool3
I0525 16:27:44.832087 11996 net.cpp:157] Top shape: 2 256 75 125 (4800000)
I0525 16:27:44.832087 11996 net.cpp:165] Memory required for data: 2481600028
I0525 16:27:44.832087 11996 layer_factory.hpp:77] Creating layer conv4_1
I0525 16:27:44.832087 11996 net.cpp:100] Creating Layer conv4_1
I0525 16:27:44.832087 11996 net.cpp:444] conv4_1 <- pool3
I0525 16:27:44.832087 11996 net.cpp:418] conv4_1 -> conv4_1
I0525 16:27:44.843369 11996 net.cpp:150] Setting up conv4_1
I0525 16:27:44.843369 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.843369 11996 net.cpp:165] Memory required for data: 2520000028
I0525 16:27:44.843369 11996 layer_factory.hpp:77] Creating layer relu4_1
I0525 16:27:44.843369 11996 net.cpp:100] Creating Layer relu4_1
I0525 16:27:44.843369 11996 net.cpp:444] relu4_1 <- conv4_1
I0525 16:27:44.843369 11996 net.cpp:405] relu4_1 -> conv4_1 (in-place)
I0525 16:27:44.845371 11996 net.cpp:150] Setting up relu4_1
I0525 16:27:44.845371 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.845371 11996 net.cpp:165] Memory required for data: 2558400028
I0525 16:27:44.845371 11996 layer_factory.hpp:77] Creating layer conv4_2
I0525 16:27:44.845371 11996 net.cpp:100] Creating Layer conv4_2
I0525 16:27:44.845371 11996 net.cpp:444] conv4_2 <- conv4_1
I0525 16:27:44.845371 11996 net.cpp:418] conv4_2 -> conv4_2
I0525 16:27:44.854514 11996 net.cpp:150] Setting up conv4_2
I0525 16:27:44.854514 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.854514 11996 net.cpp:165] Memory required for data: 2596800028
I0525 16:27:44.854514 11996 layer_factory.hpp:77] Creating layer relu4_2
I0525 16:27:44.854514 11996 net.cpp:100] Creating Layer relu4_2
I0525 16:27:44.854514 11996 net.cpp:444] relu4_2 <- conv4_2
I0525 16:27:44.854514 11996 net.cpp:405] relu4_2 -> conv4_2 (in-place)
I0525 16:27:44.855515 11996 net.cpp:150] Setting up relu4_2
I0525 16:27:44.855515 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.855515 11996 net.cpp:165] Memory required for data: 2635200028
I0525 16:27:44.855515 11996 layer_factory.hpp:77] Creating layer conv4_3
I0525 16:27:44.855515 11996 net.cpp:100] Creating Layer conv4_3
I0525 16:27:44.855515 11996 net.cpp:444] conv4_3 <- conv4_2
I0525 16:27:44.855515 11996 net.cpp:418] conv4_3 -> conv4_3
I0525 16:27:44.867650 11996 net.cpp:150] Setting up conv4_3
I0525 16:27:44.867650 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.867650 11996 net.cpp:165] Memory required for data: 2673600028
I0525 16:27:44.867650 11996 layer_factory.hpp:77] Creating layer relu4_3
I0525 16:27:44.867650 11996 net.cpp:100] Creating Layer relu4_3
I0525 16:27:44.867650 11996 net.cpp:444] relu4_3 <- conv4_3
I0525 16:27:44.867650 11996 net.cpp:405] relu4_3 -> conv4_3 (in-place)
I0525 16:27:44.868650 11996 net.cpp:150] Setting up relu4_3
I0525 16:27:44.868650 11996 net.cpp:157] Top shape: 2 512 75 125 (9600000)
I0525 16:27:44.868650 11996 net.cpp:165] Memory required for data: 2712000028
I0525 16:27:44.868650 11996 layer_factory.hpp:77] Creating layer pool4
I0525 16:27:44.868650 11996 net.cpp:100] Creating Layer pool4
I0525 16:27:44.868650 11996 net.cpp:444] pool4 <- conv4_3
I0525 16:27:44.868650 11996 net.cpp:418] pool4 -> pool4
I0525 16:27:44.868650 11996 net.cpp:150] Setting up pool4
I0525 16:27:44.868650 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.868650 11996 net.cpp:165] Memory required for data: 2721805852
I0525 16:27:44.868650 11996 layer_factory.hpp:77] Creating layer conv5_1
I0525 16:27:44.868650 11996 net.cpp:100] Creating Layer conv5_1
I0525 16:27:44.868650 11996 net.cpp:444] conv5_1 <- pool4
I0525 16:27:44.868650 11996 net.cpp:418] conv5_1 -> conv5_1
I0525 16:27:44.878774 11996 net.cpp:150] Setting up conv5_1
I0525 16:27:44.878774 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.878774 11996 net.cpp:165] Memory required for data: 2731611676
I0525 16:27:44.878774 11996 layer_factory.hpp:77] Creating layer relu5_1
I0525 16:27:44.878774 11996 net.cpp:100] Creating Layer relu5_1
I0525 16:27:44.878774 11996 net.cpp:444] relu5_1 <- conv5_1
I0525 16:27:44.878774 11996 net.cpp:405] relu5_1 -> conv5_1 (in-place)
I0525 16:27:44.878774 11996 net.cpp:150] Setting up relu5_1
I0525 16:27:44.879776 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.879776 11996 net.cpp:165] Memory required for data: 2741417500
I0525 16:27:44.879776 11996 layer_factory.hpp:77] Creating layer conv5_2
I0525 16:27:44.879776 11996 net.cpp:100] Creating Layer conv5_2
I0525 16:27:44.879776 11996 net.cpp:444] conv5_2 <- conv5_1
I0525 16:27:44.879776 11996 net.cpp:418] conv5_2 -> conv5_2
I0525 16:27:44.889920 11996 net.cpp:150] Setting up conv5_2
I0525 16:27:44.889920 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.889920 11996 net.cpp:165] Memory required for data: 2751223324
I0525 16:27:44.889920 11996 layer_factory.hpp:77] Creating layer relu5_2
I0525 16:27:44.889920 11996 net.cpp:100] Creating Layer relu5_2
I0525 16:27:44.889920 11996 net.cpp:444] relu5_2 <- conv5_2
I0525 16:27:44.889920 11996 net.cpp:405] relu5_2 -> conv5_2 (in-place)
I0525 16:27:44.889920 11996 net.cpp:150] Setting up relu5_2
I0525 16:27:44.889920 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.889920 11996 net.cpp:165] Memory required for data: 2761029148
I0525 16:27:44.889920 11996 layer_factory.hpp:77] Creating layer conv5_3
I0525 16:27:44.889920 11996 net.cpp:100] Creating Layer conv5_3
I0525 16:27:44.889920 11996 net.cpp:444] conv5_3 <- conv5_2
I0525 16:27:44.889920 11996 net.cpp:418] conv5_3 -> conv5_3
I0525 16:27:44.897928 11996 net.cpp:150] Setting up conv5_3
I0525 16:27:44.897928 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.897928 11996 net.cpp:165] Memory required for data: 2770834972
I0525 16:27:44.897928 11996 layer_factory.hpp:77] Creating layer relu5_3
I0525 16:27:44.897928 11996 net.cpp:100] Creating Layer relu5_3
I0525 16:27:44.897928 11996 net.cpp:444] relu5_3 <- conv5_3
I0525 16:27:44.897928 11996 net.cpp:405] relu5_3 -> conv5_3 (in-place)
I0525 16:27:44.898929 11996 net.cpp:150] Setting up relu5_3
I0525 16:27:44.898929 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.898929 11996 net.cpp:165] Memory required for data: 2780640796
I0525 16:27:44.898929 11996 layer_factory.hpp:77] Creating layer rpn_conv/3x3
I0525 16:27:44.898929 11996 net.cpp:100] Creating Layer rpn_conv/3x3
I0525 16:27:44.898929 11996 net.cpp:444] rpn_conv/3x3 <- conv5_3
I0525 16:27:44.898929 11996 net.cpp:418] rpn_conv/3x3 -> rpn/output
I0525 16:27:44.955188 11996 net.cpp:150] Setting up rpn_conv/3x3
I0525 16:27:44.955188 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.955188 11996 net.cpp:165] Memory required for data: 2790446620
I0525 16:27:44.955188 11996 layer_factory.hpp:77] Creating layer rpn_relu/3x3
I0525 16:27:44.955188 11996 net.cpp:100] Creating Layer rpn_relu/3x3
I0525 16:27:44.955188 11996 net.cpp:444] rpn_relu/3x3 <- rpn/output
I0525 16:27:44.955188 11996 net.cpp:405] rpn_relu/3x3 -> rpn/output (in-place)
I0525 16:27:44.956189 11996 net.cpp:150] Setting up rpn_relu/3x3
I0525 16:27:44.956189 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.956189 11996 net.cpp:165] Memory required for data: 2800252444
I0525 16:27:44.956189 11996 layer_factory.hpp:77] Creating layer rpn/output_rpn_relu/3x3_0_split
I0525 16:27:44.956189 11996 net.cpp:100] Creating Layer rpn/output_rpn_relu/3x3_0_split
I0525 16:27:44.956189 11996 net.cpp:444] rpn/output_rpn_relu/3x3_0_split <- rpn/output
I0525 16:27:44.956189 11996 net.cpp:418] rpn/output_rpn_relu/3x3_0_split -> rpn/output_rpn_relu/3x3_0_split_0
I0525 16:27:44.956189 11996 net.cpp:418] rpn/output_rpn_relu/3x3_0_split -> rpn/output_rpn_relu/3x3_0_split_1
I0525 16:27:44.956189 11996 net.cpp:150] Setting up rpn/output_rpn_relu/3x3_0_split
I0525 16:27:44.956189 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.956189 11996 net.cpp:157] Top shape: 2 512 38 63 (2451456)
I0525 16:27:44.956189 11996 net.cpp:165] Memory required for data: 2819864092
I0525 16:27:44.956189 11996 layer_factory.hpp:77] Creating layer rpn_cls_score
I0525 16:27:44.956189 11996 net.cpp:100] Creating Layer rpn_cls_score
I0525 16:27:44.956189 11996 net.cpp:444] rpn_cls_score <- rpn/output_rpn_relu/3x3_0_split_0
I0525 16:27:44.956189 11996 net.cpp:418] rpn_cls_score -> rpn_cls_score
I0525 16:27:44.959316 11996 net.cpp:150] Setting up rpn_cls_score
I0525 16:27:44.959316 11996 net.cpp:157] Top shape: 2 18 38 63 (86184)
I0525 16:27:44.959316 11996 net.cpp:165] Memory required for data: 2820208828
I0525 16:27:44.959316 11996 layer_factory.hpp:77] Creating layer rpn_cls_score_rpn_cls_score_0_split
I0525 16:27:44.959316 11996 net.cpp:100] Creating Layer rpn_cls_score_rpn_cls_score_0_split
I0525 16:27:44.959316 11996 net.cpp:444] rpn_cls_score_rpn_cls_score_0_split <- rpn_cls_score
I0525 16:27:44.959316 11996 net.cpp:418] rpn_cls_score_rpn_cls_score_0_split -> rpn_cls_score_rpn_cls_score_0_split_0
I0525 16:27:44.959316 11996 net.cpp:418] rpn_cls_score_rpn_cls_score_0_split -> rpn_cls_score_rpn_cls_score_0_split_1
I0525 16:27:44.959316 11996 net.cpp:150] Setting up rpn_cls_score_rpn_cls_score_0_split
I0525 16:27:44.959316 11996 net.cpp:157] Top shape: 2 18 38 63 (86184)
I0525 16:27:44.959316 11996 net.cpp:157] Top shape: 2 18 38 63 (86184)
I0525 16:27:44.959316 11996 net.cpp:165] Memory required for data: 2820898300
I0525 16:27:44.959316 11996 layer_factory.hpp:77] Creating layer rpn_bbox_pred
I0525 16:27:44.959316 11996 net.cpp:100] Creating Layer rpn_bbox_pred
I0525 16:27:44.959316 11996 net.cpp:444] rpn_bbox_pred <- rpn/output_rpn_relu/3x3_0_split_1
I0525 16:27:44.959316 11996 net.cpp:418] rpn_bbox_pred -> rpn_bbox_pred
I0525 16:27:44.961428 11996 net.cpp:150] Setting up rpn_bbox_pred
I0525 16:27:44.961428 11996 net.cpp:157] Top shape: 2 36 38 63 (172368)
I0525 16:27:44.961428 11996 net.cpp:165] Memory required for data: 2821587772
I0525 16:27:44.961428 11996 layer_factory.hpp:77] Creating layer rpn_cls_score_reshape
I0525 16:27:44.961428 11996 net.cpp:100] Creating Layer rpn_cls_score_reshape
I0525 16:27:44.961428 11996 net.cpp:444] rpn_cls_score_reshape <- rpn_cls_score_rpn_cls_score_0_split_0
I0525 16:27:44.961428 11996 net.cpp:418] rpn_cls_score_reshape -> rpn_cls_score_reshape
I0525 16:27:44.961428 11996 net.cpp:150] Setting up rpn_cls_score_reshape
I0525 16:27:4*** Check failure stack trace: ***

@surajitsaikia27

This problem generally occurs when you don't have enough GPU memory, or your GPU is occupied by another program.
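As a hedged follow-up (not part of the original thread): one practical way to test the GPU-memory hypothesis is to shrink the training memory footprint through the config keys that appear in the dump above. The sketch below assumes the standard py-faster-rcnn layout, with lib/ on PYTHONPATH (the tools/ scripts arrange this via _init_paths), where fast_rcnn/config.py exposes cfg, cfg_from_file and cfg_from_list; these are the same helpers the --cfg and --set command-line flags feed into.

```python
# Hedged sketch: lower the stage-1 RPN memory footprint by overriding config
# keys shown in the config dump above. Assumes fast_rcnn.config is importable
# (lib/ on PYTHONPATH) and exposes cfg, cfg_from_file, cfg_from_list.
from fast_rcnn.config import cfg, cfg_from_file, cfg_from_list

# Start from the experiment config used in this issue.
cfg_from_file('experiments/cfgs/faster_rcnn_alt_opt.yml')

# Smaller input images and fewer RPN proposals mean smaller activations on the
# GPU. Values are strings because cfg_from_list runs them through literal_eval;
# each literal must match the type of the existing value (SCALES prints as a
# list in the dump above).
cfg_from_list(['TRAIN.SCALES', '[400]',
               'TRAIN.MAX_SIZE', '600',
               'TRAIN.RPN_PRE_NMS_TOP_N', '6000',
               'TRAIN.RPN_POST_NMS_TOP_N', '1000'])

# Confirm the overrides took effect before launching training.
print(cfg.TRAIN.SCALES, cfg.TRAIN.MAX_SIZE, cfg.TRAIN.RPN_PRE_NMS_TOP_N)
```

From the command line, the same kind of key/value overrides can be passed through the script's --set flag (visible as set_cfgs=None in the Namespace above), and nvidia-smi is a quick way to check whether another process is already holding GPU memory before training starts.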
