GPU memory requirements for the semantic segmentation example #1842

Open
DarylWM opened this issue Oct 3, 2017 · 5 comments

DarylWM commented Oct 3, 2017

I've followed the steps in the example and it runs out of memory on a K80 with 12GB of GPU memory. How much memory does the example assume?

EDIT
I'm using:

  • DIGITS 6.0.0-rc.2
  • Caffe 0.15.14 (NVIDIA flavour)
  • Batch size = 1
  • All other model / training set settings as per the example

The caffe_output.log:
I1003 02:03:05.776233 1490 upgrade_proto.cpp:1044] Attempting to upgrade input file specified using deprecated 'solver_type' field (enum)': /home/ubuntu/digits/digits/jobs/20171003-020304-5017/solver.prototxt
I1003 02:03:05.777061 1490 upgrade_proto.cpp:1051] Successfully upgraded file specified using deprecated 'solver_type' field (enum) to 'type' field (string).
W1003 02:03:05.777073 1490 upgrade_proto.cpp:1053] Note that future Caffe releases will only support 'type' field (string) for a solver's type.
I1003 02:03:05.920138 1490 caffe.cpp:197] Using GPUs 0
I1003 02:03:05.920346 1490 caffe.cpp:202] GPU 0: Tesla K80
I1003 02:03:07.003394 1490 solver.cpp:48] Initializing solver from parameters:
test_iter: 1449
test_interval: 1464
base_lr: 0.0001
display: 183
max_iter: 43920
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 1e-06
stepsize: 14494
snapshot: 1464
snapshot_prefix: "snapshot"
solver_mode: GPU
device_id: 0
net: "train_val.prototxt"
type: "SGD"
I1003 02:03:07.003607 1490 solver.cpp:91] Creating training net from net file: train_val.prototxt
I1003 02:03:07.004962 1490 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I1003 02:03:07.004979 1490 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer label
I1003 02:03:07.005002 1490 net.cpp:323] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I1003 02:03:07.005257 1490 net.cpp:52] Initializing net from parameters:
state {
phase: TRAIN
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TRAIN
}
data_param {
source: "/home/ubuntu/digits/digits/jobs/20171002-224732-1d65/train_db/features"
batch_size: 1
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TRAIN
}
data_param {
source: "/home/ubuntu/digits/digits/jobs/20171002-224732-1d65/train_db/labels"
batch_size: 1
backend: LMDB
}
}
layer {
name: "shift"
type: "Power"
bottom: "data"
top: "data_preprocessed"
power_param {
shift: -116
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data_preprocessed"
top: "conv1"
convolution_param {
num_output: 96
pad: 100
kernel_size: 11
group: 1
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
stride: 1
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 1
stride: 1
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 6
group: 1
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
group: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 63
group: 21
stride: 32
weight_filler {
type: "bilinear"
}
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 18
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: true
}
}
I1003 02:03:07.005373 1490 layer_factory.hpp:77] Creating layer data
I1003 02:03:07.008105 1490 net.cpp:94] Creating Layer data
I1003 02:03:07.008127 1490 net.cpp:409] data -> data
I1003 02:03:07.009842 1493 db_lmdb.cpp:35] Opened lmdb /home/ubuntu/digits/digits/jobs/20171002-224732-1d65/train_db/features
I1003 02:03:07.033555 1490 data_layer.cpp:78] ReshapePrefetch 1, 3, 335, 500
I1003 02:03:07.033599 1490 data_layer.cpp:83] output data size: 1,3,335,500
I1003 02:03:07.039163 1490 net.cpp:144] Setting up data
I1003 02:03:07.039182 1490 net.cpp:151] Top shape: 1 3 335 500 (502500)
I1003 02:03:07.039186 1490 net.cpp:159] Memory required for data: 2010000
I1003 02:03:07.039192 1490 layer_factory.hpp:77] Creating layer data_data_0_split
I1003 02:03:07.039206 1490 net.cpp:94] Creating Layer data_data_0_split
I1003 02:03:07.039211 1490 net.cpp:435] data_data_0_split <- data
I1003 02:03:07.039216 1490 net.cpp:409] data_data_0_split -> data_data_0_split_0
I1003 02:03:07.039224 1490 net.cpp:409] data_data_0_split -> data_data_0_split_1
I1003 02:03:07.039296 1490 net.cpp:144] Setting up data_data_0_split
I1003 02:03:07.039304 1490 net.cpp:151] Top shape: 1 3 335 500 (502500)
I1003 02:03:07.039309 1490 net.cpp:151] Top shape: 1 3 335 500 (502500)
I1003 02:03:07.039310 1490 net.cpp:159] Memory required for data: 6030000
I1003 02:03:07.039314 1490 layer_factory.hpp:77] Creating layer label
I1003 02:03:07.039446 1490 net.cpp:94] Creating Layer label
I1003 02:03:07.039453 1490 net.cpp:409] label -> label
I1003 02:03:07.045753 1499 db_lmdb.cpp:35] Opened lmdb /home/ubuntu/digits/digits/jobs/20171002-224732-1d65/train_db/labels
I1003 02:03:07.048089 1490 data_layer.cpp:78] ReshapePrefetch 1, 1, 335, 500
I1003 02:03:07.048131 1490 data_layer.cpp:83] output data size: 1,1,335,500
I1003 02:03:07.051996 1490 net.cpp:144] Setting up label
I1003 02:03:07.052011 1490 net.cpp:151] Top shape: 1 1 335 500 (167500)
I1003 02:03:07.052014 1490 net.cpp:159] Memory required for data: 6700000
I1003 02:03:07.052018 1490 layer_factory.hpp:77] Creating layer shift
I1003 02:03:07.052026 1490 net.cpp:94] Creating Layer shift
I1003 02:03:07.052028 1490 net.cpp:435] shift <- data_data_0_split_0
I1003 02:03:07.052038 1490 net.cpp:409] shift -> data_preprocessed
I1003 02:03:07.052080 1490 net.cpp:144] Setting up shift
I1003 02:03:07.052088 1490 net.cpp:151] Top shape: 1 3 335 500 (502500)
I1003 02:03:07.052090 1490 net.cpp:159] Memory required for data: 8710000
I1003 02:03:07.052094 1490 layer_factory.hpp:77] Creating layer conv1
I1003 02:03:07.052109 1490 net.cpp:94] Creating Layer conv1
I1003 02:03:07.052114 1490 net.cpp:435] conv1 <- data_preprocessed
I1003 02:03:07.052132 1490 net.cpp:409] conv1 -> conv1
I1003 02:03:07.052968 1490 net.cpp:144] Setting up conv1
I1003 02:03:07.052980 1490 net.cpp:151] Top shape: 1 96 132 173 (2192256)
I1003 02:03:07.052983 1490 net.cpp:159] Memory required for data: 17479024
I1003 02:03:07.052999 1490 layer_factory.hpp:77] Creating layer relu1
I1003 02:03:07.053006 1490 net.cpp:94] Creating Layer relu1
I1003 02:03:07.053009 1490 net.cpp:435] relu1 <- conv1
I1003 02:03:07.053014 1490 net.cpp:396] relu1 -> conv1 (in-place)
I1003 02:03:07.053025 1490 net.cpp:144] Setting up relu1
I1003 02:03:07.053030 1490 net.cpp:151] Top shape: 1 96 132 173 (2192256)
I1003 02:03:07.053032 1490 net.cpp:159] Memory required for data: 26248048
I1003 02:03:07.053035 1490 layer_factory.hpp:77] Creating layer pool1
I1003 02:03:07.053040 1490 net.cpp:94] Creating Layer pool1
I1003 02:03:07.053043 1490 net.cpp:435] pool1 <- conv1
I1003 02:03:07.053047 1490 net.cpp:409] pool1 -> pool1
I1003 02:03:07.053169 1490 net.cpp:144] Setting up pool1
I1003 02:03:07.053177 1490 net.cpp:151] Top shape: 1 96 66 86 (544896)
I1003 02:03:07.053180 1490 net.cpp:159] Memory required for data: 28427632
I1003 02:03:07.053184 1490 layer_factory.hpp:77] Creating layer norm1
I1003 02:03:07.053192 1490 net.cpp:94] Creating Layer norm1
I1003 02:03:07.053195 1490 net.cpp:435] norm1 <- pool1
I1003 02:03:07.053200 1490 net.cpp:409] norm1 -> norm1
I1003 02:03:07.053930 1490 net.cpp:144] Setting up norm1
I1003 02:03:07.053941 1490 net.cpp:151] Top shape: 1 96 66 86 (544896)
I1003 02:03:07.053944 1490 net.cpp:159] Memory required for data: 30607216
I1003 02:03:07.053948 1490 layer_factory.hpp:77] Creating layer conv2
I1003 02:03:07.053959 1490 net.cpp:94] Creating Layer conv2
I1003 02:03:07.053962 1490 net.cpp:435] conv2 <- norm1
I1003 02:03:07.053967 1490 net.cpp:409] conv2 -> conv2
I1003 02:03:07.055606 1490 net.cpp:144] Setting up conv2
I1003 02:03:07.055624 1490 net.cpp:151] Top shape: 1 256 66 86 (1453056)
I1003 02:03:07.055629 1490 net.cpp:159] Memory required for data: 36419440
I1003 02:03:07.055642 1490 layer_factory.hpp:77] Creating layer relu2
I1003 02:03:07.055651 1490 net.cpp:94] Creating Layer relu2
I1003 02:03:07.055656 1490 net.cpp:435] relu2 <- conv2
I1003 02:03:07.055665 1490 net.cpp:396] relu2 -> conv2 (in-place)
I1003 02:03:07.055675 1490 net.cpp:144] Setting up relu2
I1003 02:03:07.055681 1490 net.cpp:151] Top shape: 1 256 66 86 (1453056)
I1003 02:03:07.055686 1490 net.cpp:159] Memory required for data: 42231664
I1003 02:03:07.055691 1490 layer_factory.hpp:77] Creating layer pool2
I1003 02:03:07.055697 1490 net.cpp:94] Creating Layer pool2
I1003 02:03:07.055703 1490 net.cpp:435] pool2 <- conv2
I1003 02:03:07.055709 1490 net.cpp:409] pool2 -> pool2
I1003 02:03:07.055759 1490 net.cpp:144] Setting up pool2
I1003 02:03:07.055768 1490 net.cpp:151] Top shape: 1 256 33 43 (363264)
I1003 02:03:07.055773 1490 net.cpp:159] Memory required for data: 43684720
I1003 02:03:07.055778 1490 layer_factory.hpp:77] Creating layer norm2
I1003 02:03:07.055785 1490 net.cpp:94] Creating Layer norm2
I1003 02:03:07.055789 1490 net.cpp:435] norm2 <- pool2
I1003 02:03:07.055797 1490 net.cpp:409] norm2 -> norm2
I1003 02:03:07.055840 1490 net.cpp:144] Setting up norm2
I1003 02:03:07.055848 1490 net.cpp:151] Top shape: 1 256 33 43 (363264)
I1003 02:03:07.055852 1490 net.cpp:159] Memory required for data: 45137776
I1003 02:03:07.055858 1490 layer_factory.hpp:77] Creating layer conv3
I1003 02:03:07.055868 1490 net.cpp:94] Creating Layer conv3
I1003 02:03:07.055873 1490 net.cpp:435] conv3 <- norm2
I1003 02:03:07.055879 1490 net.cpp:409] conv3 -> conv3
I1003 02:03:07.059195 1490 net.cpp:144] Setting up conv3
I1003 02:03:07.059207 1490 net.cpp:151] Top shape: 1 384 33 43 (544896)
I1003 02:03:07.059211 1490 net.cpp:159] Memory required for data: 47317360
I1003 02:03:07.059219 1490 layer_factory.hpp:77] Creating layer relu3
I1003 02:03:07.059226 1490 net.cpp:94] Creating Layer relu3
I1003 02:03:07.059229 1490 net.cpp:435] relu3 <- conv3
I1003 02:03:07.059247 1490 net.cpp:396] relu3 -> conv3 (in-place)
I1003 02:03:07.059254 1490 net.cpp:144] Setting up relu3
I1003 02:03:07.059259 1490 net.cpp:151] Top shape: 1 384 33 43 (544896)
I1003 02:03:07.059262 1490 net.cpp:159] Memory required for data: 49496944
I1003 02:03:07.059264 1490 layer_factory.hpp:77] Creating layer conv4
I1003 02:03:07.059272 1490 net.cpp:94] Creating Layer conv4
I1003 02:03:07.059274 1490 net.cpp:435] conv4 <- conv3
I1003 02:03:07.059279 1490 net.cpp:409] conv4 -> conv4
I1003 02:03:07.061383 1490 net.cpp:144] Setting up conv4
I1003 02:03:07.061429 1490 net.cpp:151] Top shape: 1 384 33 43 (544896)
I1003 02:03:07.061440 1490 net.cpp:159] Memory required for data: 51676528
I1003 02:03:07.061475 1490 layer_factory.hpp:77] Creating layer relu4
I1003 02:03:07.061502 1490 net.cpp:94] Creating Layer relu4
I1003 02:03:07.061533 1490 net.cpp:435] relu4 <- conv4
I1003 02:03:07.061547 1490 net.cpp:396] relu4 -> conv4 (in-place)
I1003 02:03:07.061558 1490 net.cpp:144] Setting up relu4
I1003 02:03:07.061564 1490 net.cpp:151] Top shape: 1 384 33 43 (544896)
I1003 02:03:07.061569 1490 net.cpp:159] Memory required for data: 53856112
I1003 02:03:07.061573 1490 layer_factory.hpp:77] Creating layer conv5
I1003 02:03:07.061581 1490 net.cpp:94] Creating Layer conv5
I1003 02:03:07.061584 1490 net.cpp:435] conv5 <- conv4
I1003 02:03:07.061589 1490 net.cpp:409] conv5 -> conv5
I1003 02:03:07.062772 1490 net.cpp:144] Setting up conv5
I1003 02:03:07.062783 1490 net.cpp:151] Top shape: 1 256 33 43 (363264)
I1003 02:03:07.062786 1490 net.cpp:159] Memory required for data: 55309168
I1003 02:03:07.062795 1490 layer_factory.hpp:77] Creating layer relu5
I1003 02:03:07.062803 1490 net.cpp:94] Creating Layer relu5
I1003 02:03:07.062805 1490 net.cpp:435] relu5 <- conv5
I1003 02:03:07.062810 1490 net.cpp:396] relu5 -> conv5 (in-place)
I1003 02:03:07.062818 1490 net.cpp:144] Setting up relu5
I1003 02:03:07.062821 1490 net.cpp:151] Top shape: 1 256 33 43 (363264)
I1003 02:03:07.062824 1490 net.cpp:159] Memory required for data: 56762224
I1003 02:03:07.062826 1490 layer_factory.hpp:77] Creating layer pool5
I1003 02:03:07.062831 1490 net.cpp:94] Creating Layer pool5
I1003 02:03:07.062834 1490 net.cpp:435] pool5 <- conv5
I1003 02:03:07.062839 1490 net.cpp:409] pool5 -> pool5
I1003 02:03:07.062873 1490 net.cpp:144] Setting up pool5
I1003 02:03:07.062878 1490 net.cpp:151] Top shape: 1 256 16 21 (86016)
I1003 02:03:07.062881 1490 net.cpp:159] Memory required for data: 57106288
I1003 02:03:07.062885 1490 layer_factory.hpp:77] Creating layer fc6
I1003 02:03:07.062891 1490 net.cpp:94] Creating Layer fc6
I1003 02:03:07.062893 1490 net.cpp:435] fc6 <- pool5
I1003 02:03:07.062899 1490 net.cpp:409] fc6 -> fc6
I1003 02:03:07.166028 1490 net.cpp:144] Setting up fc6
I1003 02:03:07.166061 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.166065 1490 net.cpp:159] Memory required for data: 59989872
I1003 02:03:07.166074 1490 layer_factory.hpp:77] Creating layer relu6
I1003 02:03:07.166085 1490 net.cpp:94] Creating Layer relu6
I1003 02:03:07.166090 1490 net.cpp:435] relu6 <- fc6
I1003 02:03:07.166096 1490 net.cpp:396] relu6 -> fc6 (in-place)
I1003 02:03:07.166108 1490 net.cpp:144] Setting up relu6
I1003 02:03:07.166112 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.166115 1490 net.cpp:159] Memory required for data: 62873456
I1003 02:03:07.166117 1490 layer_factory.hpp:77] Creating layer drop6
I1003 02:03:07.166124 1490 net.cpp:94] Creating Layer drop6
I1003 02:03:07.166126 1490 net.cpp:435] drop6 <- fc6
I1003 02:03:07.166131 1490 net.cpp:396] drop6 -> fc6 (in-place)
I1003 02:03:07.166160 1490 net.cpp:144] Setting up drop6
I1003 02:03:07.166167 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.166169 1490 net.cpp:159] Memory required for data: 65757040
I1003 02:03:07.166172 1490 layer_factory.hpp:77] Creating layer fc7
I1003 02:03:07.166182 1490 net.cpp:94] Creating Layer fc7
I1003 02:03:07.166184 1490 net.cpp:435] fc7 <- fc6
I1003 02:03:07.166215 1490 net.cpp:409] fc7 -> fc7
I1003 02:03:07.212101 1490 net.cpp:144] Setting up fc7
I1003 02:03:07.212134 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.212138 1490 net.cpp:159] Memory required for data: 68640624
I1003 02:03:07.212147 1490 layer_factory.hpp:77] Creating layer relu7
I1003 02:03:07.212157 1490 net.cpp:94] Creating Layer relu7
I1003 02:03:07.212162 1490 net.cpp:435] relu7 <- fc7
I1003 02:03:07.212169 1490 net.cpp:396] relu7 -> fc7 (in-place)
I1003 02:03:07.212182 1490 net.cpp:144] Setting up relu7
I1003 02:03:07.212185 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.212188 1490 net.cpp:159] Memory required for data: 71524208
I1003 02:03:07.212191 1490 layer_factory.hpp:77] Creating layer drop7
I1003 02:03:07.212198 1490 net.cpp:94] Creating Layer drop7
I1003 02:03:07.212200 1490 net.cpp:435] drop7 <- fc7
I1003 02:03:07.212204 1490 net.cpp:396] drop7 -> fc7 (in-place)
I1003 02:03:07.212231 1490 net.cpp:144] Setting up drop7
I1003 02:03:07.212236 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.212239 1490 net.cpp:159] Memory required for data: 74407792
I1003 02:03:07.212241 1490 layer_factory.hpp:77] Creating layer score_fr
I1003 02:03:07.212252 1490 net.cpp:94] Creating Layer score_fr
I1003 02:03:07.212255 1490 net.cpp:435] score_fr <- fc7
I1003 02:03:07.212260 1490 net.cpp:409] score_fr -> score_fr
I1003 02:03:07.212563 1490 net.cpp:144] Setting up score_fr
I1003 02:03:07.212570 1490 net.cpp:151] Top shape: 1 21 11 16 (3696)
I1003 02:03:07.212574 1490 net.cpp:159] Memory required for data: 74422576
I1003 02:03:07.212579 1490 layer_factory.hpp:77] Creating layer upscore
I1003 02:03:07.212595 1490 net.cpp:94] Creating Layer upscore
I1003 02:03:07.212599 1490 net.cpp:435] upscore <- score_fr
I1003 02:03:07.212604 1490 net.cpp:409] upscore -> upscore
I1003 02:03:07.216276 1490 net.cpp:144] Setting up upscore
I1003 02:03:07.216284 1490 net.cpp:151] Top shape: 1 21 383 543 (4367349)
I1003 02:03:07.216287 1490 net.cpp:159] Memory required for data: 91891972
I1003 02:03:07.216298 1490 layer_factory.hpp:77] Creating layer score
I1003 02:03:07.216305 1490 net.cpp:94] Creating Layer score
I1003 02:03:07.216308 1490 net.cpp:435] score <- upscore
I1003 02:03:07.216313 1490 net.cpp:435] score <- data_data_0_split_1
I1003 02:03:07.216320 1490 net.cpp:409] score -> score
I1003 02:03:07.216344 1490 net.cpp:144] Setting up score
I1003 02:03:07.216349 1490 net.cpp:151] Top shape: 1 21 335 500 (3517500)
I1003 02:03:07.216352 1490 net.cpp:159] Memory required for data: 105961972
I1003 02:03:07.216356 1490 layer_factory.hpp:77] Creating layer loss
I1003 02:03:07.216362 1490 net.cpp:94] Creating Layer loss
I1003 02:03:07.216364 1490 net.cpp:435] loss <- score
I1003 02:03:07.216369 1490 net.cpp:435] loss <- label
I1003 02:03:07.216373 1490 net.cpp:409] loss -> loss
I1003 02:03:07.216383 1490 layer_factory.hpp:77] Creating layer loss
I1003 02:03:07.223633 1490 net.cpp:144] Setting up loss
I1003 02:03:07.223644 1490 net.cpp:151] Top shape: (1)
I1003 02:03:07.223647 1490 net.cpp:154] with loss weight 1
I1003 02:03:07.223665 1490 net.cpp:159] Memory required for data: 105961976
I1003 02:03:07.223670 1490 net.cpp:220] loss needs backward computation.
I1003 02:03:07.223677 1490 net.cpp:220] score needs backward computation.
I1003 02:03:07.223681 1490 net.cpp:220] upscore needs backward computation.
I1003 02:03:07.223685 1490 net.cpp:220] score_fr needs backward computation.
I1003 02:03:07.223688 1490 net.cpp:220] drop7 needs backward computation.
I1003 02:03:07.223692 1490 net.cpp:220] relu7 needs backward computation.
I1003 02:03:07.223695 1490 net.cpp:220] fc7 needs backward computation.
I1003 02:03:07.223698 1490 net.cpp:220] drop6 needs backward computation.
I1003 02:03:07.223701 1490 net.cpp:220] relu6 needs backward computation.
I1003 02:03:07.223706 1490 net.cpp:220] fc6 needs backward computation.
I1003 02:03:07.223708 1490 net.cpp:220] pool5 needs backward computation.
I1003 02:03:07.223712 1490 net.cpp:220] relu5 needs backward computation.
I1003 02:03:07.223736 1490 net.cpp:220] conv5 needs backward computation.
I1003 02:03:07.223740 1490 net.cpp:220] relu4 needs backward computation.
I1003 02:03:07.223743 1490 net.cpp:220] conv4 needs backward computation.
I1003 02:03:07.223747 1490 net.cpp:220] relu3 needs backward computation.
I1003 02:03:07.223748 1490 net.cpp:220] conv3 needs backward computation.
I1003 02:03:07.223752 1490 net.cpp:220] norm2 needs backward computation.
I1003 02:03:07.223757 1490 net.cpp:220] pool2 needs backward computation.
I1003 02:03:07.223759 1490 net.cpp:220] relu2 needs backward computation.
I1003 02:03:07.223762 1490 net.cpp:220] conv2 needs backward computation.
I1003 02:03:07.223767 1490 net.cpp:220] norm1 needs backward computation.
I1003 02:03:07.223769 1490 net.cpp:220] pool1 needs backward computation.
I1003 02:03:07.223773 1490 net.cpp:220] relu1 needs backward computation.
I1003 02:03:07.223775 1490 net.cpp:220] conv1 needs backward computation.
I1003 02:03:07.223780 1490 net.cpp:222] shift does not need backward computation.
I1003 02:03:07.223784 1490 net.cpp:222] label does not need backward computation.
I1003 02:03:07.223788 1490 net.cpp:222] data_data_0_split does not need backward computation.
I1003 02:03:07.223791 1490 net.cpp:222] data does not need backward computation.
I1003 02:03:07.223795 1490 net.cpp:264] This network produces output loss
I1003 02:03:07.223817 1490 net.cpp:284] Network initialization done.
I1003 02:03:07.224094 1490 solver.cpp:181] Creating test net (#0) specified by net file: train_val.prototxt
I1003 02:03:07.224128 1490 net.cpp:323] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
I1003 02:03:07.224133 1490 net.cpp:323] The NetState phase (1) differed from the phase (0) specified by a rule in layer label
I1003 02:03:07.224272 1490 net.cpp:52] Initializing net from parameters:
state {
phase: TEST
}
layer {
name: "data"
type: "Data"
top: "data"
include {
phase: TEST
}
data_param {
source: "/home/ubuntu/digits/digits/jobs/20171002-224732-1d65/val_db/features"
batch_size: 1
backend: LMDB
}
}
layer {
name: "label"
type: "Data"
top: "label"
include {
phase: TEST
}
data_param {
source: "/home/ubuntu/digits/digits/jobs/20171002-224732-1d65/val_db/labels"
batch_size: 1
backend: LMDB
}
}
layer {
name: "shift"
type: "Power"
bottom: "data"
top: "data_preprocessed"
power_param {
shift: -116
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data_preprocessed"
top: "conv1"
convolution_param {
num_output: 96
pad: 100
kernel_size: 11
group: 1
stride: 4
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "conv1"
top: "conv1"
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm1"
type: "LRN"
bottom: "pool1"
top: "norm1"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "norm1"
top: "conv2"
convolution_param {
num_output: 256
pad: 2
kernel_size: 5
group: 2
stride: 1
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "norm2"
type: "LRN"
bottom: "pool2"
top: "norm2"
lrn_param {
local_size: 5
alpha: 0.0001
beta: 0.75
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "norm2"
top: "conv3"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 1
stride: 1
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "conv4"
type: "Convolution"
bottom: "conv3"
top: "conv4"
convolution_param {
num_output: 384
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu4"
type: "ReLU"
bottom: "conv4"
top: "conv4"
}
layer {
name: "conv5"
type: "Convolution"
bottom: "conv4"
top: "conv5"
convolution_param {
num_output: 256
pad: 1
kernel_size: 3
group: 2
stride: 1
}
}
layer {
name: "relu5"
type: "ReLU"
bottom: "conv5"
top: "conv5"
}
layer {
name: "pool5"
type: "Pooling"
bottom: "conv5"
top: "pool5"
pooling_param {
pool: MAX
kernel_size: 3
stride: 2
}
}
layer {
name: "fc6"
type: "Convolution"
bottom: "pool5"
top: "fc6"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 6
group: 1
stride: 1
}
}
layer {
name: "relu6"
type: "ReLU"
bottom: "fc6"
top: "fc6"
}
layer {
name: "drop6"
type: "Dropout"
bottom: "fc6"
top: "fc6"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "fc7"
type: "Convolution"
bottom: "fc6"
top: "fc7"
convolution_param {
num_output: 4096
pad: 0
kernel_size: 1
group: 1
stride: 1
}
}
layer {
name: "relu7"
type: "ReLU"
bottom: "fc7"
top: "fc7"
}
layer {
name: "drop7"
type: "Dropout"
bottom: "fc7"
top: "fc7"
dropout_param {
dropout_ratio: 0.5
}
}
layer {
name: "score_fr"
type: "Convolution"
bottom: "fc7"
top: "score_fr"
param {
lr_mult: 1
decay_mult: 1
}
param {
lr_mult: 2
decay_mult: 0
}
convolution_param {
num_output: 21
pad: 0
kernel_size: 1
}
}
layer {
name: "upscore"
type: "Deconvolution"
bottom: "score_fr"
top: "upscore"
param {
lr_mult: 0
}
convolution_param {
num_output: 21
bias_term: false
kernel_size: 63
group: 21
stride: 32
weight_filler {
type: "bilinear"
}
}
}
layer {
name: "score"
type: "Crop"
bottom: "upscore"
bottom: "data"
top: "score"
crop_param {
axis: 2
offset: 18
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "score"
bottom: "label"
top: "loss"
loss_param {
ignore_label: 255
normalize: true
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "score"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
accuracy_param {
ignore_label: 255
}
}
I1003 02:03:07.224380 1490 layer_factory.hpp:77] Creating layer data
I1003 02:03:07.224632 1490 net.cpp:94] Creating Layer data
I1003 02:03:07.224643 1490 net.cpp:409] data -> data
I1003 02:03:07.227438 1505 db_lmdb.cpp:35] Opened lmdb /home/ubuntu/digits/digits/jobs/20171002-224732-1d65/val_db/features
I1003 02:03:07.241772 1490 data_layer.cpp:78] ReshapePrefetch 1, 3, 332, 500
I1003 02:03:07.241822 1490 data_layer.cpp:83] output data size: 1,3,332,500
I1003 02:03:07.247119 1490 net.cpp:144] Setting up data
I1003 02:03:07.247138 1490 net.cpp:151] Top shape: 1 3 332 500 (498000)
I1003 02:03:07.247144 1490 net.cpp:159] Memory required for data: 1992000
I1003 02:03:07.247150 1490 layer_factory.hpp:77] Creating layer data_data_0_split
I1003 02:03:07.247164 1490 net.cpp:94] Creating Layer data_data_0_split
I1003 02:03:07.247170 1490 net.cpp:435] data_data_0_split <- data
I1003 02:03:07.247179 1490 net.cpp:409] data_data_0_split -> data_data_0_split_0
I1003 02:03:07.247191 1490 net.cpp:409] data_data_0_split -> data_data_0_split_1
I1003 02:03:07.247283 1490 net.cpp:144] Setting up data_data_0_split
I1003 02:03:07.247297 1490 net.cpp:151] Top shape: 1 3 332 500 (498000)
I1003 02:03:07.247303 1490 net.cpp:151] Top shape: 1 3 332 500 (498000)
I1003 02:03:07.247308 1490 net.cpp:159] Memory required for data: 5976000
I1003 02:03:07.247314 1490 layer_factory.hpp:77] Creating layer label
I1003 02:03:07.247501 1490 net.cpp:94] Creating Layer label
I1003 02:03:07.247515 1490 net.cpp:409] label -> label
I1003 02:03:07.250566 1511 db_lmdb.cpp:35] Opened lmdb /home/ubuntu/digits/digits/jobs/20171002-224732-1d65/val_db/labels
I1003 02:03:07.253024 1490 data_layer.cpp:78] ReshapePrefetch 1, 1, 332, 500
I1003 02:03:07.253098 1490 data_layer.cpp:83] output data size: 1,1,332,500
I1003 02:03:07.258059 1490 net.cpp:144] Setting up label
I1003 02:03:07.258090 1490 net.cpp:151] Top shape: 1 1 332 500 (166000)
I1003 02:03:07.258098 1490 net.cpp:159] Memory required for data: 6640000
I1003 02:03:07.258103 1490 layer_factory.hpp:77] Creating layer label_label_0_split
I1003 02:03:07.258112 1490 net.cpp:94] Creating Layer label_label_0_split
I1003 02:03:07.258117 1490 net.cpp:435] label_label_0_split <- label
I1003 02:03:07.258127 1490 net.cpp:409] label_label_0_split -> label_label_0_split_0
I1003 02:03:07.258139 1490 net.cpp:409] label_label_0_split -> label_label_0_split_1
I1003 02:03:07.258258 1490 net.cpp:144] Setting up label_label_0_split
I1003 02:03:07.258270 1490 net.cpp:151] Top shape: 1 1 332 500 (166000)
I1003 02:03:07.258277 1490 net.cpp:151] Top shape: 1 1 332 500 (166000)
I1003 02:03:07.258281 1490 net.cpp:159] Memory required for data: 7968000
I1003 02:03:07.258285 1490 layer_factory.hpp:77] Creating layer shift
I1003 02:03:07.258296 1490 net.cpp:94] Creating Layer shift
I1003 02:03:07.258302 1490 net.cpp:435] shift <- data_data_0_split_0
I1003 02:03:07.258311 1490 net.cpp:409] shift -> data_preprocessed
I1003 02:03:07.258340 1490 net.cpp:144] Setting up shift
I1003 02:03:07.258347 1490 net.cpp:151] Top shape: 1 3 332 500 (498000)
I1003 02:03:07.258352 1490 net.cpp:159] Memory required for data: 9960000
I1003 02:03:07.258357 1490 layer_factory.hpp:77] Creating layer conv1
I1003 02:03:07.258365 1490 net.cpp:94] Creating Layer conv1
I1003 02:03:07.258370 1490 net.cpp:435] conv1 <- data_preprocessed
I1003 02:03:07.258380 1490 net.cpp:409] conv1 -> conv1
I1003 02:03:07.258749 1490 net.cpp:144] Setting up conv1
I1003 02:03:07.258759 1490 net.cpp:151] Top shape: 1 96 131 173 (2175648)
I1003 02:03:07.258764 1490 net.cpp:159] Memory required for data: 18662592
I1003 02:03:07.258779 1490 layer_factory.hpp:77] Creating layer relu1
I1003 02:03:07.258785 1490 net.cpp:94] Creating Layer relu1
I1003 02:03:07.258790 1490 net.cpp:435] relu1 <- conv1
I1003 02:03:07.258796 1490 net.cpp:396] relu1 -> conv1 (in-place)
I1003 02:03:07.258806 1490 net.cpp:144] Setting up relu1
I1003 02:03:07.258812 1490 net.cpp:151] Top shape: 1 96 131 173 (2175648)
I1003 02:03:07.258816 1490 net.cpp:159] Memory required for data: 27365184
I1003 02:03:07.258821 1490 layer_factory.hpp:77] Creating layer pool1
I1003 02:03:07.258827 1490 net.cpp:94] Creating Layer pool1
I1003 02:03:07.258831 1490 net.cpp:435] pool1 <- conv1
I1003 02:03:07.258837 1490 net.cpp:409] pool1 -> pool1
I1003 02:03:07.258888 1490 net.cpp:144] Setting up pool1
I1003 02:03:07.258896 1490 net.cpp:151] Top shape: 1 96 65 86 (536640)
I1003 02:03:07.258900 1490 net.cpp:159] Memory required for data: 29511744
I1003 02:03:07.258904 1490 layer_factory.hpp:77] Creating layer norm1
I1003 02:03:07.258914 1490 net.cpp:94] Creating Layer norm1
I1003 02:03:07.258920 1490 net.cpp:435] norm1 <- pool1
I1003 02:03:07.258927 1490 net.cpp:409] norm1 -> norm1
I1003 02:03:07.259143 1490 net.cpp:144] Setting up norm1
I1003 02:03:07.259153 1490 net.cpp:151] Top shape: 1 96 65 86 (536640)
I1003 02:03:07.259160 1490 net.cpp:159] Memory required for data: 31658304
I1003 02:03:07.259163 1490 layer_factory.hpp:77] Creating layer conv2
I1003 02:03:07.259173 1490 net.cpp:94] Creating Layer conv2
I1003 02:03:07.259178 1490 net.cpp:435] conv2 <- norm1
I1003 02:03:07.259187 1490 net.cpp:409] conv2 -> conv2
I1003 02:03:07.260309 1490 net.cpp:144] Setting up conv2
I1003 02:03:07.260325 1490 net.cpp:151] Top shape: 1 256 65 86 (1431040)
I1003 02:03:07.260330 1490 net.cpp:159] Memory required for data: 37382464
I1003 02:03:07.260344 1490 layer_factory.hpp:77] Creating layer relu2
I1003 02:03:07.260352 1490 net.cpp:94] Creating Layer relu2
I1003 02:03:07.260357 1490 net.cpp:435] relu2 <- conv2
I1003 02:03:07.260365 1490 net.cpp:396] relu2 -> conv2 (in-place)
I1003 02:03:07.260375 1490 net.cpp:144] Setting up relu2
I1003 02:03:07.260381 1490 net.cpp:151] Top shape: 1 256 65 86 (1431040)
I1003 02:03:07.260385 1490 net.cpp:159] Memory required for data: 43106624
I1003 02:03:07.260390 1490 layer_factory.hpp:77] Creating layer pool2
I1003 02:03:07.260412 1490 net.cpp:94] Creating Layer pool2
I1003 02:03:07.260417 1490 net.cpp:435] pool2 <- conv2
I1003 02:03:07.260424 1490 net.cpp:409] pool2 -> pool2
I1003 02:03:07.260474 1490 net.cpp:144] Setting up pool2
I1003 02:03:07.260483 1490 net.cpp:151] Top shape: 1 256 32 43 (352256)
I1003 02:03:07.260488 1490 net.cpp:159] Memory required for data: 44515648
I1003 02:03:07.260493 1490 layer_factory.hpp:77] Creating layer norm2
I1003 02:03:07.260499 1490 net.cpp:94] Creating Layer norm2
I1003 02:03:07.260505 1490 net.cpp:435] norm2 <- pool2
I1003 02:03:07.260511 1490 net.cpp:409] norm2 -> norm2
I1003 02:03:07.260556 1490 net.cpp:144] Setting up norm2
I1003 02:03:07.260565 1490 net.cpp:151] Top shape: 1 256 32 43 (352256)
I1003 02:03:07.260571 1490 net.cpp:159] Memory required for data: 45924672
I1003 02:03:07.260574 1490 layer_factory.hpp:77] Creating layer conv3
I1003 02:03:07.260583 1490 net.cpp:94] Creating Layer conv3
I1003 02:03:07.260588 1490 net.cpp:435] conv3 <- norm2
I1003 02:03:07.260596 1490 net.cpp:409] conv3 -> conv3
I1003 02:03:07.262853 1490 net.cpp:144] Setting up conv3
I1003 02:03:07.262869 1490 net.cpp:151] Top shape: 1 384 32 43 (528384)
I1003 02:03:07.262874 1490 net.cpp:159] Memory required for data: 48038208
I1003 02:03:07.262887 1490 layer_factory.hpp:77] Creating layer relu3
I1003 02:03:07.262895 1490 net.cpp:94] Creating Layer relu3
I1003 02:03:07.262900 1490 net.cpp:435] relu3 <- conv3
I1003 02:03:07.262908 1490 net.cpp:396] relu3 -> conv3 (in-place)
I1003 02:03:07.262919 1490 net.cpp:144] Setting up relu3
I1003 02:03:07.262926 1490 net.cpp:151] Top shape: 1 384 32 43 (528384)
I1003 02:03:07.262930 1490 net.cpp:159] Memory required for data: 50151744
I1003 02:03:07.262934 1490 layer_factory.hpp:77] Creating layer conv4
I1003 02:03:07.262944 1490 net.cpp:94] Creating Layer conv4
I1003 02:03:07.262949 1490 net.cpp:435] conv4 <- conv3
I1003 02:03:07.262956 1490 net.cpp:409] conv4 -> conv4
I1003 02:03:07.264798 1490 net.cpp:144] Setting up conv4
I1003 02:03:07.264816 1490 net.cpp:151] Top shape: 1 384 32 43 (528384)
I1003 02:03:07.264820 1490 net.cpp:159] Memory required for data: 52265280
I1003 02:03:07.264829 1490 layer_factory.hpp:77] Creating layer relu4
I1003 02:03:07.264840 1490 net.cpp:94] Creating Layer relu4
I1003 02:03:07.264847 1490 net.cpp:435] relu4 <- conv4
I1003 02:03:07.264853 1490 net.cpp:396] relu4 -> conv4 (in-place)
I1003 02:03:07.264864 1490 net.cpp:144] Setting up relu4
I1003 02:03:07.264870 1490 net.cpp:151] Top shape: 1 384 32 43 (528384)
I1003 02:03:07.264876 1490 net.cpp:159] Memory required for data: 54378816
I1003 02:03:07.264880 1490 layer_factory.hpp:77] Creating layer conv5
I1003 02:03:07.264889 1490 net.cpp:94] Creating Layer conv5
I1003 02:03:07.264894 1490 net.cpp:435] conv5 <- conv4
I1003 02:03:07.264901 1490 net.cpp:409] conv5 -> conv5
I1003 02:03:07.267729 1490 net.cpp:144] Setting up conv5
I1003 02:03:07.267746 1490 net.cpp:151] Top shape: 1 256 32 43 (352256)
I1003 02:03:07.267751 1490 net.cpp:159] Memory required for data: 55787840
I1003 02:03:07.267766 1490 layer_factory.hpp:77] Creating layer relu5
I1003 02:03:07.267774 1490 net.cpp:94] Creating Layer relu5
I1003 02:03:07.267779 1490 net.cpp:435] relu5 <- conv5
I1003 02:03:07.267787 1490 net.cpp:396] relu5 -> conv5 (in-place)
I1003 02:03:07.267799 1490 net.cpp:144] Setting up relu5
I1003 02:03:07.267805 1490 net.cpp:151] Top shape: 1 256 32 43 (352256)
I1003 02:03:07.267809 1490 net.cpp:159] Memory required for data: 57196864
I1003 02:03:07.267814 1490 layer_factory.hpp:77] Creating layer pool5
I1003 02:03:07.267822 1490 net.cpp:94] Creating Layer pool5
I1003 02:03:07.267827 1490 net.cpp:435] pool5 <- conv5
I1003 02:03:07.267833 1490 net.cpp:409] pool5 -> pool5
I1003 02:03:07.267887 1490 net.cpp:144] Setting up pool5
I1003 02:03:07.267897 1490 net.cpp:151] Top shape: 1 256 16 21 (86016)
I1003 02:03:07.267901 1490 net.cpp:159] Memory required for data: 57540928
I1003 02:03:07.267909 1490 layer_factory.hpp:77] Creating layer fc6
I1003 02:03:07.267937 1490 net.cpp:94] Creating Layer fc6
I1003 02:03:07.267944 1490 net.cpp:435] fc6 <- pool5
I1003 02:03:07.267952 1490 net.cpp:409] fc6 -> fc6
I1003 02:03:07.370440 1490 net.cpp:144] Setting up fc6
I1003 02:03:07.370471 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.370476 1490 net.cpp:159] Memory required for data: 60424512
I1003 02:03:07.370486 1490 layer_factory.hpp:77] Creating layer relu6
I1003 02:03:07.370496 1490 net.cpp:94] Creating Layer relu6
I1003 02:03:07.370499 1490 net.cpp:435] relu6 <- fc6
I1003 02:03:07.370507 1490 net.cpp:396] relu6 -> fc6 (in-place)
I1003 02:03:07.370518 1490 net.cpp:144] Setting up relu6
I1003 02:03:07.370522 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.370524 1490 net.cpp:159] Memory required for data: 63308096
I1003 02:03:07.370527 1490 layer_factory.hpp:77] Creating layer drop6
I1003 02:03:07.370533 1490 net.cpp:94] Creating Layer drop6
I1003 02:03:07.370537 1490 net.cpp:435] drop6 <- fc6
I1003 02:03:07.370540 1490 net.cpp:396] drop6 -> fc6 (in-place)
I1003 02:03:07.370568 1490 net.cpp:144] Setting up drop6
I1003 02:03:07.370573 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.370575 1490 net.cpp:159] Memory required for data: 66191680
I1003 02:03:07.370579 1490 layer_factory.hpp:77] Creating layer fc7
I1003 02:03:07.370586 1490 net.cpp:94] Creating Layer fc7
I1003 02:03:07.370589 1490 net.cpp:435] fc7 <- fc6
I1003 02:03:07.370594 1490 net.cpp:409] fc7 -> fc7
I1003 02:03:07.415993 1490 net.cpp:144] Setting up fc7
I1003 02:03:07.416018 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.416020 1490 net.cpp:159] Memory required for data: 69075264
I1003 02:03:07.416028 1490 layer_factory.hpp:77] Creating layer relu7
I1003 02:03:07.416038 1490 net.cpp:94] Creating Layer relu7
I1003 02:03:07.416043 1490 net.cpp:435] relu7 <- fc7
I1003 02:03:07.416049 1490 net.cpp:396] relu7 -> fc7 (in-place)
I1003 02:03:07.416061 1490 net.cpp:144] Setting up relu7
I1003 02:03:07.416064 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.416067 1490 net.cpp:159] Memory required for data: 71958848
I1003 02:03:07.416070 1490 layer_factory.hpp:77] Creating layer drop7
I1003 02:03:07.416076 1490 net.cpp:94] Creating Layer drop7
I1003 02:03:07.416079 1490 net.cpp:435] drop7 <- fc7
I1003 02:03:07.416085 1490 net.cpp:396] drop7 -> fc7 (in-place)
I1003 02:03:07.416111 1490 net.cpp:144] Setting up drop7
I1003 02:03:07.416115 1490 net.cpp:151] Top shape: 1 4096 11 16 (720896)
I1003 02:03:07.416118 1490 net.cpp:159] Memory required for data: 74842432
I1003 02:03:07.416121 1490 layer_factory.hpp:77] Creating layer score_fr
I1003 02:03:07.416131 1490 net.cpp:94] Creating Layer score_fr
I1003 02:03:07.416132 1490 net.cpp:435] score_fr <- fc7
I1003 02:03:07.416138 1490 net.cpp:409] score_fr -> score_fr
I1003 02:03:07.416460 1490 net.cpp:144] Setting up score_fr
I1003 02:03:07.416466 1490 net.cpp:151] Top shape: 1 21 11 16 (3696)
I1003 02:03:07.416468 1490 net.cpp:159] Memory required for data: 74857216
I1003 02:03:07.416473 1490 layer_factory.hpp:77] Creating layer upscore
I1003 02:03:07.416481 1490 net.cpp:94] Creating Layer upscore
I1003 02:03:07.416484 1490 net.cpp:435] upscore <- score_fr
I1003 02:03:07.416491 1490 net.cpp:409] upscore -> upscore
I1003 02:03:07.420737 1490 net.cpp:144] Setting up upscore
I1003 02:03:07.420749 1490 net.cpp:151] Top shape: 1 21 383 543 (4367349)
I1003 02:03:07.420753 1490 net.cpp:159] Memory required for data: 92326612
I1003 02:03:07.420763 1490 layer_factory.hpp:77] Creating layer score
I1003 02:03:07.420769 1490 net.cpp:94] Creating Layer score
I1003 02:03:07.420773 1490 net.cpp:435] score <- upscore
I1003 02:03:07.420778 1490 net.cpp:435] score <- data_data_0_split_1
I1003 02:03:07.420783 1490 net.cpp:409] score -> score
I1003 02:03:07.420809 1490 net.cpp:144] Setting up score
I1003 02:03:07.420814 1490 net.cpp:151] Top shape: 1 21 332 500 (3486000)
I1003 02:03:07.420817 1490 net.cpp:159] Memory required for data: 106270612
I1003 02:03:07.420835 1490 layer_factory.hpp:77] Creating layer score_score_0_split
I1003 02:03:07.420840 1490 net.cpp:94] Creating Layer score_score_0_split
I1003 02:03:07.420845 1490 net.cpp:435] score_score_0_split <- score
I1003 02:03:07.420850 1490 net.cpp:409] score_score_0_split -> score_score_0_split_0
I1003 02:03:07.420855 1490 net.cpp:409] score_score_0_split -> score_score_0_split_1
I1003 02:03:07.420892 1490 net.cpp:144] Setting up score_score_0_split
I1003 02:03:07.420897 1490 net.cpp:151] Top shape: 1 21 332 500 (3486000)
I1003 02:03:07.420899 1490 net.cpp:151] Top shape: 1 21 332 500 (3486000)
I1003 02:03:07.420902 1490 net.cpp:159] Memory required for data: 134158612
I1003 02:03:07.420904 1490 layer_factory.hpp:77] Creating layer loss
I1003 02:03:07.420912 1490 net.cpp:94] Creating Layer loss
I1003 02:03:07.420914 1490 net.cpp:435] loss <- score_score_0_split_0
I1003 02:03:07.420918 1490 net.cpp:435] loss <- label_label_0_split_0
I1003 02:03:07.420923 1490 net.cpp:409] loss -> loss
I1003 02:03:07.420930 1490 layer_factory.hpp:77] Creating layer loss
I1003 02:03:07.427841 1490 net.cpp:144] Setting up loss
I1003 02:03:07.427852 1490 net.cpp:151] Top shape: (1)
I1003 02:03:07.427855 1490 net.cpp:154] with loss weight 1
I1003 02:03:07.427867 1490 net.cpp:159] Memory required for data: 134158616
I1003 02:03:07.427870 1490 layer_factory.hpp:77] Creating layer accuracy
I1003 02:03:07.427878 1490 net.cpp:94] Creating Layer accuracy
I1003 02:03:07.427882 1490 net.cpp:435] accuracy <- score_score_0_split_1
I1003 02:03:07.427887 1490 net.cpp:435] accuracy <- label_label_0_split_1
I1003 02:03:07.427892 1490 net.cpp:409] accuracy -> accuracy
I1003 02:03:07.427907 1490 net.cpp:144] Setting up accuracy
I1003 02:03:07.427912 1490 net.cpp:151] Top shape: (1)
I1003 02:03:07.427916 1490 net.cpp:159] Memory required for data: 134158620
I1003 02:03:07.427917 1490 net.cpp:222] accuracy does not need backward computation.
I1003 02:03:07.427922 1490 net.cpp:220] loss needs backward computation.
I1003 02:03:07.427925 1490 net.cpp:220] score_score_0_split needs backward computation.
I1003 02:03:07.427929 1490 net.cpp:220] score needs backward computation.
I1003 02:03:07.427933 1490 net.cpp:220] upscore needs backward computation.
I1003 02:03:07.427937 1490 net.cpp:220] score_fr needs backward computation.
I1003 02:03:07.427939 1490 net.cpp:220] drop7 needs backward computation.
I1003 02:03:07.427943 1490 net.cpp:220] relu7 needs backward computation.
I1003 02:03:07.427945 1490 net.cpp:220] fc7 needs backward computation.
I1003 02:03:07.427949 1490 net.cpp:220] drop6 needs backward computation.
I1003 02:03:07.427953 1490 net.cpp:220] relu6 needs backward computation.
I1003 02:03:07.427955 1490 net.cpp:220] fc6 needs backward computation.
I1003 02:03:07.427959 1490 net.cpp:220] pool5 needs backward computation.
I1003 02:03:07.427963 1490 net.cpp:220] relu5 needs backward computation.
I1003 02:03:07.427965 1490 net.cpp:220] conv5 needs backward computation.
I1003 02:03:07.427969 1490 net.cpp:220] relu4 needs backward computation.
I1003 02:03:07.427973 1490 net.cpp:220] conv4 needs backward computation.
I1003 02:03:07.427974 1490 net.cpp:220] relu3 needs backward computation.
I1003 02:03:07.427978 1490 net.cpp:220] conv3 needs backward computation.
I1003 02:03:07.427981 1490 net.cpp:220] norm2 needs backward computation.
I1003 02:03:07.427985 1490 net.cpp:220] pool2 needs backward computation.
I1003 02:03:07.427989 1490 net.cpp:220] relu2 needs backward computation.
I1003 02:03:07.427991 1490 net.cpp:220] conv2 needs backward computation.
I1003 02:03:07.427995 1490 net.cpp:220] norm1 needs backward computation.
I1003 02:03:07.427999 1490 net.cpp:220] pool1 needs backward computation.
I1003 02:03:07.428001 1490 net.cpp:220] relu1 needs backward computation.
I1003 02:03:07.428004 1490 net.cpp:220] conv1 needs backward computation.
I1003 02:03:07.428009 1490 net.cpp:222] shift does not need backward computation.
I1003 02:03:07.428012 1490 net.cpp:222] label_label_0_split does not need backward computation.
I1003 02:03:07.428026 1490 net.cpp:222] label does not need backward computation.
I1003 02:03:07.428030 1490 net.cpp:222] data_data_0_split does not need backward computation.
I1003 02:03:07.428035 1490 net.cpp:222] data does not need backward computation.
I1003 02:03:07.428037 1490 net.cpp:264] This network produces output accuracy
I1003 02:03:07.428040 1490 net.cpp:264] This network produces output loss
I1003 02:03:07.428061 1490 net.cpp:284] Network initialization done.
I1003 02:03:07.428146 1490 solver.cpp:60] Solver scaffolding done.
I1003 02:03:07.428712 1490 caffe.cpp:231] Starting Optimization
I1003 02:03:07.428716 1490 solver.cpp:304] Solving
I1003 02:03:07.428719 1490 solver.cpp:305] Learning Rate Policy: step
I1003 02:03:07.432713 1490 solver.cpp:362] Iteration 0, Testing net (#0)
I1003 02:11:29.384356 1490 solver.cpp:429] Test net output #0: accuracy = 0.0070898
I1003 02:11:29.384430 1490 solver.cpp:429] Test net output #1: loss = 3.04448 (* 1 = 3.04448 loss)
I1003 02:11:29.583273 1490 solver.cpp:242] Iteration 0 (0 iter/s, 502.167s/183 iter), loss = 3.04452
I1003 02:11:29.583302 1490 solver.cpp:261] Train net output #0: loss = 3.04452 (* 1 = 3.04452 loss)
I1003 02:11:29.583317 1490 sgd_solver.cpp:106] Iteration 0, lr = 0.0001
F1003 02:11:31.702250 1490 gpu_memory.hpp:27] Out of memory: failed to allocate 150994944 bytes on device 0
*** Check failure stack trace: ***
@ 0x7f5b87b9f5cd google::LogMessage::Fail()
@ 0x7f5b87ba1433 google::LogMessage::SendToLog()
@ 0x7f5b87b9f15b google::LogMessage::Flush()
@ 0x7f5b87ba1e1e google::LogMessageFatal::~LogMessageFatal()
@ 0x7f5b880fd8fd caffe::GPUMemory::allocate<>()
@ 0x7f5b8818dbd4 caffe::CuDNNConvolutionLayer<>::FindExConvAlgo()
@ 0x7f5b8819268d caffe::CuDNNConvolutionLayer<>::Reshape()
@ 0x7f5b882fffb8 caffe::Net<>::ForwardFromTo()
@ 0x7f5b88300367 caffe::Net<>::Forward()
@ 0x7f5b882d1dcc caffe::Solver<>::Step()
@ 0x7f5b882d2a49 caffe::Solver<>::Solve()
@ 0x40c757 train()
@ 0x4086e8 main
@ 0x7f5b865f1830 __libc_start_main
@ 0x408e59 _start
@ (nil) (unknown)
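
For reference, a rough sanity check on these numbers (a sketch only; the float32 assumption and the caffe_output.log filename are mine, taken from the log above). The final "Memory required for data" figure per net only counts activations; weights, gradients, solver history and cuDNN workspaces come on top, and the stack trace shows the failure inside cuDNN convolution algorithm selection (FindExConvAlgo), so the 150,994,944-byte request is most likely a cuDNN workspace/buffer rather than one of the blobs listed above.

    # Sketch: sum up what the log itself reports (assumes float32 and the
    # caffe_output.log filename used above).
    import re

    log = open('caffe_output.log').read()
    data_bytes = [int(m) for m in re.findall(r'Memory required for data: (\d+)', log)]
    print('largest activation total reported: %.1f MiB' % (max(data_bytes) / 2**20))

    # The failed allocation is 150,994,944 bytes = 144 MiB, which happens to equal
    # the fc6 filter volume (4096 outputs x 256 inputs x 6 x 6 kernel) in float32 --
    # consistent with cuDNN setting up buffers for that layer.
    print(4096 * 256 * 6 * 6 * 4)  # -> 150994944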

DarylWM commented Oct 5, 2017

It would be great if simple questions like this were covered in the example's README, but I didn't see anything there. What spec of GPU was it tested on?

ethantang95 (Contributor) commented

Hmm, that's weird. When I ran it before on a single GTX 1080, it handled it without failing. Are you sure your GPU wasn't in use by anything else when you ran this?

DarylWM commented Oct 8, 2017 via email

useebear commented

You might need to avoid using the GPU for display at the same time. I came across the same problem, but when I switched to a different GPU it worked. Training needs about 11.2 GB of GPU memory.

"F1011 18:03:00.939154 12931 gpu_memory.hpp:27] Out of memory: failed to allocate 150994944 bytes on device 0"

totti0223 commented

I arrived at the same solution as useebear and it worked fine.
