- sina@sina-Z170X-UD3:~$ cd code/caffe/
- sina@sina-Z170X-UD3:~/code/caffe$ ./build/tools/caffe train --solver=examples/alexnetfinetune/solver_sina.prototxt
- I0803 12:10:35.137758 2175 caffe.cpp:218] Using GPUs 0
- I0803 12:10:35.359722 2175 caffe.cpp:223] GPU 0: GeForce GTX 1070
- I0803 12:10:38.265421 2175 solver.cpp:44] Initializing solver from parameters:
- test_iter: 280
- test_interval: 660
- base_lr: 0.001
- display: 20
- max_iter: 5000
- lr_policy: "step"
- gamma: 0.1
- momentum: 0.9
- weight_decay: 0.0005
- stepsize: 100000
- snapshot: 4999
- snapshot_prefix: "models/bvlc_alexnet/caffe_alexnet_sinatrain"
- solver_mode: GPU
- device_id: 0
- net: "examples/alexnetfinetune/alexnetsade-bn.prototxt"
- train_state {
- level: 0
- stage: ""
- }
- type: "SGD"
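The solver parameters above imply a concrete learning-rate schedule. With `lr_policy: "step"`, Caffe computes `lr = base_lr * gamma ^ floor(iter / stepsize)`; a minimal plain-Python sketch (function name is illustrative) shows that with `stepsize: 100000` and `max_iter: 5000` the rate never actually decays during this run:

```python
def step_lr(base_lr, gamma, stepsize, it):
    """Caffe "step" policy: lr = base_lr * gamma ** floor(it / stepsize)."""
    return base_lr * gamma ** (it // stepsize)

# The first decay would only happen at iteration 100000, far past max_iter 5000,
# so the whole run trains at the constant base_lr of 0.001.
print(step_lr(0.001, 0.1, 100000, 0))     # 0.001
print(step_lr(0.001, 0.1, 100000, 4999))  # 0.001
```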
- I0803 12:10:38.265714 2175 solver.cpp:87] Creating training net from net file: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0803 12:10:38.286698 2175 upgrade_proto.cpp:77] Attempting to upgrade batch norm layers using deprecated params: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0803 12:10:38.286759 2175 upgrade_proto.cpp:80] Successfully upgraded batch norm layers using deprecated params.
- I0803 12:10:38.286895 2175 net.cpp:296] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
- I0803 12:10:38.286932 2175 net.cpp:296] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
- I0803 12:10:38.287377 2175 net.cpp:53] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TRAIN
- level: 0
- stage: ""
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TRAIN
- }
- transform_param {
- mirror: true
- crop_size: 227
- mean_file: "examples/Mydataset_train_lmdb/mean_imagetest.binaryproto"
- }
- data_param {
- source: "examples/Mydataset_train_lmdb"
- batch_size: 10
- backend: LMDB
- }
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- }
- convolution_param {
- num_output: 64
- bias_term: true
- pad: 1
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "xavier"
- }
- }
- }
- layer {
- name: "bn1"
- type: "BatchNorm"
- bottom: "conv1"
- top: "conv1"
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- }
- layer {
- name: "scale1"
- type: "Scale"
- bottom: "conv1"
- top: "conv1"
- scale_param {
- bias_term: true
- }
- }
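The `bn1`/`scale1` pair above is the standard Caffe idiom: `BatchNorm` only whitens (its three blobs hold the running mean, variance, and moving-average factor, all frozen here with `lr_mult: 0`), while the separate `Scale` layer supplies the learnable per-channel gamma/beta (`bias_term: true`). A rough single-channel sketch of the combined forward pass, using plain Python (the `eps` value is illustrative, not taken from this net):

```python
import math
import random

def bn_scale_forward(xs, gamma, beta, eps=1e-5):
    """BatchNorm part: whiten with batch statistics;
    Scale part: learnable affine gamma * x_hat + beta."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]

random.seed(0)
batch = [random.gauss(0, 1) for _ in range(10)]   # one channel, batch of 10
out = bn_scale_forward(batch, gamma=1.0, beta=0.0)
print(round(sum(out) / len(out), 6))  # mean is ~0 after normalization
```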
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
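The `Top shape` lines that appear later in the log can be reproduced from the conv/pool parameters above: Caffe floors the convolution output size but uses ceil for pooling, and this net's `conv1` is a 3x3/stride-1/pad-1 layer rather than the original AlexNet 11x11/stride-4, which is why `pool1` comes out at 113 instead of the classic 27. A quick sketch of the spatial-size arithmetic:

```python
import math

def conv_out(i, k, pad, stride=1):
    return (i + 2 * pad - k) // stride + 1               # conv floors

def pool_out(i, k, stride):
    return int(math.ceil((i - k) / float(stride))) + 1   # pooling ceils

s = conv_out(227, 3, 1)   # conv1 -> 227
s = pool_out(s, 3, 2)     # pool1 -> 113
s = conv_out(s, 5, 2)     # conv2 -> 113
s = pool_out(s, 3, 2)     # pool2 -> 56
s = conv_out(s, 3, 1)     # conv3/conv4/conv5 -> 56
s = pool_out(s, 3, 2)     # pool5 -> 28
print(s)  # 28, matching "Top shape: 10 256 28 28" for pool5 in the log
```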
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 5
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
- I0803 12:10:38.288094 2175 layer_factory.hpp:77] Creating layer data
- I0803 12:10:38.463691 2175 db_lmdb.cpp:35] Opened lmdb examples/Mydataset_train_lmdb
- I0803 12:10:38.506412 2175 net.cpp:86] Creating Layer data
- I0803 12:10:38.506475 2175 net.cpp:382] data -> data
- I0803 12:10:38.506530 2175 net.cpp:382] data -> label
- I0803 12:10:38.506580 2175 data_transformer.cpp:25] Loading mean file from: examples/Mydataset_train_lmdb/mean_imagetest.binaryproto
- I0803 12:10:38.548476 2175 data_layer.cpp:45] output data size: 10,3,227,227
- I0803 12:10:38.560668 2175 net.cpp:124] Setting up data
- I0803 12:10:38.560693 2175 net.cpp:131] Top shape: 10 3 227 227 (1545870)
- I0803 12:10:38.560699 2175 net.cpp:131] Top shape: 10 (10)
- I0803 12:10:38.560703 2175 net.cpp:139] Memory required for data: 6183520
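The running "Memory required for data" counter can be checked by hand: it is the cumulative element count of every top blob allocated so far, times 4 bytes for float32. Verifying the two numbers above for the data layer:

```python
data_elems = 10 * 3 * 227 * 227   # data top "10 3 227 227" -> 1545870 elements
label_elems = 10                  # label top "10"

total_bytes = (data_elems + label_elems) * 4   # float32 = 4 bytes per element
print(data_elems, total_bytes)    # 1545870 6183520, matching the log lines above
```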
- I0803 12:10:38.560712 2175 layer_factory.hpp:77] Creating layer conv1
- I0803 12:10:38.560735 2175 net.cpp:86] Creating Layer conv1
- I0803 12:10:38.560742 2175 net.cpp:408] conv1 <- data
- I0803 12:10:38.560756 2175 net.cpp:382] conv1 -> conv1
- I0803 12:10:40.714188 2175 net.cpp:124] Setting up conv1
- I0803 12:10:40.714244 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:40.714278 2175 net.cpp:139] Memory required for data: 138097760
- I0803 12:10:40.714332 2175 layer_factory.hpp:77] Creating layer bn1
- I0803 12:10:40.714362 2175 net.cpp:86] Creating Layer bn1
- I0803 12:10:40.714376 2175 net.cpp:408] bn1 <- conv1
- I0803 12:10:40.714392 2175 net.cpp:369] bn1 -> conv1 (in-place)
- I0803 12:10:40.715009 2175 net.cpp:124] Setting up bn1
- I0803 12:10:40.715030 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:40.715041 2175 net.cpp:139] Memory required for data: 270012000
- I0803 12:10:40.715067 2175 layer_factory.hpp:77] Creating layer scale1
- I0803 12:10:40.715128 2175 net.cpp:86] Creating Layer scale1
- I0803 12:10:40.715138 2175 net.cpp:408] scale1 <- conv1
- I0803 12:10:40.715153 2175 net.cpp:369] scale1 -> conv1 (in-place)
- I0803 12:10:40.715266 2175 layer_factory.hpp:77] Creating layer scale1
- I0803 12:10:40.715734 2175 net.cpp:124] Setting up scale1
- I0803 12:10:40.715755 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:40.715765 2175 net.cpp:139] Memory required for data: 401926240
- I0803 12:10:40.715781 2175 layer_factory.hpp:77] Creating layer relu1
- I0803 12:10:40.715798 2175 net.cpp:86] Creating Layer relu1
- I0803 12:10:40.715809 2175 net.cpp:408] relu1 <- conv1
- I0803 12:10:40.715822 2175 net.cpp:369] relu1 -> conv1 (in-place)
- I0803 12:10:40.716305 2175 net.cpp:124] Setting up relu1
- I0803 12:10:40.716326 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:40.716336 2175 net.cpp:139] Memory required for data: 533840480
- I0803 12:10:40.716346 2175 layer_factory.hpp:77] Creating layer norm1
- I0803 12:10:40.716363 2175 net.cpp:86] Creating Layer norm1
- I0803 12:10:40.716374 2175 net.cpp:408] norm1 <- conv1
- I0803 12:10:40.716389 2175 net.cpp:382] norm1 -> norm1
- I0803 12:10:40.717486 2175 net.cpp:124] Setting up norm1
- I0803 12:10:40.717516 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:40.717525 2175 net.cpp:139] Memory required for data: 665754720
- I0803 12:10:40.717535 2175 layer_factory.hpp:77] Creating layer pool1
- I0803 12:10:40.717552 2175 net.cpp:86] Creating Layer pool1
- I0803 12:10:40.717563 2175 net.cpp:408] pool1 <- norm1
- I0803 12:10:40.717581 2175 net.cpp:382] pool1 -> pool1
- I0803 12:10:40.717699 2175 net.cpp:124] Setting up pool1
- I0803 12:10:40.717720 2175 net.cpp:131] Top shape: 10 64 113 113 (8172160)
- I0803 12:10:40.717730 2175 net.cpp:139] Memory required for data: 698443360
- I0803 12:10:40.717738 2175 layer_factory.hpp:77] Creating layer conv2
- I0803 12:10:40.717762 2175 net.cpp:86] Creating Layer conv2
- I0803 12:10:40.717774 2175 net.cpp:408] conv2 <- pool1
- I0803 12:10:40.717787 2175 net.cpp:382] conv2 -> conv2
- I0803 12:10:40.733384 2175 net.cpp:124] Setting up conv2
- I0803 12:10:40.733410 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:40.733415 2175 net.cpp:139] Memory required for data: 829197920
- I0803 12:10:40.733430 2175 layer_factory.hpp:77] Creating layer relu2
- I0803 12:10:40.733443 2175 net.cpp:86] Creating Layer relu2
- I0803 12:10:40.733448 2175 net.cpp:408] relu2 <- conv2
- I0803 12:10:40.733456 2175 net.cpp:369] relu2 -> conv2 (in-place)
- I0803 12:10:40.733969 2175 net.cpp:124] Setting up relu2
- I0803 12:10:40.733983 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:40.733985 2175 net.cpp:139] Memory required for data: 959952480
- I0803 12:10:40.733990 2175 layer_factory.hpp:77] Creating layer norm2
- I0803 12:10:40.733999 2175 net.cpp:86] Creating Layer norm2
- I0803 12:10:40.734002 2175 net.cpp:408] norm2 <- conv2
- I0803 12:10:40.734007 2175 net.cpp:382] norm2 -> norm2
- I0803 12:10:40.734232 2175 net.cpp:124] Setting up norm2
- I0803 12:10:40.734241 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:40.734244 2175 net.cpp:139] Memory required for data: 1090707040
- I0803 12:10:40.734248 2175 layer_factory.hpp:77] Creating layer pool2
- I0803 12:10:40.734269 2175 net.cpp:86] Creating Layer pool2
- I0803 12:10:40.734274 2175 net.cpp:408] pool2 <- norm2
- I0803 12:10:40.734280 2175 net.cpp:382] pool2 -> pool2
- I0803 12:10:40.734321 2175 net.cpp:124] Setting up pool2
- I0803 12:10:40.734329 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:40.734333 2175 net.cpp:139] Memory required for data: 1122819680
- I0803 12:10:40.734335 2175 layer_factory.hpp:77] Creating layer conv3
- I0803 12:10:40.734345 2175 net.cpp:86] Creating Layer conv3
- I0803 12:10:40.734349 2175 net.cpp:408] conv3 <- pool2
- I0803 12:10:40.734355 2175 net.cpp:382] conv3 -> conv3
- I0803 12:10:40.748950 2175 net.cpp:124] Setting up conv3
- I0803 12:10:40.748970 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:40.748973 2175 net.cpp:139] Memory required for data: 1170988640
- I0803 12:10:40.749014 2175 layer_factory.hpp:77] Creating layer relu3
- I0803 12:10:40.749023 2175 net.cpp:86] Creating Layer relu3
- I0803 12:10:40.749027 2175 net.cpp:408] relu3 <- conv3
- I0803 12:10:40.749034 2175 net.cpp:369] relu3 -> conv3 (in-place)
- I0803 12:10:40.749212 2175 net.cpp:124] Setting up relu3
- I0803 12:10:40.749219 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:40.749222 2175 net.cpp:139] Memory required for data: 1219157600
- I0803 12:10:40.749224 2175 layer_factory.hpp:77] Creating layer conv4
- I0803 12:10:40.749233 2175 net.cpp:86] Creating Layer conv4
- I0803 12:10:40.749235 2175 net.cpp:408] conv4 <- conv3
- I0803 12:10:40.749240 2175 net.cpp:382] conv4 -> conv4
- I0803 12:10:40.756110 2175 net.cpp:124] Setting up conv4
- I0803 12:10:40.756124 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:40.756127 2175 net.cpp:139] Memory required for data: 1267326560
- I0803 12:10:40.756134 2175 layer_factory.hpp:77] Creating layer relu4
- I0803 12:10:40.756139 2175 net.cpp:86] Creating Layer relu4
- I0803 12:10:40.756141 2175 net.cpp:408] relu4 <- conv4
- I0803 12:10:40.756145 2175 net.cpp:369] relu4 -> conv4 (in-place)
- I0803 12:10:40.756290 2175 net.cpp:124] Setting up relu4
- I0803 12:10:40.756297 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:40.756299 2175 net.cpp:139] Memory required for data: 1315495520
- I0803 12:10:40.756301 2175 layer_factory.hpp:77] Creating layer conv5
- I0803 12:10:40.756307 2175 net.cpp:86] Creating Layer conv5
- I0803 12:10:40.756310 2175 net.cpp:408] conv5 <- conv4
- I0803 12:10:40.756314 2175 net.cpp:382] conv5 -> conv5
- I0803 12:10:40.761378 2175 net.cpp:124] Setting up conv5
- I0803 12:10:40.761390 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:40.761394 2175 net.cpp:139] Memory required for data: 1347608160
- I0803 12:10:40.761399 2175 layer_factory.hpp:77] Creating layer relu5
- I0803 12:10:40.761406 2175 net.cpp:86] Creating Layer relu5
- I0803 12:10:40.761409 2175 net.cpp:408] relu5 <- conv5
- I0803 12:10:40.761414 2175 net.cpp:369] relu5 -> conv5 (in-place)
- I0803 12:10:40.761597 2175 net.cpp:124] Setting up relu5
- I0803 12:10:40.761603 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:40.761605 2175 net.cpp:139] Memory required for data: 1379720800
- I0803 12:10:40.761607 2175 layer_factory.hpp:77] Creating layer pool5
- I0803 12:10:40.761612 2175 net.cpp:86] Creating Layer pool5
- I0803 12:10:40.761615 2175 net.cpp:408] pool5 <- conv5
- I0803 12:10:40.761618 2175 net.cpp:382] pool5 -> pool5
- I0803 12:10:40.761677 2175 net.cpp:124] Setting up pool5
- I0803 12:10:40.761682 2175 net.cpp:131] Top shape: 10 256 28 28 (2007040)
- I0803 12:10:40.761698 2175 net.cpp:139] Memory required for data: 1387748960
- I0803 12:10:40.761700 2175 layer_factory.hpp:77] Creating layer fc6
- I0803 12:10:40.761706 2175 net.cpp:86] Creating Layer fc6
- I0803 12:10:40.761709 2175 net.cpp:408] fc6 <- pool5
- I0803 12:10:40.761713 2175 net.cpp:382] fc6 -> fc6
- I0803 12:10:47.059787 2175 net.cpp:124] Setting up fc6
- I0803 12:10:47.059804 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.059806 2175 net.cpp:139] Memory required for data: 1387912800
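The ~6-second gap between the `Creating Layer fc6` and `Setting up fc6` timestamps above is dominated by initializing the fc6 weight blob: because `pool5` is 256x28x28 in this net (not the original AlexNet's 256x6x6), the fully connected layer is enormous. A quick parameter count:

```python
fc6_in = 256 * 28 * 28              # flattened pool5: 200704 inputs
fc6_out = 4096
weights = fc6_in * fc6_out          # 822,083,584 weights
params = weights + fc6_out          # plus one bias per output
print(params, params * 4 / 1e9)     # roughly 3.3 GB of float32 parameters
```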
- I0803 12:10:47.059830 2175 layer_factory.hpp:77] Creating layer relu6
- I0803 12:10:47.059855 2175 net.cpp:86] Creating Layer relu6
- I0803 12:10:47.059859 2175 net.cpp:408] relu6 <- fc6
- I0803 12:10:47.059864 2175 net.cpp:369] relu6 -> fc6 (in-place)
- I0803 12:10:47.060097 2175 net.cpp:124] Setting up relu6
- I0803 12:10:47.060103 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.060106 2175 net.cpp:139] Memory required for data: 1388076640
- I0803 12:10:47.060108 2175 layer_factory.hpp:77] Creating layer drop6
- I0803 12:10:47.060132 2175 net.cpp:86] Creating Layer drop6
- I0803 12:10:47.060135 2175 net.cpp:408] drop6 <- fc6
- I0803 12:10:47.060139 2175 net.cpp:369] drop6 -> fc6 (in-place)
- I0803 12:10:47.060207 2175 net.cpp:124] Setting up drop6
- I0803 12:10:47.060211 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.060256 2175 net.cpp:139] Memory required for data: 1388240480
- I0803 12:10:47.060261 2175 layer_factory.hpp:77] Creating layer fc7
- I0803 12:10:47.060281 2175 net.cpp:86] Creating Layer fc7
- I0803 12:10:47.060283 2175 net.cpp:408] fc7 <- fc6
- I0803 12:10:47.060287 2175 net.cpp:382] fc7 -> fc7
- I0803 12:10:47.189757 2175 net.cpp:124] Setting up fc7
- I0803 12:10:47.189774 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.189776 2175 net.cpp:139] Memory required for data: 1388404320
- I0803 12:10:47.189797 2175 layer_factory.hpp:77] Creating layer relu7
- I0803 12:10:47.189805 2175 net.cpp:86] Creating Layer relu7
- I0803 12:10:47.189807 2175 net.cpp:408] relu7 <- fc7
- I0803 12:10:47.189826 2175 net.cpp:369] relu7 -> fc7 (in-place)
- I0803 12:10:47.190233 2175 net.cpp:124] Setting up relu7
- I0803 12:10:47.190240 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.190243 2175 net.cpp:139] Memory required for data: 1388568160
- I0803 12:10:47.190244 2175 layer_factory.hpp:77] Creating layer drop7
- I0803 12:10:47.190269 2175 net.cpp:86] Creating Layer drop7
- I0803 12:10:47.190274 2175 net.cpp:408] drop7 <- fc7
- I0803 12:10:47.190299 2175 net.cpp:369] drop7 -> fc7 (in-place)
- I0803 12:10:47.190349 2175 net.cpp:124] Setting up drop7
- I0803 12:10:47.190354 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:47.190356 2175 net.cpp:139] Memory required for data: 1388732000
- I0803 12:10:47.190358 2175 layer_factory.hpp:77] Creating layer fc8
- I0803 12:10:47.190362 2175 net.cpp:86] Creating Layer fc8
- I0803 12:10:47.190364 2175 net.cpp:408] fc8 <- fc7
- I0803 12:10:47.190369 2175 net.cpp:382] fc8 -> fc8
- I0803 12:10:47.222895 2175 net.cpp:124] Setting up fc8
- I0803 12:10:47.222913 2175 net.cpp:131] Top shape: 10 1000 (10000)
- I0803 12:10:47.222915 2175 net.cpp:139] Memory required for data: 1388772000
- I0803 12:10:47.222936 2175 layer_factory.hpp:77] Creating layer loss
- I0803 12:10:47.222942 2175 net.cpp:86] Creating Layer loss
- I0803 12:10:47.222946 2175 net.cpp:408] loss <- fc8
- I0803 12:10:47.222950 2175 net.cpp:408] loss <- label
- I0803 12:10:47.222970 2175 net.cpp:382] loss -> loss
- I0803 12:10:47.222978 2175 layer_factory.hpp:77] Creating layer loss
- I0803 12:10:47.223251 2175 net.cpp:124] Setting up loss
- I0803 12:10:47.223258 2175 net.cpp:131] Top shape: (1)
- I0803 12:10:47.223259 2175 net.cpp:134] with loss weight 1
- I0803 12:10:47.223287 2175 net.cpp:139] Memory required for data: 1388772004
- I0803 12:10:47.223291 2175 net.cpp:200] loss needs backward computation.
- I0803 12:10:47.223296 2175 net.cpp:200] fc8 needs backward computation.
- I0803 12:10:47.223299 2175 net.cpp:200] drop7 needs backward computation.
- I0803 12:10:47.223316 2175 net.cpp:200] relu7 needs backward computation.
- I0803 12:10:47.223320 2175 net.cpp:200] fc7 needs backward computation.
- I0803 12:10:47.223321 2175 net.cpp:200] drop6 needs backward computation.
- I0803 12:10:47.223341 2175 net.cpp:200] relu6 needs backward computation.
- I0803 12:10:47.223343 2175 net.cpp:200] fc6 needs backward computation.
- I0803 12:10:47.223345 2175 net.cpp:200] pool5 needs backward computation.
- I0803 12:10:47.223348 2175 net.cpp:200] relu5 needs backward computation.
- I0803 12:10:47.223363 2175 net.cpp:200] conv5 needs backward computation.
- I0803 12:10:47.223366 2175 net.cpp:200] relu4 needs backward computation.
- I0803 12:10:47.223368 2175 net.cpp:200] conv4 needs backward computation.
- I0803 12:10:47.223386 2175 net.cpp:200] relu3 needs backward computation.
- I0803 12:10:47.223389 2175 net.cpp:200] conv3 needs backward computation.
- I0803 12:10:47.223392 2175 net.cpp:200] pool2 needs backward computation.
- I0803 12:10:47.223395 2175 net.cpp:200] norm2 needs backward computation.
- I0803 12:10:47.223398 2175 net.cpp:200] relu2 needs backward computation.
- I0803 12:10:47.223402 2175 net.cpp:200] conv2 needs backward computation.
- I0803 12:10:47.223404 2175 net.cpp:200] pool1 needs backward computation.
- I0803 12:10:47.223407 2175 net.cpp:200] norm1 needs backward computation.
- I0803 12:10:47.223409 2175 net.cpp:200] relu1 needs backward computation.
- I0803 12:10:47.223428 2175 net.cpp:200] scale1 needs backward computation.
- I0803 12:10:47.223430 2175 net.cpp:200] bn1 needs backward computation.
- I0803 12:10:47.223433 2175 net.cpp:200] conv1 needs backward computation.
- I0803 12:10:47.223436 2175 net.cpp:202] data does not need backward computation.
- I0803 12:10:47.223439 2175 net.cpp:244] This network produces output loss
- I0803 12:10:47.223453 2175 net.cpp:257] Network initialization done.
- I0803 12:10:47.223675 2175 upgrade_proto.cpp:77] Attempting to upgrade batch norm layers using deprecated params: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0803 12:10:47.223681 2175 upgrade_proto.cpp:80] Successfully upgraded batch norm layers using deprecated params.
- I0803 12:10:47.223701 2175 solver.cpp:173] Creating test net (#0) specified by net file: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0803 12:10:47.223724 2175 net.cpp:296] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
- I0803 12:10:47.223853 2175 net.cpp:53] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TEST
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TEST
- }
- transform_param {
- mirror: false
- crop_size: 227
- mean_file: "examples/Mydataset_test_lmdb/mean_imagetest.binaryproto"
- }
- data_param {
- source: "examples/Mydataset_test_lmdb"
- batch_size: 10
- backend: LMDB
- }
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- }
- convolution_param {
- num_output: 64
- bias_term: true
- pad: 1
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "xavier"
- }
- }
- }
- layer {
- name: "bn1"
- type: "BatchNorm"
- bottom: "conv1"
- top: "conv1"
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- }
- layer {
- name: "scale1"
- type: "Scale"
- bottom: "conv1"
- top: "conv1"
- scale_param {
- bias_term: true
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 5
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "accuracy"
- type: "Accuracy"
- bottom: "fc8"
- bottom: "label"
- top: "accuracy"
- include {
- phase: TEST
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
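The test-phase settings tie back to the solver parameters at the top of the log: each evaluation runs `test_iter: 280` forward passes of the test net's `batch_size: 10`, so every test pass (triggered each `test_interval: 660` training iterations) scores 2800 images. A trivial check:

```python
test_iter, test_batch = 280, 10
images_per_eval = test_iter * test_batch
print(images_per_eval)  # 2800 test images per evaluation pass
```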
- I0803 12:10:47.224076 2175 layer_factory.hpp:77] Creating layer data
- I0803 12:10:47.269927 2175 db_lmdb.cpp:35] Opened lmdb examples/Mydataset_test_lmdb
- I0803 12:10:47.280226 2175 net.cpp:86] Creating Layer data
- I0803 12:10:47.280279 2175 net.cpp:382] data -> data
- I0803 12:10:47.280303 2175 net.cpp:382] data -> label
- I0803 12:10:47.280324 2175 data_transformer.cpp:25] Loading mean file from: examples/Mydataset_test_lmdb/mean_imagetest.binaryproto
- I0803 12:10:47.313472 2175 data_layer.cpp:45] output data size: 10,3,227,227
- I0803 12:10:47.334664 2175 net.cpp:124] Setting up data
- I0803 12:10:47.334702 2175 net.cpp:131] Top shape: 10 3 227 227 (1545870)
- I0803 12:10:47.334712 2175 net.cpp:131] Top shape: 10 (10)
- I0803 12:10:47.334718 2175 net.cpp:139] Memory required for data: 6183520
- I0803 12:10:47.334731 2175 layer_factory.hpp:77] Creating layer label_data_1_split
- I0803 12:10:47.334753 2175 net.cpp:86] Creating Layer label_data_1_split
- I0803 12:10:47.334761 2175 net.cpp:408] label_data_1_split <- label
- I0803 12:10:47.334791 2175 net.cpp:382] label_data_1_split -> label_data_1_split_0
- I0803 12:10:47.334811 2175 net.cpp:382] label_data_1_split -> label_data_1_split_1
- I0803 12:10:47.335036 2175 net.cpp:124] Setting up label_data_1_split
- I0803 12:10:47.335067 2175 net.cpp:131] Top shape: 10 (10)
- I0803 12:10:47.335075 2175 net.cpp:131] Top shape: 10 (10)
- I0803 12:10:47.335080 2175 net.cpp:139] Memory required for data: 6183600
- I0803 12:10:47.335088 2175 layer_factory.hpp:77] Creating layer conv1
- I0803 12:10:47.335110 2175 net.cpp:86] Creating Layer conv1
- I0803 12:10:47.335119 2175 net.cpp:408] conv1 <- data
- I0803 12:10:47.335131 2175 net.cpp:382] conv1 -> conv1
- I0803 12:10:47.338120 2175 net.cpp:124] Setting up conv1
- I0803 12:10:47.338152 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:47.338197 2175 net.cpp:139] Memory required for data: 138097840
- I0803 12:10:47.338222 2175 layer_factory.hpp:77] Creating layer bn1
- I0803 12:10:47.338238 2175 net.cpp:86] Creating Layer bn1
- I0803 12:10:47.338245 2175 net.cpp:408] bn1 <- conv1
- I0803 12:10:47.338284 2175 net.cpp:369] bn1 -> conv1 (in-place)
- I0803 12:10:47.338843 2175 net.cpp:124] Setting up bn1
- I0803 12:10:47.338861 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:47.338870 2175 net.cpp:139] Memory required for data: 270012080
- I0803 12:10:47.338889 2175 layer_factory.hpp:77] Creating layer scale1
- I0803 12:10:47.338907 2175 net.cpp:86] Creating Layer scale1
- I0803 12:10:47.338914 2175 net.cpp:408] scale1 <- conv1
- I0803 12:10:47.338927 2175 net.cpp:369] scale1 -> conv1 (in-place)
- I0803 12:10:47.339010 2175 layer_factory.hpp:77] Creating layer scale1
- I0803 12:10:47.339423 2175 net.cpp:124] Setting up scale1
- I0803 12:10:47.339439 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:47.339448 2175 net.cpp:139] Memory required for data: 401926320
- I0803 12:10:47.339462 2175 layer_factory.hpp:77] Creating layer relu1
- I0803 12:10:47.339473 2175 net.cpp:86] Creating Layer relu1
- I0803 12:10:47.339485 2175 net.cpp:408] relu1 <- conv1
- I0803 12:10:47.339496 2175 net.cpp:369] relu1 -> conv1 (in-place)
- I0803 12:10:47.339871 2175 net.cpp:124] Setting up relu1
- I0803 12:10:47.339887 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:47.339893 2175 net.cpp:139] Memory required for data: 533840560
- I0803 12:10:47.339901 2175 layer_factory.hpp:77] Creating layer norm1
- I0803 12:10:47.339912 2175 net.cpp:86] Creating Layer norm1
- I0803 12:10:47.339921 2175 net.cpp:408] norm1 <- conv1
- I0803 12:10:47.339933 2175 net.cpp:382] norm1 -> norm1
- I0803 12:10:47.340322 2175 net.cpp:124] Setting up norm1
- I0803 12:10:47.340337 2175 net.cpp:131] Top shape: 10 64 227 227 (32978560)
- I0803 12:10:47.340343 2175 net.cpp:139] Memory required for data: 665754800
- I0803 12:10:47.340349 2175 layer_factory.hpp:77] Creating layer pool1
- I0803 12:10:47.340360 2175 net.cpp:86] Creating Layer pool1
- I0803 12:10:47.340366 2175 net.cpp:408] pool1 <- norm1
- I0803 12:10:47.340379 2175 net.cpp:382] pool1 -> pool1
- I0803 12:10:47.340472 2175 net.cpp:124] Setting up pool1
- I0803 12:10:47.340487 2175 net.cpp:131] Top shape: 10 64 113 113 (8172160)
- I0803 12:10:47.340492 2175 net.cpp:139] Memory required for data: 698443440
- I0803 12:10:47.340498 2175 layer_factory.hpp:77] Creating layer conv2
- I0803 12:10:47.340517 2175 net.cpp:86] Creating Layer conv2
- I0803 12:10:47.340523 2175 net.cpp:408] conv2 <- pool1
- I0803 12:10:47.340538 2175 net.cpp:382] conv2 -> conv2
- I0803 12:10:47.347864 2175 net.cpp:124] Setting up conv2
- I0803 12:10:47.347887 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:47.347894 2175 net.cpp:139] Memory required for data: 829198000
- I0803 12:10:47.347909 2175 layer_factory.hpp:77] Creating layer relu2
- I0803 12:10:47.347920 2175 net.cpp:86] Creating Layer relu2
- I0803 12:10:47.347924 2175 net.cpp:408] relu2 <- conv2
- I0803 12:10:47.347932 2175 net.cpp:369] relu2 -> conv2 (in-place)
- I0803 12:10:47.348479 2175 net.cpp:124] Setting up relu2
- I0803 12:10:47.348491 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:47.348496 2175 net.cpp:139] Memory required for data: 959952560
- I0803 12:10:47.348502 2175 layer_factory.hpp:77] Creating layer norm2
- I0803 12:10:47.348511 2175 net.cpp:86] Creating Layer norm2
- I0803 12:10:47.348516 2175 net.cpp:408] norm2 <- conv2
- I0803 12:10:47.348523 2175 net.cpp:382] norm2 -> norm2
- I0803 12:10:47.348768 2175 net.cpp:124] Setting up norm2
- I0803 12:10:47.348778 2175 net.cpp:131] Top shape: 10 256 113 113 (32688640)
- I0803 12:10:47.348783 2175 net.cpp:139] Memory required for data: 1090707120
- I0803 12:10:47.348786 2175 layer_factory.hpp:77] Creating layer pool2
- I0803 12:10:47.348793 2175 net.cpp:86] Creating Layer pool2
- I0803 12:10:47.348798 2175 net.cpp:408] pool2 <- norm2
- I0803 12:10:47.348803 2175 net.cpp:382] pool2 -> pool2
- I0803 12:10:47.348870 2175 net.cpp:124] Setting up pool2
- I0803 12:10:47.348878 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:47.348881 2175 net.cpp:139] Memory required for data: 1122819760
- I0803 12:10:47.348886 2175 layer_factory.hpp:77] Creating layer conv3
- I0803 12:10:47.348894 2175 net.cpp:86] Creating Layer conv3
- I0803 12:10:47.348899 2175 net.cpp:408] conv3 <- pool2
- I0803 12:10:47.348906 2175 net.cpp:382] conv3 -> conv3
- I0803 12:10:47.360764 2175 net.cpp:124] Setting up conv3
- I0803 12:10:47.360780 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:47.360783 2175 net.cpp:139] Memory required for data: 1170988720
- I0803 12:10:47.360791 2175 layer_factory.hpp:77] Creating layer relu3
- I0803 12:10:47.360798 2175 net.cpp:86] Creating Layer relu3
- I0803 12:10:47.360801 2175 net.cpp:408] relu3 <- conv3
- I0803 12:10:47.360808 2175 net.cpp:369] relu3 -> conv3 (in-place)
- I0803 12:10:47.360967 2175 net.cpp:124] Setting up relu3
- I0803 12:10:47.360975 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:47.360977 2175 net.cpp:139] Memory required for data: 1219157680
- I0803 12:10:47.360980 2175 layer_factory.hpp:77] Creating layer conv4
- I0803 12:10:47.360988 2175 net.cpp:86] Creating Layer conv4
- I0803 12:10:47.360991 2175 net.cpp:408] conv4 <- conv3
- I0803 12:10:47.360996 2175 net.cpp:382] conv4 -> conv4
- I0803 12:10:47.368546 2175 net.cpp:124] Setting up conv4
- I0803 12:10:47.368559 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:47.368561 2175 net.cpp:139] Memory required for data: 1267326640
- I0803 12:10:47.368582 2175 layer_factory.hpp:77] Creating layer relu4
- I0803 12:10:47.368605 2175 net.cpp:86] Creating Layer relu4
- I0803 12:10:47.368609 2175 net.cpp:408] relu4 <- conv4
- I0803 12:10:47.368614 2175 net.cpp:369] relu4 -> conv4 (in-place)
- I0803 12:10:47.368779 2175 net.cpp:124] Setting up relu4
- I0803 12:10:47.368785 2175 net.cpp:131] Top shape: 10 384 56 56 (12042240)
- I0803 12:10:47.368801 2175 net.cpp:139] Memory required for data: 1315495600
- I0803 12:10:47.368803 2175 layer_factory.hpp:77] Creating layer conv5
- I0803 12:10:47.368827 2175 net.cpp:86] Creating Layer conv5
- I0803 12:10:47.368831 2175 net.cpp:408] conv5 <- conv4
- I0803 12:10:47.368835 2175 net.cpp:382] conv5 -> conv5
- I0803 12:10:47.375073 2175 net.cpp:124] Setting up conv5
- I0803 12:10:47.375088 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:47.375092 2175 net.cpp:139] Memory required for data: 1347608240
- I0803 12:10:47.375113 2175 layer_factory.hpp:77] Creating layer relu5
- I0803 12:10:47.375118 2175 net.cpp:86] Creating Layer relu5
- I0803 12:10:47.375135 2175 net.cpp:408] relu5 <- conv5
- I0803 12:10:47.375141 2175 net.cpp:369] relu5 -> conv5 (in-place)
- I0803 12:10:47.375282 2175 net.cpp:124] Setting up relu5
- I0803 12:10:47.375288 2175 net.cpp:131] Top shape: 10 256 56 56 (8028160)
- I0803 12:10:47.375289 2175 net.cpp:139] Memory required for data: 1379720880
- I0803 12:10:47.375291 2175 layer_factory.hpp:77] Creating layer pool5
- I0803 12:10:47.375310 2175 net.cpp:86] Creating Layer pool5
- I0803 12:10:47.375313 2175 net.cpp:408] pool5 <- conv5
- I0803 12:10:47.375331 2175 net.cpp:382] pool5 -> pool5
- I0803 12:10:47.375396 2175 net.cpp:124] Setting up pool5
- I0803 12:10:47.375399 2175 net.cpp:131] Top shape: 10 256 28 28 (2007040)
- I0803 12:10:47.375401 2175 net.cpp:139] Memory required for data: 1387749040
- I0803 12:10:47.375403 2175 layer_factory.hpp:77] Creating layer fc6
- I0803 12:10:47.375408 2175 net.cpp:86] Creating Layer fc6
- I0803 12:10:47.375411 2175 net.cpp:408] fc6 <- pool5
- I0803 12:10:47.375414 2175 net.cpp:382] fc6 -> fc6
- I0803 12:10:53.939147 2175 net.cpp:124] Setting up fc6
- I0803 12:10:53.970624 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:53.970654 2175 net.cpp:139] Memory required for data: 1387912880
- I0803 12:10:54.020473 2175 layer_factory.hpp:77] Creating layer relu6
- I0803 12:10:54.029845 2175 net.cpp:86] Creating Layer relu6
- I0803 12:10:54.029889 2175 net.cpp:408] relu6 <- fc6
- I0803 12:10:54.029919 2175 net.cpp:369] relu6 -> fc6 (in-place)
- I0803 12:10:54.952005 2175 net.cpp:124] Setting up relu6
- I0803 12:10:54.952059 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:54.952069 2175 net.cpp:139] Memory required for data: 1388076720
- I0803 12:10:54.952081 2175 layer_factory.hpp:77] Creating layer drop6
- I0803 12:10:54.952183 2175 net.cpp:86] Creating Layer drop6
- I0803 12:10:54.952199 2175 net.cpp:408] drop6 <- fc6
- I0803 12:10:54.952221 2175 net.cpp:369] drop6 -> fc6 (in-place)
- I0803 12:10:54.952363 2175 net.cpp:124] Setting up drop6
- I0803 12:10:54.952380 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:54.952388 2175 net.cpp:139] Memory required for data: 1388240560
- I0803 12:10:54.952399 2175 layer_factory.hpp:77] Creating layer fc7
- I0803 12:10:54.952422 2175 net.cpp:86] Creating Layer fc7
- I0803 12:10:54.952433 2175 net.cpp:408] fc7 <- fc6
- I0803 12:10:54.952450 2175 net.cpp:382] fc7 -> fc7
- I0803 12:10:55.168879 2175 net.cpp:124] Setting up fc7
- I0803 12:10:55.168905 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:55.168921 2175 net.cpp:139] Memory required for data: 1388404400
- I0803 12:10:55.168929 2175 layer_factory.hpp:77] Creating layer relu7
- I0803 12:10:55.168952 2175 net.cpp:86] Creating Layer relu7
- I0803 12:10:55.168956 2175 net.cpp:408] relu7 <- fc7
- I0803 12:10:55.168962 2175 net.cpp:369] relu7 -> fc7 (in-place)
- I0803 12:10:55.213413 2175 net.cpp:124] Setting up relu7
- I0803 12:10:55.213459 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:55.213466 2175 net.cpp:139] Memory required for data: 1388568240
- I0803 12:10:55.213479 2175 layer_factory.hpp:77] Creating layer drop7
- I0803 12:10:55.213507 2175 net.cpp:86] Creating Layer drop7
- I0803 12:10:55.213522 2175 net.cpp:408] drop7 <- fc7
- I0803 12:10:55.213541 2175 net.cpp:369] drop7 -> fc7 (in-place)
- I0803 12:10:55.213641 2175 net.cpp:124] Setting up drop7
- I0803 12:10:55.213655 2175 net.cpp:131] Top shape: 10 4096 (40960)
- I0803 12:10:55.213662 2175 net.cpp:139] Memory required for data: 1388732080
- I0803 12:10:55.213670 2175 layer_factory.hpp:77] Creating layer fc8
- I0803 12:10:55.213688 2175 net.cpp:86] Creating Layer fc8
- I0803 12:10:55.213696 2175 net.cpp:408] fc8 <- fc7
- I0803 12:10:55.213712 2175 net.cpp:382] fc8 -> fc8
- I0803 12:10:55.271644 2175 net.cpp:124] Setting up fc8
- I0803 12:10:55.271661 2175 net.cpp:131] Top shape: 10 1000 (10000)
- I0803 12:10:55.271664 2175 net.cpp:139] Memory required for data: 1388772080
- I0803 12:10:55.271672 2175 layer_factory.hpp:77] Creating layer fc8_fc8_0_split
- I0803 12:10:55.322099 2175 net.cpp:86] Creating Layer fc8_fc8_0_split
- I0803 12:10:55.322120 2175 net.cpp:408] fc8_fc8_0_split <- fc8
- I0803 12:10:55.322129 2175 net.cpp:382] fc8_fc8_0_split -> fc8_fc8_0_split_0
- I0803 12:10:55.322139 2175 net.cpp:382] fc8_fc8_0_split -> fc8_fc8_0_split_1
- I0803 12:10:55.322221 2175 net.cpp:124] Setting up fc8_fc8_0_split
- I0803 12:10:55.322228 2175 net.cpp:131] Top shape: 10 1000 (10000)
- I0803 12:10:55.322232 2175 net.cpp:131] Top shape: 10 1000 (10000)
- I0803 12:10:55.322234 2175 net.cpp:139] Memory required for data: 1388852080
- I0803 12:10:55.322237 2175 layer_factory.hpp:77] Creating layer accuracy
- I0803 12:10:55.327911 2175 net.cpp:86] Creating Layer accuracy
- I0803 12:10:55.327934 2175 net.cpp:408] accuracy <- fc8_fc8_0_split_0
- I0803 12:10:55.327941 2175 net.cpp:408] accuracy <- label_data_1_split_0
- I0803 12:10:55.327951 2175 net.cpp:382] accuracy -> accuracy
- I0803 12:10:55.327967 2175 net.cpp:124] Setting up accuracy
- I0803 12:10:55.327975 2175 net.cpp:131] Top shape: (1)
- I0803 12:10:55.327978 2175 net.cpp:139] Memory required for data: 1388852084
- I0803 12:10:55.327983 2175 layer_factory.hpp:77] Creating layer loss
- I0803 12:10:55.336427 2175 net.cpp:86] Creating Layer loss
- I0803 12:10:55.336453 2175 net.cpp:408] loss <- fc8_fc8_0_split_1
- I0803 12:10:55.336462 2175 net.cpp:408] loss <- label_data_1_split_1
- I0803 12:10:55.336469 2175 net.cpp:382] loss -> loss
- I0803 12:10:55.341104 2175 layer_factory.hpp:77] Creating layer loss
- I0803 12:10:55.341733 2175 net.cpp:124] Setting up loss
- I0803 12:10:55.341776 2175 net.cpp:131] Top shape: (1)
- I0803 12:10:55.341780 2175 net.cpp:134] with loss weight 1
- I0803 12:10:55.353785 2175 net.cpp:139] Memory required for data: 1388852088
- I0803 12:10:55.353804 2175 net.cpp:200] loss needs backward computation.
- I0803 12:10:55.353821 2175 net.cpp:202] accuracy does not need backward computation.
- I0803 12:10:55.353832 2175 net.cpp:200] fc8_fc8_0_split needs backward computation.
- I0803 12:10:55.353842 2175 net.cpp:200] fc8 needs backward computation.
- I0803 12:10:55.353850 2175 net.cpp:200] drop7 needs backward computation.
- I0803 12:10:55.353859 2175 net.cpp:200] relu7 needs backward computation.
- I0803 12:10:55.353868 2175 net.cpp:200] fc7 needs backward computation.
- I0803 12:10:55.353873 2175 net.cpp:200] drop6 needs backward computation.
- I0803 12:10:55.353881 2175 net.cpp:200] relu6 needs backward computation.
- I0803 12:10:55.353888 2175 net.cpp:200] fc6 needs backward computation.
- I0803 12:10:55.353895 2175 net.cpp:200] pool5 needs backward computation.
- I0803 12:10:55.353904 2175 net.cpp:200] relu5 needs backward computation.
- I0803 12:10:55.353911 2175 net.cpp:200] conv5 needs backward computation.
- I0803 12:10:55.353920 2175 net.cpp:200] relu4 needs backward computation.
- I0803 12:10:55.353927 2175 net.cpp:200] conv4 needs backward computation.
- I0803 12:10:55.353935 2175 net.cpp:200] relu3 needs backward computation.
- I0803 12:10:55.353943 2175 net.cpp:200] conv3 needs backward computation.
- I0803 12:10:55.353955 2175 net.cpp:200] pool2 needs backward computation.
- I0803 12:10:55.353962 2175 net.cpp:200] norm2 needs backward computation.
- I0803 12:10:55.353971 2175 net.cpp:200] relu2 needs backward computation.
- I0803 12:10:55.353978 2175 net.cpp:200] conv2 needs backward computation.
- I0803 12:10:55.353988 2175 net.cpp:200] pool1 needs backward computation.
- I0803 12:10:55.354003 2175 net.cpp:200] norm1 needs backward computation.
- I0803 12:10:55.354012 2175 net.cpp:200] relu1 needs backward computation.
- I0803 12:10:55.354019 2175 net.cpp:200] scale1 needs backward computation.
- I0803 12:10:55.354027 2175 net.cpp:200] bn1 needs backward computation.
- I0803 12:10:55.354035 2175 net.cpp:200] conv1 needs backward computation.
- I0803 12:10:55.354045 2175 net.cpp:202] label_data_1_split does not need backward computation.
- I0803 12:10:55.354055 2175 net.cpp:202] data does not need backward computation.
- I0803 12:10:55.354063 2175 net.cpp:244] This network produces output accuracy
- I0803 12:10:55.354075 2175 net.cpp:244] This network produces output loss
- I0803 12:10:55.354123 2175 net.cpp:257] Network initialization done.
- I0803 12:10:55.385907 2175 solver.cpp:56] Solver scaffolding done.
- I0803 12:10:55.402374 2175 caffe.cpp:248] Starting Optimization
- I0803 12:10:55.402422 2175 solver.cpp:273] Solving AlexNet
- I0803 12:10:55.402431 2175 solver.cpp:274] Learning Rate Policy: step
- I0803 12:10:55.789188 2175 solver.cpp:331] Iteration 0, Testing net (#0)
- I0803 12:10:57.610282 2175 blocking_queue.cpp:49] Waiting for data
- I0803 12:11:36.445051 2228 data_layer.cpp:73] Restarting data prefetching from start.
- I0803 12:11:36.613025 2175 solver.cpp:398] Test net output #0: accuracy = 0
- I0803 12:11:36.613050 2175 solver.cpp:398] Test net output #1: loss = 6.80939 (* 1 = 6.80939 loss)
- F0803 12:11:36.613711 2175 syncedmem.cpp:71] Check failed: error == cudaSuccess (2 vs. 0) out of memory
- *** Check failure stack trace: ***
- @ 0x7f8d2b64d5cd google::LogMessage::Fail()
- @ 0x7f8d2b64f433 google::LogMessage::SendToLog()
- @ 0x7f8d2b64d15b google::LogMessage::Flush()
- @ 0x7f8d2b64fe1e google::LogMessageFatal::~LogMessageFatal()
- @ 0x7f8d2bc84ad8 caffe::SyncedMemory::mutable_gpu_data()
- @ 0x7f8d2bdf90f2 caffe::Blob<>::mutable_gpu_data()
- @ 0x7f8d2be2ce48 caffe::CuDNNConvolutionLayer<>::Forward_gpu()
- @ 0x7f8d2bc938b1 caffe::Net<>::ForwardFromTo()
- @ 0x7f8d2bc939b7 caffe::Net<>::Forward()
- @ 0x7f8d2bdef348 caffe::Solver<>::Step()
- @ 0x7f8d2bdefeea caffe::Solver<>::Solve()
- @ 0x40aeb4 train()
- @ 0x4075a8 main
- @ 0x7f8d29de4830 __libc_start_main
- @ 0x407e79 _start
- @ (nil) (unknown)
- Aborted (core dumped)
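The failure above is a CUDA out-of-memory abort (`cudaSuccess (2 vs. 0) out of memory`) inside `CuDNNConvolutionLayer<>::Forward_gpu()`. The logged top shapes hint at the cause: conv1 produces `10 64 227 227`, i.e. it keeps the full 227x227 input resolution, whereas standard AlexNet uses stride 4 in conv1 and produces 55x55 feature maps. That inflates every early activation blob by roughly 17x, which is consistent with the ~1.39 GB "Memory required for data" the log reports before weights, gradients, the test net, and cuDNN workspace are added on top. A quick sanity check of the logged element counts, assuming float32 blobs (this is a diagnostic sketch, not part of the original run):

```python
# Sanity-check the blob sizes Caffe logged before the OOM crash.
# Caffe stores activations as float32, i.e. 4 bytes per element.

batch = 10      # batch_size from the data layer in the prototxt
channels = 64   # conv1 num_output, per the logged top shape

# conv1 top shape as logged: "10 64 227 227 (32978560)"
elems_logged = batch * channels * 227 * 227
print(elems_logged)                  # 32978560 -- matches the log

# With the standard AlexNet conv1 stride of 4, the spatial size would be
# (227 - 11) // 4 + 1 = 55, shrinking this blob by a factor of ~17:
elems_stride4 = batch * channels * 55 * 55
print(elems_stride4)                 # 1936000
print(round(elems_logged / elems_stride4, 1))
```

Given this, the usual first remedies would be restoring `stride: 4` on conv1 in `alexnetsade-bn.prototxt` (if the full-resolution output was unintentional) or lowering `batch_size` in the `data_param` of the data layers.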