- sina@sina-Z170X-UD3:~/code/caffe$ ./build/tools/caffe train --solver=examples/alexnetfinetune/solver_sina.prototxt
- I0802 16:51:20.823194 2858 caffe.cpp:218] Using GPUs 0
- I0802 16:51:20.838037 2858 caffe.cpp:223] GPU 0: GeForce GTX 1070
- I0802 16:51:21.037343 2858 solver.cpp:44] Initializing solver from parameters:
- test_iter: 56
- test_interval: 28
- base_lr: 0.001
- display: 20
- max_iter: 5000
- lr_policy: "step"
- gamma: 0.1
- momentum: 0.9
- weight_decay: 0.0005
- stepsize: 100000
- snapshot: 4999
- snapshot_prefix: "models/bvlc_alexnet/caffe_alexnet_sinatrain"
- solver_mode: GPU
- device_id: 0
- net: "examples/alexnetfinetune/alexnetsade-bn.prototxt"
- train_state {
- level: 0
- stage: ""
- }
- type: "SGD"
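The solver schedule above can be sanity-checked against the batch sizes that appear in the net definition below (256 for TRAIN, 50 for TEST): 56 test iterations at batch 50 score 2,800 images per test pass, and a test pass fires every 28 training iterations, i.e. every 7,168 training images. A quick check:

```python
# Sanity-check the solver schedule against the batch sizes from the net definition.
test_iter = 56          # solver: iterations per test pass
test_batch = 50         # TEST-phase data_param batch_size
test_images = test_iter * test_batch            # images scored per test pass

test_interval = 28      # training iterations between test passes
train_batch = 256       # TRAIN-phase data_param batch_size
train_images_between_tests = test_interval * train_batch

print(test_images)                  # 2800
print(train_images_between_tests)   # 7168
```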
- I0802 16:51:21.037497 2858 solver.cpp:87] Creating training net from net file: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0802 16:51:21.037703 2858 upgrade_proto.cpp:77] Attempting to upgrade batch norm layers using deprecated params: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0802 16:51:21.037711 2858 upgrade_proto.cpp:80] Successfully upgraded batch norm layers using deprecated params.
- I0802 16:51:21.037802 2858 net.cpp:296] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
- I0802 16:51:21.037811 2858 net.cpp:296] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
- I0802 16:51:21.037987 2858 net.cpp:53] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TRAIN
- level: 0
- stage: ""
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TRAIN
- }
- transform_param {
- mirror: true
- crop_size: 227
- mean_file: "examples/Mydataset_train_lmdb/mean_imagetest.binaryproto"
- }
- data_param {
- source: "examples/Mydataset_train_lmdb"
- batch_size: 256
- backend: LMDB
- }
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- }
- convolution_param {
- num_output: 64
- bias_term: true
- pad: 1
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "xavier"
- }
- }
- }
- layer {
- name: "bn1"
- type: "BatchNorm"
- bottom: "conv1"
- top: "conv1"
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- }
- layer {
- name: "scale1"
- type: "Scale"
- bottom: "conv1"
- top: "conv1"
- scale_param {
- bias_term: true
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 5
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
- I0802 16:51:21.052150 2858 layer_factory.hpp:77] Creating layer data
- I0802 16:51:21.061594 2858 db_lmdb.cpp:35] Opened lmdb examples/Mydataset_train_lmdb
- I0802 16:51:21.091151 2858 net.cpp:86] Creating Layer data
- I0802 16:51:21.091212 2858 net.cpp:382] data -> data
- I0802 16:51:21.091265 2858 net.cpp:382] data -> label
- I0802 16:51:21.091301 2858 data_transformer.cpp:25] Loading mean file from: examples/Mydataset_train_lmdb/mean_imagetest.binaryproto
- I0802 16:51:21.125758 2858 data_layer.cpp:45] output data size: 256,3,227,227
- I0802 16:51:21.312355 2858 net.cpp:124] Setting up data
- I0802 16:51:21.312403 2858 net.cpp:131] Top shape: 256 3 227 227 (39574272)
- I0802 16:51:21.312408 2858 net.cpp:131] Top shape: 256 (256)
- I0802 16:51:21.312433 2858 net.cpp:139] Memory required for data: 158298112
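The "Memory required for data" figure is Caffe's running total of top-blob elements times 4 bytes (float32). For the data layer it covers both tops, `data` and `label`:

```python
# Reproduce the running activation-memory counter for the data layer (float32 = 4 bytes).
data_elems = 256 * 3 * 227 * 227    # "data" top: N x C x H x W
label_elems = 256                   # "label" top: one label per image
total_bytes = (data_elems + label_elems) * 4
print(data_elems)    # 39574272
print(total_bytes)   # 158298112
```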
- I0802 16:51:21.312441 2858 layer_factory.hpp:77] Creating layer conv1
- I0802 16:51:21.312459 2858 net.cpp:86] Creating Layer conv1
- I0802 16:51:21.312463 2858 net.cpp:408] conv1 <- data
- I0802 16:51:21.312500 2858 net.cpp:382] conv1 -> conv1
- I0802 16:51:25.388608 2858 net.cpp:124] Setting up conv1
- I0802 16:51:25.388659 2858 net.cpp:131] Top shape: 256 64 227 227 (844251136)
- I0802 16:51:25.388669 2858 net.cpp:139] Memory required for data: 3535302656
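conv1 preserves the 227x227 spatial size because it uses a 3x3 kernel with pad 1 and stride 1 (unlike stock AlexNet's 11x11/stride-4 conv1), so this single layer emits 64 full-resolution channels and adds roughly 3.4 GB of float32 activations to the counter:

```python
# conv1 output size from the standard convolution formula, then the memory it adds.
h_in, pad, kernel, stride = 227, 1, 3, 1
h_out = (h_in + 2 * pad - kernel) // stride + 1   # 227: spatial size preserved
conv1_elems = 256 * 64 * h_out * h_out            # N x num_output x H x W
running = 158298112 + conv1_elems * 4             # previous total + conv1 top (float32)
print(h_out)         # 227
print(conv1_elems)   # 844251136
print(running)       # 3535302656
```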
- I0802 16:51:25.388715 2858 layer_factory.hpp:77] Creating layer bn1
- I0802 16:51:25.388741 2858 net.cpp:86] Creating Layer bn1
- I0802 16:51:25.388752 2858 net.cpp:408] bn1 <- conv1
- I0802 16:51:25.388772 2858 net.cpp:369] bn1 -> conv1 (in-place)
- I0802 16:51:25.389351 2858 net.cpp:124] Setting up bn1
- I0802 16:51:25.389370 2858 net.cpp:131] Top shape: 256 64 227 227 (844251136)
- I0802 16:51:25.389380 2858 net.cpp:139] Memory required for data: 6912307200
- I0802 16:51:25.389405 2858 layer_factory.hpp:77] Creating layer scale1
- I0802 16:51:25.389463 2858 net.cpp:86] Creating Layer scale1
- I0802 16:51:25.389472 2858 net.cpp:408] scale1 <- conv1
- I0802 16:51:25.389487 2858 net.cpp:369] scale1 -> conv1 (in-place)
- I0802 16:51:25.389588 2858 layer_factory.hpp:77] Creating layer scale1
- I0802 16:51:25.390061 2858 net.cpp:124] Setting up scale1
- I0802 16:51:25.390081 2858 net.cpp:131] Top shape: 256 64 227 227 (844251136)
- I0802 16:51:25.390090 2858 net.cpp:139] Memory required for data: 10289311744
- I0802 16:51:25.390106 2858 layer_factory.hpp:77] Creating layer relu1
- I0802 16:51:25.390122 2858 net.cpp:86] Creating Layer relu1
- I0802 16:51:25.390132 2858 net.cpp:408] relu1 <- conv1
- I0802 16:51:25.390148 2858 net.cpp:369] relu1 -> conv1 (in-place)
- I0802 16:51:25.390626 2858 net.cpp:124] Setting up relu1
- I0802 16:51:25.390648 2858 net.cpp:131] Top shape: 256 64 227 227 (844251136)
- I0802 16:51:25.390660 2858 net.cpp:139] Memory required for data: 13666316288
- I0802 16:51:25.390671 2858 layer_factory.hpp:77] Creating layer norm1
- I0802 16:51:25.390696 2858 net.cpp:86] Creating Layer norm1
- I0802 16:51:25.390707 2858 net.cpp:408] norm1 <- conv1
- I0802 16:51:25.390722 2858 net.cpp:382] norm1 -> norm1
- I0802 16:51:25.391862 2858 net.cpp:124] Setting up norm1
- I0802 16:51:25.391892 2858 net.cpp:131] Top shape: 256 64 227 227 (844251136)
- I0802 16:51:25.391901 2858 net.cpp:139] Memory required for data: 17043320832
- I0802 16:51:25.391908 2858 layer_factory.hpp:77] Creating layer pool1
- I0802 16:51:25.391926 2858 net.cpp:86] Creating Layer pool1
- I0802 16:51:25.391937 2858 net.cpp:408] pool1 <- norm1
- I0802 16:51:25.391955 2858 net.cpp:382] pool1 -> pool1
- I0802 16:51:25.392066 2858 net.cpp:124] Setting up pool1
- I0802 16:51:25.392094 2858 net.cpp:131] Top shape: 256 64 113 113 (209207296)
- I0802 16:51:25.392102 2858 net.cpp:139] Memory required for data: 17880150016
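Pooling output sizes in this log follow Caffe's ceiling-division rule. For pool1 (227 in, 3x3/stride 2) floor and ceil agree on 113, but the rule is visible later at pool5, where 56 in gives 28 rather than the floor result 27:

```python
# Caffe pooling output size uses ceiling division (visible at pool5: 56 -> 28, not 27).
def pool_out(h_in, kernel=3, stride=2, pad=0):
    return -((h_in + 2 * pad - kernel) // -stride) + 1  # ceil division via floor of negation

print(pool_out(227))  # pool1: 113
print(pool_out(56))   # pool5: 28
```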
- I0802 16:51:25.392109 2858 layer_factory.hpp:77] Creating layer conv2
- I0802 16:51:25.392133 2858 net.cpp:86] Creating Layer conv2
- I0802 16:51:25.392145 2858 net.cpp:408] conv2 <- pool1
- I0802 16:51:25.392159 2858 net.cpp:382] conv2 -> conv2
- I0802 16:51:25.425178 2858 net.cpp:124] Setting up conv2
- I0802 16:51:25.425228 2858 net.cpp:131] Top shape: 256 256 113 113 (836829184)
- I0802 16:51:25.425236 2858 net.cpp:139] Memory required for data: 21227466752
- I0802 16:51:25.425261 2858 layer_factory.hpp:77] Creating layer relu2
- I0802 16:51:25.425282 2858 net.cpp:86] Creating Layer relu2
- I0802 16:51:25.425290 2858 net.cpp:408] relu2 <- conv2
- I0802 16:51:25.425305 2858 net.cpp:369] relu2 -> conv2 (in-place)
- I0802 16:51:25.426239 2858 net.cpp:124] Setting up relu2
- I0802 16:51:25.426262 2858 net.cpp:131] Top shape: 256 256 113 113 (836829184)
- I0802 16:51:25.426268 2858 net.cpp:139] Memory required for data: 24574783488
- I0802 16:51:25.426275 2858 layer_factory.hpp:77] Creating layer norm2
- I0802 16:51:25.426290 2858 net.cpp:86] Creating Layer norm2
- I0802 16:51:25.426297 2858 net.cpp:408] norm2 <- conv2
- I0802 16:51:25.426311 2858 net.cpp:382] norm2 -> norm2
- I0802 16:51:25.426684 2858 net.cpp:124] Setting up norm2
- I0802 16:51:25.426702 2858 net.cpp:131] Top shape: 256 256 113 113 (836829184)
- I0802 16:51:25.426707 2858 net.cpp:139] Memory required for data: 27922100224
- I0802 16:51:25.426712 2858 layer_factory.hpp:77] Creating layer pool2
- I0802 16:51:25.426723 2858 net.cpp:86] Creating Layer pool2
- I0802 16:51:25.426730 2858 net.cpp:408] pool2 <- norm2
- I0802 16:51:25.426743 2858 net.cpp:382] pool2 -> pool2
- I0802 16:51:25.426808 2858 net.cpp:124] Setting up pool2
- I0802 16:51:25.426821 2858 net.cpp:131] Top shape: 256 256 56 56 (205520896)
- I0802 16:51:25.426828 2858 net.cpp:139] Memory required for data: 28744183808
- I0802 16:51:25.426833 2858 layer_factory.hpp:77] Creating layer conv3
- I0802 16:51:25.426847 2858 net.cpp:86] Creating Layer conv3
- I0802 16:51:25.426856 2858 net.cpp:408] conv3 <- pool2
- I0802 16:51:25.426870 2858 net.cpp:382] conv3 -> conv3
- I0802 16:51:25.447731 2858 net.cpp:124] Setting up conv3
- I0802 16:51:25.447755 2858 net.cpp:131] Top shape: 256 384 56 56 (308281344)
- I0802 16:51:25.447782 2858 net.cpp:139] Memory required for data: 29977309184
- I0802 16:51:25.447793 2858 layer_factory.hpp:77] Creating layer relu3
- I0802 16:51:25.447803 2858 net.cpp:86] Creating Layer relu3
- I0802 16:51:25.447808 2858 net.cpp:408] relu3 <- conv3
- I0802 16:51:25.447816 2858 net.cpp:369] relu3 -> conv3 (in-place)
- I0802 16:51:25.448030 2858 net.cpp:124] Setting up relu3
- I0802 16:51:25.448040 2858 net.cpp:131] Top shape: 256 384 56 56 (308281344)
- I0802 16:51:25.448042 2858 net.cpp:139] Memory required for data: 31210434560
- I0802 16:51:25.448047 2858 layer_factory.hpp:77] Creating layer conv4
- I0802 16:51:25.448058 2858 net.cpp:86] Creating Layer conv4
- I0802 16:51:25.448061 2858 net.cpp:408] conv4 <- conv3
- I0802 16:51:25.448070 2858 net.cpp:382] conv4 -> conv4
- I0802 16:51:25.458153 2858 net.cpp:124] Setting up conv4
- I0802 16:51:25.458170 2858 net.cpp:131] Top shape: 256 384 56 56 (308281344)
- I0802 16:51:25.458173 2858 net.cpp:139] Memory required for data: 32443559936
- I0802 16:51:25.458181 2858 layer_factory.hpp:77] Creating layer relu4
- I0802 16:51:25.458189 2858 net.cpp:86] Creating Layer relu4
- I0802 16:51:25.458191 2858 net.cpp:408] relu4 <- conv4
- I0802 16:51:25.458195 2858 net.cpp:369] relu4 -> conv4 (in-place)
- I0802 16:51:25.458353 2858 net.cpp:124] Setting up relu4
- I0802 16:51:25.458360 2858 net.cpp:131] Top shape: 256 384 56 56 (308281344)
- I0802 16:51:25.458362 2858 net.cpp:139] Memory required for data: 33676685312
- I0802 16:51:25.458365 2858 layer_factory.hpp:77] Creating layer conv5
- I0802 16:51:25.458374 2858 net.cpp:86] Creating Layer conv5
- I0802 16:51:25.458377 2858 net.cpp:408] conv5 <- conv4
- I0802 16:51:25.458384 2858 net.cpp:382] conv5 -> conv5
- I0802 16:51:25.464239 2858 net.cpp:124] Setting up conv5
- I0802 16:51:25.464249 2858 net.cpp:131] Top shape: 256 256 56 56 (205520896)
- I0802 16:51:25.464252 2858 net.cpp:139] Memory required for data: 34498768896
- I0802 16:51:25.464257 2858 layer_factory.hpp:77] Creating layer relu5
- I0802 16:51:25.464265 2858 net.cpp:86] Creating Layer relu5
- I0802 16:51:25.464267 2858 net.cpp:408] relu5 <- conv5
- I0802 16:51:25.464272 2858 net.cpp:369] relu5 -> conv5 (in-place)
- I0802 16:51:25.464406 2858 net.cpp:124] Setting up relu5
- I0802 16:51:25.464411 2858 net.cpp:131] Top shape: 256 256 56 56 (205520896)
- I0802 16:51:25.464413 2858 net.cpp:139] Memory required for data: 35320852480
- I0802 16:51:25.464416 2858 layer_factory.hpp:77] Creating layer pool5
- I0802 16:51:25.464421 2858 net.cpp:86] Creating Layer pool5
- I0802 16:51:25.464422 2858 net.cpp:408] pool5 <- conv5
- I0802 16:51:25.464426 2858 net.cpp:382] pool5 -> pool5
- I0802 16:51:25.464458 2858 net.cpp:124] Setting up pool5
- I0802 16:51:25.464463 2858 net.cpp:131] Top shape: 256 256 28 28 (51380224)
- I0802 16:51:25.464465 2858 net.cpp:139] Memory required for data: 35526373376
- I0802 16:51:25.464468 2858 layer_factory.hpp:77] Creating layer fc6
- I0802 16:51:25.464473 2858 net.cpp:86] Creating Layer fc6
- I0802 16:51:25.464475 2858 net.cpp:408] fc6 <- pool5
- I0802 16:51:25.464479 2858 net.cpp:382] fc6 -> fc6
- I0802 16:51:31.718487 2858 net.cpp:124] Setting up fc6
- I0802 16:51:31.718506 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.718508 2858 net.cpp:139] Memory required for data: 35530567680
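The timestamps show fc6 setup alone takes about six seconds (16:51:25.46 to 16:51:31.72). That is consistent with its weight matrix: every pool5 element (256 x 28 x 28 = 200,704 per image) connects to each of 4,096 outputs, roughly 822 M parameters, or about 3.3 GB of float32 weights to allocate and fill. Stock AlexNet's fc6 sees only 6x6x256 = 9,216 inputs, so this variant's fc6 is roughly 20x larger:

```python
# fc6 parameter count: every pool5 activation connects to each of 4096 outputs.
pool5_elems = 256 * 28 * 28          # C x H x W per image after pool5
fc6_weights = pool5_elems * 4096     # weight matrix entries
fc6_params = fc6_weights + 4096      # plus one bias per output
print(pool5_elems)   # 200704
print(fc6_weights)   # 822083584
```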
- I0802 16:51:31.718533 2858 layer_factory.hpp:77] Creating layer relu6
- I0802 16:51:31.718540 2858 net.cpp:86] Creating Layer relu6
- I0802 16:51:31.718544 2858 net.cpp:408] relu6 <- fc6
- I0802 16:51:31.718564 2858 net.cpp:369] relu6 -> fc6 (in-place)
- I0802 16:51:31.718791 2858 net.cpp:124] Setting up relu6
- I0802 16:51:31.718797 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.718799 2858 net.cpp:139] Memory required for data: 35534761984
- I0802 16:51:31.718801 2858 layer_factory.hpp:77] Creating layer drop6
- I0802 16:51:31.718806 2858 net.cpp:86] Creating Layer drop6
- I0802 16:51:31.718807 2858 net.cpp:408] drop6 <- fc6
- I0802 16:51:31.718825 2858 net.cpp:369] drop6 -> fc6 (in-place)
- I0802 16:51:31.718860 2858 net.cpp:124] Setting up drop6
- I0802 16:51:31.718896 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.718899 2858 net.cpp:139] Memory required for data: 35538956288
- I0802 16:51:31.718901 2858 layer_factory.hpp:77] Creating layer fc7
- I0802 16:51:31.718921 2858 net.cpp:86] Creating Layer fc7
- I0802 16:51:31.718924 2858 net.cpp:408] fc7 <- fc6
- I0802 16:51:31.718940 2858 net.cpp:382] fc7 -> fc7
- I0802 16:51:31.848628 2858 net.cpp:124] Setting up fc7
- I0802 16:51:31.848644 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.848646 2858 net.cpp:139] Memory required for data: 35543150592
- I0802 16:51:31.848654 2858 layer_factory.hpp:77] Creating layer relu7
- I0802 16:51:31.848675 2858 net.cpp:86] Creating Layer relu7
- I0802 16:51:31.848678 2858 net.cpp:408] relu7 <- fc7
- I0802 16:51:31.848697 2858 net.cpp:369] relu7 -> fc7 (in-place)
- I0802 16:51:31.849135 2858 net.cpp:124] Setting up relu7
- I0802 16:51:31.849146 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.849148 2858 net.cpp:139] Memory required for data: 35547344896
- I0802 16:51:31.849150 2858 layer_factory.hpp:77] Creating layer drop7
- I0802 16:51:31.849154 2858 net.cpp:86] Creating Layer drop7
- I0802 16:51:31.849171 2858 net.cpp:408] drop7 <- fc7
- I0802 16:51:31.849174 2858 net.cpp:369] drop7 -> fc7 (in-place)
- I0802 16:51:31.849221 2858 net.cpp:124] Setting up drop7
- I0802 16:51:31.849225 2858 net.cpp:131] Top shape: 256 4096 (1048576)
- I0802 16:51:31.849227 2858 net.cpp:139] Memory required for data: 35551539200
- I0802 16:51:31.849230 2858 layer_factory.hpp:77] Creating layer fc8
- I0802 16:51:31.849248 2858 net.cpp:86] Creating Layer fc8
- I0802 16:51:31.849251 2858 net.cpp:408] fc8 <- fc7
- I0802 16:51:31.849256 2858 net.cpp:382] fc8 -> fc8
- I0802 16:51:31.881816 2858 net.cpp:124] Setting up fc8
- I0802 16:51:31.881834 2858 net.cpp:131] Top shape: 256 1000 (256000)
- I0802 16:51:31.881835 2858 net.cpp:139] Memory required for data: 35552563200
- I0802 16:51:31.881857 2858 layer_factory.hpp:77] Creating layer loss
- I0802 16:51:31.881878 2858 net.cpp:86] Creating Layer loss
- I0802 16:51:31.881881 2858 net.cpp:408] loss <- fc8
- I0802 16:51:31.881898 2858 net.cpp:408] loss <- label
- I0802 16:51:31.881903 2858 net.cpp:382] loss -> loss
- I0802 16:51:31.881927 2858 layer_factory.hpp:77] Creating layer loss
- I0802 16:51:31.882881 2858 net.cpp:124] Setting up loss
- I0802 16:51:31.882890 2858 net.cpp:131] Top shape: (1)
- I0802 16:51:31.882894 2858 net.cpp:134] with loss weight 1
- I0802 16:51:31.882920 2858 net.cpp:139] Memory required for data: 35552563204
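The training net's counter ends at about 33 GiB. Note that this counter adds every top blob even for in-place layers (bn1, scale1, and relu1 each add the same conv1 blob again above), so actual GPU allocation is lower; still, with batch 256 and full-resolution conv maps, this configuration is likely to strain a GTX 1070's 8 GB:

```python
# Convert the logged running total for the training net to GiB (float32 tops).
total_bytes = 35552563204
print(round(total_bytes / 2**30, 1))  # 33.1
```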
- I0802 16:51:31.882925 2858 net.cpp:200] loss needs backward computation.
- I0802 16:51:31.882930 2858 net.cpp:200] fc8 needs backward computation.
- I0802 16:51:31.882946 2858 net.cpp:200] drop7 needs backward computation.
- I0802 16:51:31.882949 2858 net.cpp:200] relu7 needs backward computation.
- I0802 16:51:31.882953 2858 net.cpp:200] fc7 needs backward computation.
- I0802 16:51:31.882971 2858 net.cpp:200] drop6 needs backward computation.
- I0802 16:51:31.882973 2858 net.cpp:200] relu6 needs backward computation.
- I0802 16:51:31.882977 2858 net.cpp:200] fc6 needs backward computation.
- I0802 16:51:31.882979 2858 net.cpp:200] pool5 needs backward computation.
- I0802 16:51:31.882982 2858 net.cpp:200] relu5 needs backward computation.
- I0802 16:51:31.882999 2858 net.cpp:200] conv5 needs backward computation.
- I0802 16:51:31.883002 2858 net.cpp:200] relu4 needs backward computation.
- I0802 16:51:31.883004 2858 net.cpp:200] conv4 needs backward computation.
- I0802 16:51:31.883008 2858 net.cpp:200] relu3 needs backward computation.
- I0802 16:51:31.883024 2858 net.cpp:200] conv3 needs backward computation.
- I0802 16:51:31.883028 2858 net.cpp:200] pool2 needs backward computation.
- I0802 16:51:31.883031 2858 net.cpp:200] norm2 needs backward computation.
- I0802 16:51:31.883034 2858 net.cpp:200] relu2 needs backward computation.
- I0802 16:51:31.883038 2858 net.cpp:200] conv2 needs backward computation.
- I0802 16:51:31.883041 2858 net.cpp:200] pool1 needs backward computation.
- I0802 16:51:31.883056 2858 net.cpp:200] norm1 needs backward computation.
- I0802 16:51:31.883087 2858 net.cpp:200] relu1 needs backward computation.
- I0802 16:51:31.883091 2858 net.cpp:200] scale1 needs backward computation.
- I0802 16:51:31.883095 2858 net.cpp:200] bn1 needs backward computation.
- I0802 16:51:31.883110 2858 net.cpp:200] conv1 needs backward computation.
- I0802 16:51:31.883112 2858 net.cpp:202] data does not need backward computation.
- I0802 16:51:31.883116 2858 net.cpp:244] This network produces output loss
- I0802 16:51:31.883142 2858 net.cpp:257] Network initialization done.
- I0802 16:51:31.883363 2858 upgrade_proto.cpp:77] Attempting to upgrade batch norm layers using deprecated params: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0802 16:51:31.883368 2858 upgrade_proto.cpp:80] Successfully upgraded batch norm layers using deprecated params.
- I0802 16:51:31.883388 2858 solver.cpp:173] Creating test net (#0) specified by net file: examples/alexnetfinetune/alexnetsade-bn.prototxt
- I0802 16:51:31.883411 2858 net.cpp:296] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
- I0802 16:51:31.883577 2858 net.cpp:53] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TEST
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TEST
- }
- transform_param {
- mirror: false
- crop_size: 227
- mean_file: "examples/Mydataset_test_lmdb/mean_imagetest.binaryproto"
- }
- data_param {
- source: "examples/Mydataset_test_lmdb"
- batch_size: 50
- backend: LMDB
- }
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- }
- convolution_param {
- num_output: 64
- bias_term: true
- pad: 1
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "xavier"
- }
- }
- }
- layer {
- name: "bn1"
- type: "BatchNorm"
- bottom: "conv1"
- top: "conv1"
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- param {
- lr_mult: 0
- decay_mult: 0
- }
- }
- layer {
- name: "scale1"
- type: "Scale"
- bottom: "conv1"
- top: "conv1"
- scale_param {
- bias_term: true
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 5
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "accuracy"
- type: "Accuracy"
- bottom: "fc8"
- bottom: "label"
- top: "accuracy"
- include {
- phase: TEST
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
- I0802 16:51:31.883865 2858 layer_factory.hpp:77] Creating layer data
- I0802 16:51:31.903520 2858 db_lmdb.cpp:35] Opened lmdb examples/Mydataset_test_lmdb
- I0802 16:51:31.915539 2858 net.cpp:86] Creating Layer data
- I0802 16:51:31.915576 2858 net.cpp:382] data -> data
- I0802 16:51:31.915601 2858 net.cpp:382] data -> label
- I0802 16:51:31.915618 2858 data_transformer.cpp:25] Loading mean file from: examples/Mydataset_test_lmdb/mean_imagetest.binaryproto
- I0802 16:51:31.949586 2858 data_layer.cpp:45] output data size: 50,3,227,227
- I0802 16:51:32.009131 2858 net.cpp:124] Setting up data
- I0802 16:51:32.009177 2858 net.cpp:131] Top shape: 50 3 227 227 (7729350)
- I0802 16:51:32.009181 2858 net.cpp:131] Top shape: 50 (50)
- I0802 16:51:32.009186 2858 net.cpp:139] Memory required for data: 30917600
- I0802 16:51:32.009212 2858 layer_factory.hpp:77] Creating layer label_data_1_split
- I0802 16:51:32.009222 2858 net.cpp:86] Creating Layer label_data_1_split
- I0802 16:51:32.009227 2858 net.cpp:408] label_data_1_split <- label
- I0802 16:51:32.009232 2858 net.cpp:382] label_data_1_split -> label_data_1_split_0
- I0802 16:51:32.009241 2858 net.cpp:382] label_data_1_split -> label_data_1_split_1
- I0802 16:51:32.009376 2858 net.cpp:124] Setting up label_data_1_split
- I0802 16:51:32.009400 2858 net.cpp:131] Top shape: 50 (50)
- I0802 16:51:32.009403 2858 net.cpp:131] Top shape: 50 (50)
- I0802 16:51:32.009407 2858 net.cpp:139] Memory required for data: 30918000
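The `label_data_1_split` layer appears only in the test net because there the label blob feeds two consumers, accuracy and loss, so Caffe inserts a split that duplicates it. Its two 50-element tops account for exactly the 400-byte increase in the counter:

```python
# Split layer: two tops of 50 float32 labels each add 400 bytes.
split_bytes = 2 * 50 * 4
print(30917600 + split_bytes)  # 30918000
```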
- I0802 16:51:32.009428 2858 layer_factory.hpp:77] Creating layer conv1
- I0802 16:51:32.009438 2858 net.cpp:86] Creating Layer conv1
- I0802 16:51:32.009441 2858 net.cpp:408] conv1 <- data
- I0802 16:51:32.009446 2858 net.cpp:382] conv1 -> conv1
- I0802 16:51:32.012914 2858 net.cpp:124] Setting up conv1
- I0802 16:51:32.012955 2858 net.cpp:131] Top shape: 50 64 227 227 (164892800)
- I0802 16:51:32.012959 2858 net.cpp:139] Memory required for data: 690489200
- I0802 16:51:32.012984 2858 layer_factory.hpp:77] Creating layer bn1
- I0802 16:51:32.012991 2858 net.cpp:86] Creating Layer bn1
- I0802 16:51:32.012995 2858 net.cpp:408] bn1 <- conv1
- I0802 16:51:32.013000 2858 net.cpp:369] bn1 -> conv1 (in-place)
- I0802 16:51:32.013191 2858 net.cpp:124] Setting up bn1
- I0802 16:51:32.013196 2858 net.cpp:131] Top shape: 50 64 227 227 (164892800)
- I0802 16:51:32.013213 2858 net.cpp:139] Memory required for data: 1350060400
- I0802 16:51:32.013221 2858 layer_factory.hpp:77] Creating layer scale1
- I0802 16:51:32.013226 2858 net.cpp:86] Creating Layer scale1
- I0802 16:51:32.013228 2858 net.cpp:408] scale1 <- conv1
- I0802 16:51:32.013233 2858 net.cpp:369] scale1 -> conv1 (in-place)
- I0802 16:51:32.013275 2858 layer_factory.hpp:77] Creating layer scale1
- I0802 16:51:32.013424 2858 net.cpp:124] Setting up scale1
- I0802 16:51:32.013429 2858 net.cpp:131] Top shape: 50 64 227 227 (164892800)
- I0802 16:51:32.013433 2858 net.cpp:139] Memory required for data: 2009631600
- I0802 16:51:32.013438 2858 layer_factory.hpp:77] Creating layer relu1
- I0802 16:51:32.013443 2858 net.cpp:86] Creating Layer relu1
- I0802 16:51:32.013447 2858 net.cpp:408] relu1 <- conv1
- I0802 16:51:32.013450 2858 net.cpp:369] relu1 -> conv1 (in-place)
- I0802 16:51:32.013588 2858 net.cpp:124] Setting up relu1
- I0802 16:51:32.013594 2858 net.cpp:131] Top shape: 50 64 227 227 (164892800)
- I0802 16:51:32.013597 2858 net.cpp:139] Memory required for data: 2669202800
- I0802 16:51:32.013600 2858 layer_factory.hpp:77] Creating layer norm1
- I0802 16:51:32.013604 2858 net.cpp:86] Creating Layer norm1
- I0802 16:51:32.013607 2858 net.cpp:408] norm1 <- conv1
- I0802 16:51:32.013612 2858 net.cpp:382] norm1 -> norm1
- I0802 16:51:32.013787 2858 net.cpp:124] Setting up norm1
- I0802 16:51:32.013794 2858 net.cpp:131] Top shape: 50 64 227 227 (164892800)
- I0802 16:51:32.013797 2858 net.cpp:139] Memory required for data: 3328774000
- I0802 16:51:32.013800 2858 layer_factory.hpp:77] Creating layer pool1
- I0802 16:51:32.013806 2858 net.cpp:86] Creating Layer pool1
- I0802 16:51:32.013809 2858 net.cpp:408] pool1 <- norm1
- I0802 16:51:32.013813 2858 net.cpp:382] pool1 -> pool1
- I0802 16:51:32.013840 2858 net.cpp:124] Setting up pool1
- I0802 16:51:32.013845 2858 net.cpp:131] Top shape: 50 64 113 113 (40860800)
- I0802 16:51:32.013849 2858 net.cpp:139] Memory required for data: 3492217200
- I0802 16:51:32.013851 2858 layer_factory.hpp:77] Creating layer conv2
- I0802 16:51:32.013859 2858 net.cpp:86] Creating Layer conv2
- I0802 16:51:32.013862 2858 net.cpp:408] conv2 <- pool1
- I0802 16:51:32.013867 2858 net.cpp:382] conv2 -> conv2
- I0802 16:51:32.017405 2858 net.cpp:124] Setting up conv2
- I0802 16:51:32.017417 2858 net.cpp:131] Top shape: 50 256 113 113 (163443200)
- I0802 16:51:32.017421 2858 net.cpp:139] Memory required for data: 4145990000
- I0802 16:51:32.017431 2858 layer_factory.hpp:77] Creating layer relu2
- I0802 16:51:32.017436 2858 net.cpp:86] Creating Layer relu2
- I0802 16:51:32.017439 2858 net.cpp:408] relu2 <- conv2
- I0802 16:51:32.017444 2858 net.cpp:369] relu2 -> conv2 (in-place)
- I0802 16:51:32.017783 2858 net.cpp:124] Setting up relu2
- I0802 16:51:32.017791 2858 net.cpp:131] Top shape: 50 256 113 113 (163443200)
- I0802 16:51:32.017794 2858 net.cpp:139] Memory required for data: 4799762800
- I0802 16:51:32.017798 2858 layer_factory.hpp:77] Creating layer norm2
- I0802 16:51:32.017803 2858 net.cpp:86] Creating Layer norm2
- I0802 16:51:32.017807 2858 net.cpp:408] norm2 <- conv2
- I0802 16:51:32.017810 2858 net.cpp:382] norm2 -> norm2
- I0802 16:51:32.017953 2858 net.cpp:124] Setting up norm2
- I0802 16:51:32.017959 2858 net.cpp:131] Top shape: 50 256 113 113 (163443200)
- I0802 16:51:32.017961 2858 net.cpp:139] Memory required for data: 5453535600
- I0802 16:51:32.017964 2858 layer_factory.hpp:77] Creating layer pool2
- I0802 16:51:32.017969 2858 net.cpp:86] Creating Layer pool2
- I0802 16:51:32.017999 2858 net.cpp:408] pool2 <- norm2
- I0802 16:51:32.018003 2858 net.cpp:382] pool2 -> pool2
- I0802 16:51:32.018060 2858 net.cpp:124] Setting up pool2
- I0802 16:51:32.018065 2858 net.cpp:131] Top shape: 50 256 56 56 (40140800)
- I0802 16:51:32.018067 2858 net.cpp:139] Memory required for data: 5614098800
- I0802 16:51:32.018081 2858 layer_factory.hpp:77] Creating layer conv3
- I0802 16:51:32.018087 2858 net.cpp:86] Creating Layer conv3
- I0802 16:51:32.018105 2858 net.cpp:408] conv3 <- pool2
- I0802 16:51:32.018107 2858 net.cpp:382] conv3 -> conv3
- I0802 16:51:32.026690 2858 net.cpp:124] Setting up conv3
- I0802 16:51:32.026705 2858 net.cpp:131] Top shape: 50 384 56 56 (60211200)
- I0802 16:51:32.026707 2858 net.cpp:139] Memory required for data: 5854943600
- I0802 16:51:32.026727 2858 layer_factory.hpp:77] Creating layer relu3
- I0802 16:51:32.026748 2858 net.cpp:86] Creating Layer relu3
- I0802 16:51:32.026751 2858 net.cpp:408] relu3 <- conv3
- I0802 16:51:32.026772 2858 net.cpp:369] relu3 -> conv3 (in-place)
- I0802 16:51:32.026926 2858 net.cpp:124] Setting up relu3
- I0802 16:51:32.026932 2858 net.cpp:131] Top shape: 50 384 56 56 (60211200)
- I0802 16:51:32.026933 2858 net.cpp:139] Memory required for data: 6095788400
- I0802 16:51:32.026937 2858 layer_factory.hpp:77] Creating layer conv4
- I0802 16:51:32.026958 2858 net.cpp:86] Creating Layer conv4
- I0802 16:51:32.026962 2858 net.cpp:408] conv4 <- conv3
- I0802 16:51:32.026965 2858 net.cpp:382] conv4 -> conv4
- I0802 16:51:32.034037 2858 net.cpp:124] Setting up conv4
- I0802 16:51:32.034052 2858 net.cpp:131] Top shape: 50 384 56 56 (60211200)
- I0802 16:51:32.034054 2858 net.cpp:139] Memory required for data: 6336633200
- I0802 16:51:32.034060 2858 layer_factory.hpp:77] Creating layer relu4
- I0802 16:51:32.034080 2858 net.cpp:86] Creating Layer relu4
- I0802 16:51:32.034085 2858 net.cpp:408] relu4 <- conv4
- I0802 16:51:32.034106 2858 net.cpp:369] relu4 -> conv4 (in-place)
- I0802 16:51:32.034279 2858 net.cpp:124] Setting up relu4
- I0802 16:51:32.034286 2858 net.cpp:131] Top shape: 50 384 56 56 (60211200)
- I0802 16:51:32.034287 2858 net.cpp:139] Memory required for data: 6577478000
- I0802 16:51:32.034289 2858 layer_factory.hpp:77] Creating layer conv5
- I0802 16:51:32.034312 2858 net.cpp:86] Creating Layer conv5
- I0802 16:51:32.034314 2858 net.cpp:408] conv5 <- conv4
- I0802 16:51:32.034318 2858 net.cpp:382] conv5 -> conv5
- I0802 16:51:32.040576 2858 net.cpp:124] Setting up conv5
- I0802 16:51:32.040592 2858 net.cpp:131] Top shape: 50 256 56 56 (40140800)
- I0802 16:51:32.040611 2858 net.cpp:139] Memory required for data: 6738041200
- I0802 16:51:32.040618 2858 layer_factory.hpp:77] Creating layer relu5
- I0802 16:51:32.040626 2858 net.cpp:86] Creating Layer relu5
- I0802 16:51:32.040629 2858 net.cpp:408] relu5 <- conv5
- I0802 16:51:32.040635 2858 net.cpp:369] relu5 -> conv5 (in-place)
- I0802 16:51:32.040817 2858 net.cpp:124] Setting up relu5
- I0802 16:51:32.040823 2858 net.cpp:131] Top shape: 50 256 56 56 (40140800)
- I0802 16:51:32.040825 2858 net.cpp:139] Memory required for data: 6898604400
- I0802 16:51:32.040828 2858 layer_factory.hpp:77] Creating layer pool5
- I0802 16:51:32.040848 2858 net.cpp:86] Creating Layer pool5
- I0802 16:51:32.040849 2858 net.cpp:408] pool5 <- conv5
- I0802 16:51:32.040853 2858 net.cpp:382] pool5 -> pool5
- I0802 16:51:32.040920 2858 net.cpp:124] Setting up pool5
- I0802 16:51:32.040925 2858 net.cpp:131] Top shape: 50 256 28 28 (10035200)
- I0802 16:51:32.040927 2858 net.cpp:139] Memory required for data: 6938745200
- I0802 16:51:32.040930 2858 layer_factory.hpp:77] Creating layer fc6
- I0802 16:51:32.040936 2858 net.cpp:86] Creating Layer fc6
- I0802 16:51:32.040938 2858 net.cpp:408] fc6 <- pool5
- I0802 16:51:32.040942 2858 net.cpp:382] fc6 -> fc6
- I0802 16:52:04.265542 2858 net.cpp:124] Setting up fc6
- I0802 16:52:07.463881 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:07.463922 2858 net.cpp:139] Memory required for data: 6939564400
- I0802 16:52:08.317765 2858 layer_factory.hpp:77] Creating layer relu6
- I0802 16:52:08.367188 2858 net.cpp:86] Creating Layer relu6
- I0802 16:52:08.424886 2858 net.cpp:408] relu6 <- fc6
- I0802 16:52:08.596489 2858 net.cpp:369] relu6 -> fc6 (in-place)
- I0802 16:52:16.592660 2858 net.cpp:124] Setting up relu6
- I0802 16:52:16.592710 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:16.592722 2858 net.cpp:139] Memory required for data: 6940383600
- I0802 16:52:16.607672 2858 layer_factory.hpp:77] Creating layer drop6
- I0802 16:52:16.630497 2858 net.cpp:86] Creating Layer drop6
- I0802 16:52:16.630540 2858 net.cpp:408] drop6 <- fc6
- I0802 16:52:16.630563 2858 net.cpp:369] drop6 -> fc6 (in-place)
- I0802 16:52:16.676877 2858 net.cpp:124] Setting up drop6
- I0802 16:52:16.676910 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:16.676918 2858 net.cpp:139] Memory required for data: 6941202800
- I0802 16:52:16.676931 2858 layer_factory.hpp:77] Creating layer fc7
- I0802 16:52:16.682255 2858 net.cpp:86] Creating Layer fc7
- I0802 16:52:16.682273 2858 net.cpp:408] fc7 <- fc6
- I0802 16:52:16.682291 2858 net.cpp:382] fc7 -> fc7
- I0802 16:52:17.076421 2858 net.cpp:124] Setting up fc7
- I0802 16:52:17.076830 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:17.076838 2858 net.cpp:139] Memory required for data: 6942022000
- I0802 16:52:17.076848 2858 layer_factory.hpp:77] Creating layer relu7
- I0802 16:52:17.076869 2858 net.cpp:86] Creating Layer relu7
- I0802 16:52:17.076887 2858 net.cpp:408] relu7 <- fc7
- I0802 16:52:17.076892 2858 net.cpp:369] relu7 -> fc7 (in-place)
- I0802 16:52:17.127794 2858 net.cpp:124] Setting up relu7
- I0802 16:52:17.127849 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:17.127859 2858 net.cpp:139] Memory required for data: 6942841200
- I0802 16:52:17.127871 2858 layer_factory.hpp:77] Creating layer drop7
- I0802 16:52:17.127898 2858 net.cpp:86] Creating Layer drop7
- I0802 16:52:17.127913 2858 net.cpp:408] drop7 <- fc7
- I0802 16:52:17.127929 2858 net.cpp:369] drop7 -> fc7 (in-place)
- I0802 16:52:17.128054 2858 net.cpp:124] Setting up drop7
- I0802 16:52:17.128072 2858 net.cpp:131] Top shape: 50 4096 (204800)
- I0802 16:52:17.128082 2858 net.cpp:139] Memory required for data: 6943660400
- I0802 16:52:17.128088 2858 layer_factory.hpp:77] Creating layer fc8
- I0802 16:52:17.128105 2858 net.cpp:86] Creating Layer fc8
- I0802 16:52:17.128116 2858 net.cpp:408] fc8 <- fc7
- I0802 16:52:17.128137 2858 net.cpp:382] fc8 -> fc8
- I0802 16:52:17.330720 2858 net.cpp:124] Setting up fc8
- I0802 16:52:17.330772 2858 net.cpp:131] Top shape: 50 1000 (50000)
- I0802 16:52:17.330781 2858 net.cpp:139] Memory required for data: 6943860400
- I0802 16:52:17.330806 2858 layer_factory.hpp:77] Creating layer fc8_fc8_0_split
- I0802 16:52:17.420990 2858 net.cpp:86] Creating Layer fc8_fc8_0_split
- I0802 16:52:17.421036 2858 net.cpp:408] fc8_fc8_0_split <- fc8
- I0802 16:52:17.421066 2858 net.cpp:382] fc8_fc8_0_split -> fc8_fc8_0_split_0
- I0802 16:52:17.421098 2858 net.cpp:382] fc8_fc8_0_split -> fc8_fc8_0_split_1
- I0802 16:52:17.433820 2858 net.cpp:124] Setting up fc8_fc8_0_split
- I0802 16:52:17.433867 2858 net.cpp:131] Top shape: 50 1000 (50000)
- I0802 16:52:17.433878 2858 net.cpp:131] Top shape: 50 1000 (50000)
- I0802 16:52:17.433887 2858 net.cpp:139] Memory required for data: 6944260400
- I0802 16:52:17.433907 2858 layer_factory.hpp:77] Creating layer accuracy
- I0802 16:52:17.490643 2858 net.cpp:86] Creating Layer accuracy
- I0802 16:52:17.490690 2858 net.cpp:408] accuracy <- fc8_fc8_0_split_0
- I0802 16:52:17.490710 2858 net.cpp:408] accuracy <- label_data_1_split_0
- I0802 16:52:17.490727 2858 net.cpp:382] accuracy -> accuracy
- I0802 16:52:17.525151 2858 net.cpp:124] Setting up accuracy
- I0802 16:52:17.525197 2858 net.cpp:131] Top shape: (1)
- I0802 16:52:17.525207 2858 net.cpp:139] Memory required for data: 6944260404
- I0802 16:52:17.525218 2858 layer_factory.hpp:77] Creating layer loss
- I0802 16:52:17.525286 2858 net.cpp:86] Creating Layer loss
- I0802 16:52:17.525297 2858 net.cpp:408] loss <- fc8_fc8_0_split_1
- I0802 16:52:17.525315 2858 net.cpp:408] loss <- label_data_1_split_1
- I0802 16:52:17.629770 2858 net.cpp:382] loss -> loss
- I0802 16:52:17.822856 2858 layer_factory.hpp:77] Creating layer loss
- I0802 16:52:18.057018 2858 net.cpp:124] Setting up loss
- I0802 16:52:18.057070 2858 net.cpp:131] Top shape: (1)
- I0802 16:52:18.057077 2858 net.cpp:134] with loss weight 1
- I0802 16:52:18.161309 2858 net.cpp:139] Memory required for data: 6944260408
- I0802 16:52:18.161334 2858 net.cpp:200] loss needs backward computation.
- I0802 16:52:18.161356 2858 net.cpp:202] accuracy does not need backward computation.
- I0802 16:52:18.161370 2858 net.cpp:200] fc8_fc8_0_split needs backward computation.
- I0802 16:52:18.161378 2858 net.cpp:200] fc8 needs backward computation.
- I0802 16:52:18.161389 2858 net.cpp:200] drop7 needs backward computation.
- I0802 16:52:18.161402 2858 net.cpp:200] relu7 needs backward computation.
- I0802 16:52:18.161413 2858 net.cpp:200] fc7 needs backward computation.
- I0802 16:52:18.161423 2858 net.cpp:200] drop6 needs backward computation.
- I0802 16:52:18.161433 2858 net.cpp:200] relu6 needs backward computation.
- I0802 16:52:18.161445 2858 net.cpp:200] fc6 needs backward computation.
- I0802 16:52:18.161454 2858 net.cpp:200] pool5 needs backward computation.
- I0802 16:52:18.161464 2858 net.cpp:200] relu5 needs backward computation.
- I0802 16:52:18.161475 2858 net.cpp:200] conv5 needs backward computation.
- I0802 16:52:18.172785 2858 net.cpp:200] relu4 needs backward computation.
- I0802 16:52:18.172827 2858 net.cpp:200] conv4 needs backward computation.
- I0802 16:52:18.214609 2858 net.cpp:200] relu3 needs backward computation.
- I0802 16:52:18.219866 2858 net.cpp:200] conv3 needs backward computation.
- I0802 16:52:18.219916 2858 net.cpp:200] pool2 needs backward computation.
- I0802 16:52:18.219926 2858 net.cpp:200] norm2 needs backward computation.
- I0802 16:52:18.219935 2858 net.cpp:200] relu2 needs backward computation.
- I0802 16:52:18.219943 2858 net.cpp:200] conv2 needs backward computation.
- I0802 16:52:18.219959 2858 net.cpp:200] pool1 needs backward computation.
- I0802 16:52:18.229528 2858 net.cpp:200] norm1 needs backward computation.
- I0802 16:52:18.229575 2858 net.cpp:200] relu1 needs backward computation.
- I0802 16:52:18.229585 2858 net.cpp:200] scale1 needs backward computation.
- I0802 16:52:18.229595 2858 net.cpp:200] bn1 needs backward computation.
- I0802 16:52:18.229602 2858 net.cpp:200] conv1 needs backward computation.
- I0802 16:52:18.229616 2858 net.cpp:202] label_data_1_split does not need backward computation.
- I0802 16:52:18.229629 2858 net.cpp:202] data does not need backward computation.
- I0802 16:52:18.229641 2858 net.cpp:244] This network produces output accuracy
- I0802 16:52:18.229655 2858 net.cpp:244] This network produces output loss
- I0802 16:52:18.229758 2858 net.cpp:257] Network initialization done.
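Each "Memory required for data" line above is a running total of top-blob element counts times 4 bytes (Caffe blobs are single-precision float by default), and the increments can be checked directly against the logged shapes, e.g. norm1's 164,892,800-element top adds 659,571,200 bytes. A quick sanity check using figures taken from the log:

```python
BYTES_PER_FLOAT = 4  # Caffe blobs default to float32

def mem_after(prev_total, top_elems):
    """Cumulative 'Memory required for data' after registering one top blob."""
    return prev_total + top_elems * BYTES_PER_FLOAT

# Figures from the log above:
assert mem_after(2669202800, 50 * 64 * 227 * 227) == 3328774000  # norm1
assert mem_after(6943660400, 50 * 1000) == 6943860400            # fc8
print("data blobs alone: %.2f GB" % (6944260408 / 1e9))          # ~6.94 GB
```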
- I0802 16:52:18.840412 2858 solver.cpp:56] Solver scaffolding done.
- I0802 16:52:18.998049 2858 caffe.cpp:248] Starting Optimization
- I0802 16:52:18.998095 2858 solver.cpp:273] Solving AlexNet
- I0802 16:52:18.998103 2858 solver.cpp:274] Learning Rate Policy: step
- I0802 16:52:19.448653 2858 solver.cpp:331] Iteration 0, Testing net (#0)
- F0802 16:52:22.157035 2858 syncedmem.cpp:71] Check failed: error == cudaSuccess (2 vs. 0) out of memory
- *** Check failure stack trace: ***
- @ 0x7f856777c5cd google::LogMessage::Fail()
- @ 0x7f856777e433 google::LogMessage::SendToLog()
- @ 0x7f856777c15b google::LogMessage::Flush()
- @ 0x7f856777ee1e google::LogMessageFatal::~LogMessageFatal()
- @ 0x7f8567db3ad8 caffe::SyncedMemory::mutable_gpu_data()
- @ 0x7f8567f280f2 caffe::Blob<>::mutable_gpu_data()
- @ 0x7f8567f5be48 caffe::CuDNNConvolutionLayer<>::Forward_gpu()
- @ 0x7f8567dc28b1 caffe::Net<>::ForwardFromTo()
- @ 0x7f8567dc29b7 caffe::Net<>::Forward()
- @ 0x7f8567f1acb2 caffe::Solver<>::Test()
- @ 0x7f8567f1b6ce caffe::Solver<>::TestAll()
- @ 0x7f8567f1ec27 caffe::Solver<>::Step()
- @ 0x7f8567f1eeea caffe::Solver<>::Solve()
- @ 0x40aeb4 train()
- @ 0x4075a8 main
- @ 0x7f8565f13830 __libc_start_main
- @ 0x407e79 _start
- @ (nil) (unknown)
- Aborted (core dumped)
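The fatal `Check failed: error == cudaSuccess (2 vs. 0) out of memory` fires during the iteration-0 test pass: the data blobs of the net set up above (batch 50) already total about 6.94 GB, and the weights, gradients, cuDNN workspace, and the other net's blobs must all fit alongside them in the GTX 1070's 8 GB. Since activation memory scales roughly linearly with batch size, the usual remedies are lowering `batch_size` in the `data_param` blocks of the prototxt, or restoring a strided conv1 so the early feature maps shrink. A rough scaling estimate; the exact headroom needed depends on parameter and workspace memory, which this log does not itemize:

```python
DATA_BYTES = 6944260408   # last "Memory required for data" line in the log
BATCH = 50                # batch size implied by the logged top shapes
GPU_BYTES = 8 * 1024**3   # GeForce GTX 1070: 8 GiB

def data_bytes_at_batch(new_batch):
    """Activation memory scales ~linearly with batch size."""
    return DATA_BYTES * new_batch // BATCH

# At batch 50 the data blobs alone consume most of the card:
assert data_bytes_at_batch(50) > GPU_BYTES * 0.8
# Halving twice (batch ~12-16) leaves room for weights and the second net:
assert data_bytes_at_batch(16) < GPU_BYTES // 2
```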