- ninja@ninja:~/caffe$ ./build/tools/caffe train --solver=models/bvlc_alexnet/solvermnist.prototxt
- I1231 09:58:24.341311 3750 caffe.cpp:217] Using GPUs 0
- I1231 09:58:24.407222 3750 caffe.cpp:222] GPU 0: GeForce GTX 1070
- I1231 09:58:24.956543 3750 solver.cpp:48] Initializing solver from parameters:
- test_iter: 1000
- test_interval: 1000
- base_lr: 0.01
- display: 20
- max_iter: 2000
- lr_policy: "step"
- gamma: 0.1
- momentum: 0.9
- weight_decay: 0.0005
- stepsize: 1000
- snapshot: 1000
- snapshot_prefix: "models/bvlc_alexnet/caffe_alexnet_train"
- solver_mode: GPU
- device_id: 0
- net: "models/bvlc_alexnet/train_valmnist.prototxt"
- train_state {
- level: 0
- stage: ""
- }
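The solver parameters above use `lr_policy: "step"`. As a minimal sketch (plain Python, no Caffe needed), this is how that policy evolves the learning rate from the logged values `base_lr: 0.01`, `gamma: 0.1`, `stepsize: 1000`:

```python
def step_lr(iteration, base_lr=0.01, gamma=0.1, stepsize=1000):
    """Caffe "step" policy: lr = base_lr * gamma ^ floor(iteration / stepsize)."""
    return base_lr * gamma ** (iteration // stepsize)

# With max_iter: 2000 the rate drops exactly once, at iteration 1000.
print(step_lr(0))     # 0.01
print(step_lr(999))   # 0.01
print(step_lr(1000))  # ~0.001 for the remaining iterations
```

So this run trains 1000 iterations at 0.01 and 1000 more at 0.001, snapshotting at iteration 1000 and 2000 per `snapshot: 1000`.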
- I1231 09:58:24.956770 3750 solver.cpp:91] Creating training net from net file: models/bvlc_alexnet/train_valmnist.prototxt
- I1231 09:58:24.957945 3750 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
- I1231 09:58:24.957989 3750 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
- I1231 09:58:24.958338 3750 net.cpp:58] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TRAIN
- level: 0
- stage: ""
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TRAIN
- }
- transform_param {
- scale: 0.00390625
- }
- data_param {
- source: "examples/mnist/mnist_train_lmdb"
- batch_size: 64
- backend: LMDB
- }
- }
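The data layer's `transform_param { scale: 0.00390625 }` is exactly 1/256: it maps raw MNIST pixel bytes in [0, 255] to roughly [0, 1) before conv1 sees them. A quick check (no mean subtraction is configured in this layer):

```python
SCALE = 0.00390625          # the logged transform_param scale
assert SCALE == 1 / 256     # i.e. 2 ** -8, chosen to normalize byte pixels

def transform(pixel):
    # Caffe's data transformer here applies only: value = pixel * scale
    return pixel * SCALE

print(transform(0), transform(255))  # 0.0 0.99609375
```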
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 96
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 1
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
- I1231 09:58:24.958807 3750 layer_factory.hpp:77] Creating layer data
- I1231 09:58:24.960191 3750 net.cpp:100] Creating Layer data
- I1231 09:58:24.960222 3750 net.cpp:408] data -> data
- I1231 09:58:24.960258 3750 net.cpp:408] data -> label
- I1231 09:58:24.962350 3759 db_lmdb.cpp:35] Opened lmdb examples/mnist/mnist_train_lmdb
- I1231 09:58:25.051169 3750 data_layer.cpp:41] output data size: 64,1,28,28
- I1231 09:58:25.053405 3750 net.cpp:150] Setting up data
- I1231 09:58:25.053428 3750 net.cpp:157] Top shape: 64 1 28 28 (50176)
- I1231 09:58:25.053438 3750 net.cpp:157] Top shape: 64 (64)
- I1231 09:58:25.053444 3750 net.cpp:165] Memory required for data: 200960
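The "Memory required for data" counter sums every top blob allocated so far at 4 bytes per float32 element. Reproducing the 200960 figure from the two top shapes the data layer just logged:

```python
data_elems = 64 * 1 * 28 * 28   # Top shape: 64 1 28 28 -> 50176 elements
label_elems = 64                # Top shape: 64          -> 64 elements
print((data_elems + label_elems) * 4)  # 200960 bytes, as logged
```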
- I1231 09:58:25.053458 3750 layer_factory.hpp:77] Creating layer conv1
- I1231 09:58:25.053489 3750 net.cpp:100] Creating Layer conv1
- I1231 09:58:25.053498 3750 net.cpp:434] conv1 <- data
- I1231 09:58:25.053514 3750 net.cpp:408] conv1 -> conv1
- I1231 09:58:25.055207 3750 net.cpp:150] Setting up conv1
- I1231 09:58:25.055236 3750 net.cpp:157] Top shape: 64 96 26 26 (4153344)
- I1231 09:58:25.055241 3750 net.cpp:165] Memory required for data: 16814336
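The conv1 spatial size follows Caffe's convolution output rule (floor division). A minimal sketch, checked against two shapes in this log, 28 -> 26 for conv1 and pool1's 24 -> 26 for conv2 (which uses `pad: 2`):

```python
def conv_out(in_size, kernel, stride=1, pad=0):
    # Caffe convolution: output = floor((in + 2*pad - kernel) / stride) + 1
    return (in_size + 2 * pad - kernel) // stride + 1

print(conv_out(28, 3))         # 26 -> conv1's "Top shape: 64 96 26 26"
print(conv_out(24, 3, pad=2))  # 26 -> conv2's "Top shape: 64 256 26 26"
```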
- I1231 09:58:25.055268 3750 layer_factory.hpp:77] Creating layer relu1
- I1231 09:58:25.055281 3750 net.cpp:100] Creating Layer relu1
- I1231 09:58:25.055289 3750 net.cpp:434] relu1 <- conv1
- I1231 09:58:25.055299 3750 net.cpp:395] relu1 -> conv1 (in-place)
- I1231 09:58:25.055310 3750 net.cpp:150] Setting up relu1
- I1231 09:58:25.055318 3750 net.cpp:157] Top shape: 64 96 26 26 (4153344)
- I1231 09:58:25.055326 3750 net.cpp:165] Memory required for data: 33427712
- I1231 09:58:25.055335 3750 layer_factory.hpp:77] Creating layer norm1
- I1231 09:58:25.055346 3750 net.cpp:100] Creating Layer norm1
- I1231 09:58:25.055352 3750 net.cpp:434] norm1 <- conv1
- I1231 09:58:25.055362 3750 net.cpp:408] norm1 -> norm1
- I1231 09:58:25.055415 3750 net.cpp:150] Setting up norm1
- I1231 09:58:25.055424 3750 net.cpp:157] Top shape: 64 96 26 26 (4153344)
- I1231 09:58:25.055430 3750 net.cpp:165] Memory required for data: 50041088
- I1231 09:58:25.055438 3750 layer_factory.hpp:77] Creating layer pool1
- I1231 09:58:25.055449 3750 net.cpp:100] Creating Layer pool1
- I1231 09:58:25.055454 3750 net.cpp:434] pool1 <- norm1
- I1231 09:58:25.055462 3750 net.cpp:408] pool1 -> pool1
- I1231 09:58:25.055522 3750 net.cpp:150] Setting up pool1
- I1231 09:58:25.055533 3750 net.cpp:157] Top shape: 64 96 24 24 (3538944)
- I1231 09:58:25.055563 3750 net.cpp:165] Memory required for data: 64196864
- I1231 09:58:25.055572 3750 layer_factory.hpp:77] Creating layer conv2
- I1231 09:58:25.055588 3750 net.cpp:100] Creating Layer conv2
- I1231 09:58:25.055593 3750 net.cpp:434] conv2 <- pool1
- I1231 09:58:25.055603 3750 net.cpp:408] conv2 -> conv2
- I1231 09:58:25.059763 3750 net.cpp:150] Setting up conv2
- I1231 09:58:25.059779 3750 net.cpp:157] Top shape: 64 256 26 26 (11075584)
- I1231 09:58:25.059784 3750 net.cpp:165] Memory required for data: 108499200
- I1231 09:58:25.059795 3750 layer_factory.hpp:77] Creating layer relu2
- I1231 09:58:25.059803 3750 net.cpp:100] Creating Layer relu2
- I1231 09:58:25.059809 3750 net.cpp:434] relu2 <- conv2
- I1231 09:58:25.059816 3750 net.cpp:395] relu2 -> conv2 (in-place)
- I1231 09:58:25.059824 3750 net.cpp:150] Setting up relu2
- I1231 09:58:25.059830 3750 net.cpp:157] Top shape: 64 256 26 26 (11075584)
- I1231 09:58:25.059835 3750 net.cpp:165] Memory required for data: 152801536
- I1231 09:58:25.059840 3750 layer_factory.hpp:77] Creating layer norm2
- I1231 09:58:25.059847 3750 net.cpp:100] Creating Layer norm2
- I1231 09:58:25.059851 3750 net.cpp:434] norm2 <- conv2
- I1231 09:58:25.059859 3750 net.cpp:408] norm2 -> norm2
- I1231 09:58:25.059898 3750 net.cpp:150] Setting up norm2
- I1231 09:58:25.059906 3750 net.cpp:157] Top shape: 64 256 26 26 (11075584)
- I1231 09:58:25.059908 3750 net.cpp:165] Memory required for data: 197103872
- I1231 09:58:25.059913 3750 layer_factory.hpp:77] Creating layer pool2
- I1231 09:58:25.059919 3750 net.cpp:100] Creating Layer pool2
- I1231 09:58:25.059923 3750 net.cpp:434] pool2 <- norm2
- I1231 09:58:25.059928 3750 net.cpp:408] pool2 -> pool2
- I1231 09:58:25.059962 3750 net.cpp:150] Setting up pool2
- I1231 09:58:25.059968 3750 net.cpp:157] Top shape: 64 256 13 13 (2768896)
- I1231 09:58:25.059972 3750 net.cpp:165] Memory required for data: 208179456
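Unlike convolution, Caffe's pooling rounds the output size up (ceil), which is why pool2 maps 26 -> 13 here rather than floor's 12. A sketch consistent with pool1, pool2, and pool5 in this log:

```python
import math

def pool_out(in_size, kernel, stride, pad=0):
    # Caffe pooling: output = ceil((in + 2*pad - kernel) / stride) + 1
    return math.ceil((in_size + 2 * pad - kernel) / stride) + 1

print(pool_out(26, 3, 1))  # 24 -> pool1 (kernel 3, stride 1)
print(pool_out(26, 3, 2))  # 13 -> pool2 (floor would give 12)
print(pool_out(13, 3, 2))  # 6  -> pool5's "... 256 6 6"
```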
- I1231 09:58:25.059975 3750 layer_factory.hpp:77] Creating layer conv3
- I1231 09:58:25.059984 3750 net.cpp:100] Creating Layer conv3
- I1231 09:58:25.059988 3750 net.cpp:434] conv3 <- pool2
- I1231 09:58:25.059993 3750 net.cpp:408] conv3 -> conv3
- I1231 09:58:25.070734 3750 net.cpp:150] Setting up conv3
- I1231 09:58:25.070754 3750 net.cpp:157] Top shape: 64 384 13 13 (4153344)
- I1231 09:58:25.070755 3750 net.cpp:165] Memory required for data: 224792832
- I1231 09:58:25.070765 3750 layer_factory.hpp:77] Creating layer relu3
- I1231 09:58:25.070771 3750 net.cpp:100] Creating Layer relu3
- I1231 09:58:25.070775 3750 net.cpp:434] relu3 <- conv3
- I1231 09:58:25.070780 3750 net.cpp:395] relu3 -> conv3 (in-place)
- I1231 09:58:25.070787 3750 net.cpp:150] Setting up relu3
- I1231 09:58:25.070792 3750 net.cpp:157] Top shape: 64 384 13 13 (4153344)
- I1231 09:58:25.070796 3750 net.cpp:165] Memory required for data: 241406208
- I1231 09:58:25.070798 3750 layer_factory.hpp:77] Creating layer conv4
- I1231 09:58:25.070806 3750 net.cpp:100] Creating Layer conv4
- I1231 09:58:25.070809 3750 net.cpp:434] conv4 <- conv3
- I1231 09:58:25.070814 3750 net.cpp:408] conv4 -> conv4
- I1231 09:58:25.077191 3750 net.cpp:150] Setting up conv4
- I1231 09:58:25.077200 3750 net.cpp:157] Top shape: 64 384 13 13 (4153344)
- I1231 09:58:25.077203 3750 net.cpp:165] Memory required for data: 258019584
- I1231 09:58:25.077208 3750 layer_factory.hpp:77] Creating layer relu4
- I1231 09:58:25.077214 3750 net.cpp:100] Creating Layer relu4
- I1231 09:58:25.077215 3750 net.cpp:434] relu4 <- conv4
- I1231 09:58:25.077219 3750 net.cpp:395] relu4 -> conv4 (in-place)
- I1231 09:58:25.077224 3750 net.cpp:150] Setting up relu4
- I1231 09:58:25.077226 3750 net.cpp:157] Top shape: 64 384 13 13 (4153344)
- I1231 09:58:25.077229 3750 net.cpp:165] Memory required for data: 274632960
- I1231 09:58:25.077232 3750 layer_factory.hpp:77] Creating layer conv5
- I1231 09:58:25.077239 3750 net.cpp:100] Creating Layer conv5
- I1231 09:58:25.077242 3750 net.cpp:434] conv5 <- conv4
- I1231 09:58:25.077247 3750 net.cpp:408] conv5 -> conv5
- I1231 09:58:25.081475 3750 net.cpp:150] Setting up conv5
- I1231 09:58:25.081492 3750 net.cpp:157] Top shape: 64 256 13 13 (2768896)
- I1231 09:58:25.081511 3750 net.cpp:165] Memory required for data: 285708544
- I1231 09:58:25.081521 3750 layer_factory.hpp:77] Creating layer relu5
- I1231 09:58:25.081526 3750 net.cpp:100] Creating Layer relu5
- I1231 09:58:25.081531 3750 net.cpp:434] relu5 <- conv5
- I1231 09:58:25.081535 3750 net.cpp:395] relu5 -> conv5 (in-place)
- I1231 09:58:25.081542 3750 net.cpp:150] Setting up relu5
- I1231 09:58:25.081547 3750 net.cpp:157] Top shape: 64 256 13 13 (2768896)
- I1231 09:58:25.081552 3750 net.cpp:165] Memory required for data: 296784128
- I1231 09:58:25.081555 3750 layer_factory.hpp:77] Creating layer pool5
- I1231 09:58:25.081560 3750 net.cpp:100] Creating Layer pool5
- I1231 09:58:25.081564 3750 net.cpp:434] pool5 <- conv5
- I1231 09:58:25.081570 3750 net.cpp:408] pool5 -> pool5
- I1231 09:58:25.081604 3750 net.cpp:150] Setting up pool5
- I1231 09:58:25.081610 3750 net.cpp:157] Top shape: 64 256 6 6 (589824)
- I1231 09:58:25.081612 3750 net.cpp:165] Memory required for data: 299143424
- I1231 09:58:25.081616 3750 layer_factory.hpp:77] Creating layer fc6
- I1231 09:58:25.081622 3750 net.cpp:100] Creating Layer fc6
- I1231 09:58:25.081625 3750 net.cpp:434] fc6 <- pool5
- I1231 09:58:25.081629 3750 net.cpp:408] fc6 -> fc6
- I1231 09:58:25.361205 3750 net.cpp:150] Setting up fc6
- I1231 09:58:25.361223 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.361225 3750 net.cpp:165] Memory required for data: 300192000
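Note the timestamp jump before "Setting up fc6" (~0.28 s versus microseconds for the conv layers). A plausible reason, sketched from the logged shapes: fc6's weight blob is pool5's flattened size times `num_output: 4096`, by far the largest allocation in the net:

```python
in_features = 256 * 6 * 6        # pool5 top: 256 x 6 x 6 = 9216
weights = in_features * 4096     # fc6 num_output: 4096
print(weights)                   # 37748736 weights (plus 4096 biases)
print(weights * 4 // 2 ** 20)    # 144 MiB of float32 to allocate and fill
```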
- I1231 09:58:25.361232 3750 layer_factory.hpp:77] Creating layer relu6
- I1231 09:58:25.361239 3750 net.cpp:100] Creating Layer relu6
- I1231 09:58:25.361241 3750 net.cpp:434] relu6 <- fc6
- I1231 09:58:25.361245 3750 net.cpp:395] relu6 -> fc6 (in-place)
- I1231 09:58:25.361253 3750 net.cpp:150] Setting up relu6
- I1231 09:58:25.361256 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.361258 3750 net.cpp:165] Memory required for data: 301240576
- I1231 09:58:25.361261 3750 layer_factory.hpp:77] Creating layer drop6
- I1231 09:58:25.361265 3750 net.cpp:100] Creating Layer drop6
- I1231 09:58:25.361268 3750 net.cpp:434] drop6 <- fc6
- I1231 09:58:25.361270 3750 net.cpp:395] drop6 -> fc6 (in-place)
- I1231 09:58:25.361285 3750 net.cpp:150] Setting up drop6
- I1231 09:58:25.361289 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.361291 3750 net.cpp:165] Memory required for data: 302289152
- I1231 09:58:25.361292 3750 layer_factory.hpp:77] Creating layer fc7
- I1231 09:58:25.361296 3750 net.cpp:100] Creating Layer fc7
- I1231 09:58:25.361299 3750 net.cpp:434] fc7 <- fc6
- I1231 09:58:25.361302 3750 net.cpp:408] fc7 -> fc7
- I1231 09:58:25.485281 3750 net.cpp:150] Setting up fc7
- I1231 09:58:25.485301 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.485302 3750 net.cpp:165] Memory required for data: 303337728
- I1231 09:58:25.485311 3750 layer_factory.hpp:77] Creating layer relu7
- I1231 09:58:25.485316 3750 net.cpp:100] Creating Layer relu7
- I1231 09:58:25.485318 3750 net.cpp:434] relu7 <- fc7
- I1231 09:58:25.485323 3750 net.cpp:395] relu7 -> fc7 (in-place)
- I1231 09:58:25.485330 3750 net.cpp:150] Setting up relu7
- I1231 09:58:25.485333 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.485334 3750 net.cpp:165] Memory required for data: 304386304
- I1231 09:58:25.485337 3750 layer_factory.hpp:77] Creating layer drop7
- I1231 09:58:25.485340 3750 net.cpp:100] Creating Layer drop7
- I1231 09:58:25.485342 3750 net.cpp:434] drop7 <- fc7
- I1231 09:58:25.485344 3750 net.cpp:395] drop7 -> fc7 (in-place)
- I1231 09:58:25.485357 3750 net.cpp:150] Setting up drop7
- I1231 09:58:25.485361 3750 net.cpp:157] Top shape: 64 4096 (262144)
- I1231 09:58:25.485363 3750 net.cpp:165] Memory required for data: 305434880
- I1231 09:58:25.485364 3750 layer_factory.hpp:77] Creating layer fc8
- I1231 09:58:25.485368 3750 net.cpp:100] Creating Layer fc8
- I1231 09:58:25.485370 3750 net.cpp:434] fc8 <- fc7
- I1231 09:58:25.485374 3750 net.cpp:408] fc8 -> fc8
- I1231 09:58:25.516559 3750 net.cpp:150] Setting up fc8
- I1231 09:58:25.516577 3750 net.cpp:157] Top shape: 64 1000 (64000)
- I1231 09:58:25.516598 3750 net.cpp:165] Memory required for data: 305690880
- I1231 09:58:25.516607 3750 layer_factory.hpp:77] Creating layer loss
- I1231 09:58:25.516614 3750 net.cpp:100] Creating Layer loss
- I1231 09:58:25.516618 3750 net.cpp:434] loss <- fc8
- I1231 09:58:25.516623 3750 net.cpp:434] loss <- label
- I1231 09:58:25.516628 3750 net.cpp:408] loss -> loss
- I1231 09:58:25.516643 3750 layer_factory.hpp:77] Creating layer loss
- I1231 09:58:25.516732 3750 net.cpp:150] Setting up loss
- I1231 09:58:25.516736 3750 net.cpp:157] Top shape: (1)
- I1231 09:58:25.516738 3750 net.cpp:160] with loss weight 1
- I1231 09:58:25.516757 3750 net.cpp:165] Memory required for data: 305690884
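The final training-net total is just the previous running total plus the loss layer's scalar top, "Top shape: (1)", at 4 bytes:

```python
print(305690880 + 1 * 4)  # 305690884 bytes, the train net's final activation total
```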
- I1231 09:58:25.516759 3750 net.cpp:226] loss needs backward computation.
- I1231 09:58:25.516765 3750 net.cpp:226] fc8 needs backward computation.
- I1231 09:58:25.516768 3750 net.cpp:226] drop7 needs backward computation.
- I1231 09:58:25.516770 3750 net.cpp:226] relu7 needs backward computation.
- I1231 09:58:25.516772 3750 net.cpp:226] fc7 needs backward computation.
- I1231 09:58:25.516775 3750 net.cpp:226] drop6 needs backward computation.
- I1231 09:58:25.516778 3750 net.cpp:226] relu6 needs backward computation.
- I1231 09:58:25.516782 3750 net.cpp:226] fc6 needs backward computation.
- I1231 09:58:25.516784 3750 net.cpp:226] pool5 needs backward computation.
- I1231 09:58:25.516788 3750 net.cpp:226] relu5 needs backward computation.
- I1231 09:58:25.516790 3750 net.cpp:226] conv5 needs backward computation.
- I1231 09:58:25.516793 3750 net.cpp:226] relu4 needs backward computation.
- I1231 09:58:25.516795 3750 net.cpp:226] conv4 needs backward computation.
- I1231 09:58:25.516798 3750 net.cpp:226] relu3 needs backward computation.
- I1231 09:58:25.516801 3750 net.cpp:226] conv3 needs backward computation.
- I1231 09:58:25.516804 3750 net.cpp:226] pool2 needs backward computation.
- I1231 09:58:25.516806 3750 net.cpp:226] norm2 needs backward computation.
- I1231 09:58:25.516809 3750 net.cpp:226] relu2 needs backward computation.
- I1231 09:58:25.516813 3750 net.cpp:226] conv2 needs backward computation.
- I1231 09:58:25.516814 3750 net.cpp:226] pool1 needs backward computation.
- I1231 09:58:25.516816 3750 net.cpp:226] norm1 needs backward computation.
- I1231 09:58:25.516819 3750 net.cpp:226] relu1 needs backward computation.
- I1231 09:58:25.516822 3750 net.cpp:226] conv1 needs backward computation.
- I1231 09:58:25.516825 3750 net.cpp:228] data does not need backward computation.
- I1231 09:58:25.516827 3750 net.cpp:270] This network produces output loss
- I1231 09:58:25.516837 3750 net.cpp:283] Network initialization done.
- I1231 09:58:25.517024 3750 solver.cpp:181] Creating test net (#0) specified by net file: models/bvlc_alexnet/train_valmnist.prototxt
- I1231 09:58:25.517045 3750 net.cpp:322] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
- I1231 09:58:25.517154 3750 net.cpp:58] Initializing net from parameters:
- name: "AlexNet"
- state {
- phase: TEST
- }
- layer {
- name: "data"
- type: "Data"
- top: "data"
- top: "label"
- include {
- phase: TEST
- }
- transform_param {
- scale: 0.00390625
- }
- data_param {
- source: "examples/mnist/mnist_test_lmdb"
- batch_size: 100
- backend: LMDB
- }
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 96
- kernel_size: 3
- stride: 1
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "conv1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "norm1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 1
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "pool1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "conv2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "norm2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "pool2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "drop6"
- type: "Dropout"
- bottom: "fc6"
- top: "fc6"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 0.1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- layer {
- name: "drop7"
- type: "Dropout"
- bottom: "fc7"
- top: "fc7"
- dropout_param {
- dropout_ratio: 0.5
- }
- }
- layer {
- name: "fc8"
- type: "InnerProduct"
- bottom: "fc7"
- top: "fc8"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 1000
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "accuracy"
- type: "Accuracy"
- bottom: "fc8"
- bottom: "label"
- top: "accuracy"
- include {
- phase: TEST
- }
- }
- layer {
- name: "loss"
- type: "SoftmaxWithLoss"
- bottom: "fc8"
- bottom: "label"
- top: "loss"
- }
- I1231 09:58:25.517351 3750 layer_factory.hpp:77] Creating layer data
- I1231 09:58:25.517536 3750 net.cpp:100] Creating Layer data
- I1231 09:58:25.517542 3750 net.cpp:408] data -> data
- I1231 09:58:25.517549 3750 net.cpp:408] data -> label
- I1231 09:58:25.529608 3762 db_lmdb.cpp:35] Opened lmdb examples/mnist/mnist_test_lmdb
- I1231 09:58:25.530697 3750 data_layer.cpp:41] output data size: 100,1,28,28
- I1231 09:58:25.531313 3750 net.cpp:150] Setting up data
- I1231 09:58:25.531321 3750 net.cpp:157] Top shape: 100 1 28 28 (78400)
- I1231 09:58:25.531325 3750 net.cpp:157] Top shape: 100 (100)
- I1231 09:58:25.531328 3750 net.cpp:165] Memory required for data: 314000
- I1231 09:58:25.531332 3750 layer_factory.hpp:77] Creating layer label_data_1_split
- I1231 09:58:25.531339 3750 net.cpp:100] Creating Layer label_data_1_split
- I1231 09:58:25.531343 3750 net.cpp:434] label_data_1_split <- label
- I1231 09:58:25.531348 3750 net.cpp:408] label_data_1_split -> label_data_1_split_0
- I1231 09:58:25.531354 3750 net.cpp:408] label_data_1_split -> label_data_1_split_1
- I1231 09:58:25.531381 3750 net.cpp:150] Setting up label_data_1_split
- I1231 09:58:25.531385 3750 net.cpp:157] Top shape: 100 (100)
- I1231 09:58:25.531389 3750 net.cpp:157] Top shape: 100 (100)
- I1231 09:58:25.531393 3750 net.cpp:165] Memory required for data: 314800
- I1231 09:58:25.531395 3750 layer_factory.hpp:77] Creating layer conv1
- I1231 09:58:25.531404 3750 net.cpp:100] Creating Layer conv1
- I1231 09:58:25.531406 3750 net.cpp:434] conv1 <- data
- I1231 09:58:25.531411 3750 net.cpp:408] conv1 -> conv1
- I1231 09:58:25.531586 3750 net.cpp:150] Setting up conv1
- I1231 09:58:25.531592 3750 net.cpp:157] Top shape: 100 96 26 26 (6489600)
- I1231 09:58:25.531596 3750 net.cpp:165] Memory required for data: 26273200
- I1231 09:58:25.531604 3750 layer_factory.hpp:77] Creating layer relu1
- I1231 09:58:25.531607 3750 net.cpp:100] Creating Layer relu1
- I1231 09:58:25.531611 3750 net.cpp:434] relu1 <- conv1
- I1231 09:58:25.531615 3750 net.cpp:395] relu1 -> conv1 (in-place)
- I1231 09:58:25.531618 3750 net.cpp:150] Setting up relu1
- I1231 09:58:25.531622 3750 net.cpp:157] Top shape: 100 96 26 26 (6489600)
- I1231 09:58:25.531625 3750 net.cpp:165] Memory required for data: 52231600
- I1231 09:58:25.531628 3750 layer_factory.hpp:77] Creating layer norm1
- I1231 09:58:25.531635 3750 net.cpp:100] Creating Layer norm1
- I1231 09:58:25.531636 3750 net.cpp:434] norm1 <- conv1
- I1231 09:58:25.531641 3750 net.cpp:408] norm1 -> norm1
- I1231 09:58:25.531661 3750 net.cpp:150] Setting up norm1
- I1231 09:58:25.531666 3750 net.cpp:157] Top shape: 100 96 26 26 (6489600)
- I1231 09:58:25.531668 3750 net.cpp:165] Memory required for data: 78190000
- I1231 09:58:25.531672 3750 layer_factory.hpp:77] Creating layer pool1
- I1231 09:58:25.531677 3750 net.cpp:100] Creating Layer pool1
- I1231 09:58:25.531679 3750 net.cpp:434] pool1 <- norm1
- I1231 09:58:25.531683 3750 net.cpp:408] pool1 -> pool1
- I1231 09:58:25.531702 3750 net.cpp:150] Setting up pool1
- I1231 09:58:25.531707 3750 net.cpp:157] Top shape: 100 96 24 24 (5529600)
- I1231 09:58:25.531709 3750 net.cpp:165] Memory required for data: 100308400
- I1231 09:58:25.531713 3750 layer_factory.hpp:77] Creating layer conv2
- I1231 09:58:25.531719 3750 net.cpp:100] Creating Layer conv2
- I1231 09:58:25.531723 3750 net.cpp:434] conv2 <- pool1
- I1231 09:58:25.531728 3750 net.cpp:408] conv2 -> conv2
- I1231 09:58:25.532605 3750 net.cpp:150] Setting up conv2
- I1231 09:58:25.532611 3750 net.cpp:157] Top shape: 100 256 26 26 (17305600)
- I1231 09:58:25.532614 3750 net.cpp:165] Memory required for data: 169530800
- I1231 09:58:25.532621 3750 layer_factory.hpp:77] Creating layer relu2
- I1231 09:58:25.532626 3750 net.cpp:100] Creating Layer relu2
- I1231 09:58:25.532630 3750 net.cpp:434] relu2 <- conv2
- I1231 09:58:25.532635 3750 net.cpp:395] relu2 -> conv2 (in-place)
- I1231 09:58:25.532639 3750 net.cpp:150] Setting up relu2
- I1231 09:58:25.532645 3750 net.cpp:157] Top shape: 100 256 26 26 (17305600)
- I1231 09:58:25.532649 3750 net.cpp:165] Memory required for data: 238753200
- I1231 09:58:25.532652 3750 layer_factory.hpp:77] Creating layer norm2
- I1231 09:58:25.532658 3750 net.cpp:100] Creating Layer norm2
- I1231 09:58:25.532661 3750 net.cpp:434] norm2 <- conv2
- I1231 09:58:25.532667 3750 net.cpp:408] norm2 -> norm2
- I1231 09:58:25.532724 3750 net.cpp:150] Setting up norm2
- I1231 09:58:25.532730 3750 net.cpp:157] Top shape: 100 256 26 26 (17305600)
- I1231 09:58:25.532733 3750 net.cpp:165] Memory required for data: 307975600
- I1231 09:58:25.532737 3750 layer_factory.hpp:77] Creating layer pool2
- I1231 09:58:25.532742 3750 net.cpp:100] Creating Layer pool2
- I1231 09:58:25.532747 3750 net.cpp:434] pool2 <- norm2
- I1231 09:58:25.532752 3750 net.cpp:408] pool2 -> pool2
- I1231 09:58:25.532773 3750 net.cpp:150] Setting up pool2
- I1231 09:58:25.532778 3750 net.cpp:157] Top shape: 100 256 13 13 (4326400)
- I1231 09:58:25.532781 3750 net.cpp:165] Memory required for data: 325281200
- I1231 09:58:25.532785 3750 layer_factory.hpp:77] Creating layer conv3
- I1231 09:58:25.532793 3750 net.cpp:100] Creating Layer conv3
- I1231 09:58:25.532795 3750 net.cpp:434] conv3 <- pool2
- I1231 09:58:25.532801 3750 net.cpp:408] conv3 -> conv3
- I1231 09:58:25.539577 3750 net.cpp:150] Setting up conv3
- I1231 09:58:25.539587 3750 net.cpp:157] Top shape: 100 384 13 13 (6489600)
- I1231 09:58:25.539592 3750 net.cpp:165] Memory required for data: 351239600
- I1231 09:58:25.539599 3750 layer_factory.hpp:77] Creating layer relu3
- I1231 09:58:25.539604 3750 net.cpp:100] Creating Layer relu3
- I1231 09:58:25.539608 3750 net.cpp:434] relu3 <- conv3
- I1231 09:58:25.539613 3750 net.cpp:395] relu3 -> conv3 (in-place)
- I1231 09:58:25.539621 3750 net.cpp:150] Setting up relu3
- I1231 09:58:25.539625 3750 net.cpp:157] Top shape: 100 384 13 13 (6489600)
- I1231 09:58:25.539629 3750 net.cpp:165] Memory required for data: 377198000
- I1231 09:58:25.539638 3750 layer_factory.hpp:77] Creating layer conv4
- I1231 09:58:25.539649 3750 net.cpp:100] Creating Layer conv4
- I1231 09:58:25.539652 3750 net.cpp:434] conv4 <- conv3
- I1231 09:58:25.539659 3750 net.cpp:408] conv4 -> conv4
- I1231 09:58:25.544891 3750 net.cpp:150] Setting up conv4
- I1231 09:58:25.544909 3750 net.cpp:157] Top shape: 100 384 13 13 (6489600)
- I1231 09:58:25.544912 3750 net.cpp:165] Memory required for data: 403156400
- I1231 09:58:25.544921 3750 layer_factory.hpp:77] Creating layer relu4
- I1231 09:58:25.544929 3750 net.cpp:100] Creating Layer relu4
- I1231 09:58:25.544931 3750 net.cpp:434] relu4 <- conv4
- I1231 09:58:25.544937 3750 net.cpp:395] relu4 -> conv4 (in-place)
- I1231 09:58:25.544944 3750 net.cpp:150] Setting up relu4
- I1231 09:58:25.544947 3750 net.cpp:157] Top shape: 100 384 13 13 (6489600)
- I1231 09:58:25.544950 3750 net.cpp:165] Memory required for data: 429114800
- I1231 09:58:25.544955 3750 layer_factory.hpp:77] Creating layer conv5
- I1231 09:58:25.544962 3750 net.cpp:100] Creating Layer conv5
- I1231 09:58:25.544965 3750 net.cpp:434] conv5 <- conv4
- I1231 09:58:25.544970 3750 net.cpp:408] conv5 -> conv5
- I1231 09:58:25.548626 3750 net.cpp:150] Setting up conv5
- I1231 09:58:25.548636 3750 net.cpp:157] Top shape: 100 256 13 13 (4326400)
- I1231 09:58:25.548640 3750 net.cpp:165] Memory required for data: 446420400
- I1231 09:58:25.548648 3750 layer_factory.hpp:77] Creating layer relu5
- I1231 09:58:25.548653 3750 net.cpp:100] Creating Layer relu5
- I1231 09:58:25.548656 3750 net.cpp:434] relu5 <- conv5
- I1231 09:58:25.548661 3750 net.cpp:395] relu5 -> conv5 (in-place)
- I1231 09:58:25.548666 3750 net.cpp:150] Setting up relu5
- I1231 09:58:25.548669 3750 net.cpp:157] Top shape: 100 256 13 13 (4326400)
- I1231 09:58:25.548673 3750 net.cpp:165] Memory required for data: 463726000
- I1231 09:58:25.548676 3750 layer_factory.hpp:77] Creating layer pool5
- I1231 09:58:25.548683 3750 net.cpp:100] Creating Layer pool5
- I1231 09:58:25.548686 3750 net.cpp:434] pool5 <- conv5
- I1231 09:58:25.548691 3750 net.cpp:408] pool5 -> pool5
- I1231 09:58:25.548715 3750 net.cpp:150] Setting up pool5
- I1231 09:58:25.548720 3750 net.cpp:157] Top shape: 100 256 6 6 (921600)
- I1231 09:58:25.548722 3750 net.cpp:165] Memory required for data: 467412400
- I1231 09:58:25.548727 3750 layer_factory.hpp:77] Creating layer fc6
- I1231 09:58:25.548732 3750 net.cpp:100] Creating Layer fc6
- I1231 09:58:25.548735 3750 net.cpp:434] fc6 <- pool5
- I1231 09:58:25.548739 3750 net.cpp:408] fc6 -> fc6
- I1231 09:58:25.825659 3750 net.cpp:150] Setting up fc6
- I1231 09:58:25.825680 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.825685 3750 net.cpp:165] Memory required for data: 469050800
- I1231 09:58:25.825693 3750 layer_factory.hpp:77] Creating layer relu6
- I1231 09:58:25.825701 3750 net.cpp:100] Creating Layer relu6
- I1231 09:58:25.825706 3750 net.cpp:434] relu6 <- fc6
- I1231 09:58:25.825712 3750 net.cpp:395] relu6 -> fc6 (in-place)
- I1231 09:58:25.825719 3750 net.cpp:150] Setting up relu6
- I1231 09:58:25.825724 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.825727 3750 net.cpp:165] Memory required for data: 470689200
- I1231 09:58:25.825731 3750 layer_factory.hpp:77] Creating layer drop6
- I1231 09:58:25.825736 3750 net.cpp:100] Creating Layer drop6
- I1231 09:58:25.825739 3750 net.cpp:434] drop6 <- fc6
- I1231 09:58:25.825744 3750 net.cpp:395] drop6 -> fc6 (in-place)
- I1231 09:58:25.825763 3750 net.cpp:150] Setting up drop6
- I1231 09:58:25.825767 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.825769 3750 net.cpp:165] Memory required for data: 472327600
- I1231 09:58:25.825772 3750 layer_factory.hpp:77] Creating layer fc7
- I1231 09:58:25.825776 3750 net.cpp:100] Creating Layer fc7
- I1231 09:58:25.825778 3750 net.cpp:434] fc7 <- fc6
- I1231 09:58:25.825781 3750 net.cpp:408] fc7 -> fc7
- I1231 09:58:25.950098 3750 net.cpp:150] Setting up fc7
- I1231 09:58:25.950117 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.950121 3750 net.cpp:165] Memory required for data: 473966000
- I1231 09:58:25.950130 3750 layer_factory.hpp:77] Creating layer relu7
- I1231 09:58:25.950139 3750 net.cpp:100] Creating Layer relu7
- I1231 09:58:25.950142 3750 net.cpp:434] relu7 <- fc7
- I1231 09:58:25.950148 3750 net.cpp:395] relu7 -> fc7 (in-place)
- I1231 09:58:25.950155 3750 net.cpp:150] Setting up relu7
- I1231 09:58:25.950160 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.950162 3750 net.cpp:165] Memory required for data: 475604400
- I1231 09:58:25.950166 3750 layer_factory.hpp:77] Creating layer drop7
- I1231 09:58:25.950172 3750 net.cpp:100] Creating Layer drop7
- I1231 09:58:25.950176 3750 net.cpp:434] drop7 <- fc7
- I1231 09:58:25.950179 3750 net.cpp:395] drop7 -> fc7 (in-place)
- I1231 09:58:25.950199 3750 net.cpp:150] Setting up drop7
- I1231 09:58:25.950203 3750 net.cpp:157] Top shape: 100 4096 (409600)
- I1231 09:58:25.950206 3750 net.cpp:165] Memory required for data: 477242800
- I1231 09:58:25.950207 3750 layer_factory.hpp:77] Creating layer fc8
- I1231 09:58:25.950212 3750 net.cpp:100] Creating Layer fc8
- I1231 09:58:25.950213 3750 net.cpp:434] fc8 <- fc7
- I1231 09:58:25.950217 3750 net.cpp:408] fc8 -> fc8
- I1231 09:58:25.980584 3750 net.cpp:150] Setting up fc8
- I1231 09:58:25.980602 3750 net.cpp:157] Top shape: 100 1000 (100000)
- I1231 09:58:25.980604 3750 net.cpp:165] Memory required for data: 477642800
- I1231 09:58:25.980612 3750 layer_factory.hpp:77] Creating layer fc8_fc8_0_split
- I1231 09:58:25.980618 3750 net.cpp:100] Creating Layer fc8_fc8_0_split
- I1231 09:58:25.980620 3750 net.cpp:434] fc8_fc8_0_split <- fc8
- I1231 09:58:25.980625 3750 net.cpp:408] fc8_fc8_0_split -> fc8_fc8_0_split_0
- I1231 09:58:25.980659 3750 net.cpp:408] fc8_fc8_0_split -> fc8_fc8_0_split_1
- I1231 09:58:25.980685 3750 net.cpp:150] Setting up fc8_fc8_0_split
- I1231 09:58:25.980689 3750 net.cpp:157] Top shape: 100 1000 (100000)
- I1231 09:58:25.980691 3750 net.cpp:157] Top shape: 100 1000 (100000)
- I1231 09:58:25.980693 3750 net.cpp:165] Memory required for data: 478442800
- I1231 09:58:25.980695 3750 layer_factory.hpp:77] Creating layer accuracy
- I1231 09:58:25.980700 3750 net.cpp:100] Creating Layer accuracy
- I1231 09:58:25.980701 3750 net.cpp:434] accuracy <- fc8_fc8_0_split_0
- I1231 09:58:25.980705 3750 net.cpp:434] accuracy <- label_data_1_split_0
- I1231 09:58:25.980707 3750 net.cpp:408] accuracy -> accuracy
- I1231 09:58:25.980712 3750 net.cpp:150] Setting up accuracy
- I1231 09:58:25.980715 3750 net.cpp:157] Top shape: (1)
- I1231 09:58:25.980718 3750 net.cpp:165] Memory required for data: 478442804
- I1231 09:58:25.980741 3750 layer_factory.hpp:77] Creating layer loss
- I1231 09:58:25.980746 3750 net.cpp:100] Creating Layer loss
- I1231 09:58:25.980749 3750 net.cpp:434] loss <- fc8_fc8_0_split_1
- I1231 09:58:25.980753 3750 net.cpp:434] loss <- label_data_1_split_1
- I1231 09:58:25.980757 3750 net.cpp:408] loss -> loss
- I1231 09:58:25.980764 3750 layer_factory.hpp:77] Creating layer loss
- I1231 09:58:25.980870 3750 net.cpp:150] Setting up loss
- I1231 09:58:25.980873 3750 net.cpp:157] Top shape: (1)
- I1231 09:58:25.980875 3750 net.cpp:160] with loss weight 1
- I1231 09:58:25.980883 3750 net.cpp:165] Memory required for data: 478442808
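An aside on the "Memory required for data" lines above: each increment appears to equal the new top blob's element count times 4 bytes (float32 activations). A quick sanity check against two of the logged values (a sketch, not part of the run):

```python
# Each "Memory required for data" increment in the log matches the new
# top blob's element count times 4 bytes (single-precision floats).
pool5_elems = 100 * 256 * 6 * 6           # pool5's logged top shape
print(pool5_elems)                        # 921600, the "(921600)" logged for pool5
print(463726000 + 4 * pool5_elems)        # 467412400, the total logged after pool5
print(478442804 + 4 * 1)                  # 478442808, after the scalar loss blob
```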
- I1231 09:58:25.980885 3750 net.cpp:226] loss needs backward computation.
- I1231 09:58:25.980888 3750 net.cpp:228] accuracy does not need backward computation.
- I1231 09:58:25.980890 3750 net.cpp:226] fc8_fc8_0_split needs backward computation.
- I1231 09:58:25.980892 3750 net.cpp:226] fc8 needs backward computation.
- I1231 09:58:25.980895 3750 net.cpp:226] drop7 needs backward computation.
- I1231 09:58:25.980896 3750 net.cpp:226] relu7 needs backward computation.
- I1231 09:58:25.980898 3750 net.cpp:226] fc7 needs backward computation.
- I1231 09:58:25.980901 3750 net.cpp:226] drop6 needs backward computation.
- I1231 09:58:25.980906 3750 net.cpp:226] relu6 needs backward computation.
- I1231 09:58:25.980908 3750 net.cpp:226] fc6 needs backward computation.
- I1231 09:58:25.980911 3750 net.cpp:226] pool5 needs backward computation.
- I1231 09:58:25.980913 3750 net.cpp:226] relu5 needs backward computation.
- I1231 09:58:25.980917 3750 net.cpp:226] conv5 needs backward computation.
- I1231 09:58:25.980921 3750 net.cpp:226] relu4 needs backward computation.
- I1231 09:58:25.980924 3750 net.cpp:226] conv4 needs backward computation.
- I1231 09:58:25.980927 3750 net.cpp:226] relu3 needs backward computation.
- I1231 09:58:25.980931 3750 net.cpp:226] conv3 needs backward computation.
- I1231 09:58:25.980936 3750 net.cpp:226] pool2 needs backward computation.
- I1231 09:58:25.980939 3750 net.cpp:226] norm2 needs backward computation.
- I1231 09:58:25.980942 3750 net.cpp:226] relu2 needs backward computation.
- I1231 09:58:25.980947 3750 net.cpp:226] conv2 needs backward computation.
- I1231 09:58:25.980949 3750 net.cpp:226] pool1 needs backward computation.
- I1231 09:58:25.980953 3750 net.cpp:226] norm1 needs backward computation.
- I1231 09:58:25.980957 3750 net.cpp:226] relu1 needs backward computation.
- I1231 09:58:25.980960 3750 net.cpp:226] conv1 needs backward computation.
- I1231 09:58:25.980964 3750 net.cpp:228] label_data_1_split does not need backward computation.
- I1231 09:58:25.980968 3750 net.cpp:228] data does not need backward computation.
- I1231 09:58:25.980973 3750 net.cpp:270] This network produces output accuracy
- I1231 09:58:25.980976 3750 net.cpp:270] This network produces output loss
- I1231 09:58:25.980988 3750 net.cpp:283] Network initialization done.
- I1231 09:58:25.981042 3750 solver.cpp:60] Solver scaffolding done.
- I1231 09:58:25.981361 3750 caffe.cpp:251] Starting Optimization
- I1231 09:58:25.981365 3750 solver.cpp:279] Solving AlexNet
- I1231 09:58:25.981369 3750 solver.cpp:280] Learning Rate Policy: step
- I1231 09:58:25.982930 3750 solver.cpp:337] Iteration 0, Testing net (#0)
- I1231 09:59:34.489457 3750 solver.cpp:404] Test net output #0: accuracy = 0
- I1231 09:59:34.489511 3750 solver.cpp:404] Test net output #1: loss = 6.91249 (* 1 = 6.91249 loss)
- I1231 09:59:34.574491 3750 solver.cpp:228] Iteration 0, loss = 6.90698
- I1231 09:59:34.574506 3750 solver.cpp:244] Train net output #0: loss = 6.90698 (* 1 = 6.90698 loss)
- I1231 09:59:34.574520 3750 sgd_solver.cpp:106] Iteration 0, lr = 0.01
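A note on the initial loss: fc8 still has 1000 outputs (top shape 100 1000 above) even though the data layer reads MNIST, and a softmax over K roughly uniform logits gives a cross-entropy near ln(K). The ~6.91 at iteration 0 matches ln(1000), and the ~2.3 plateau that follows matches ln(10), i.e. chance-level prediction over the 10 digit classes. A minimal check (hypothetical helper, not from the run):

```python
import math

# Expected cross-entropy of an untrained softmax classifier whose K
# output probabilities are roughly uniform: loss ~ ln(K).
def expected_initial_loss(num_classes):
    return math.log(num_classes)

print(expected_initial_loss(1000))  # ~6.9078, close to the 6.91 at iteration 0
print(expected_initial_loss(10))    # ~2.3026, the plateau seen in later iterations
```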
- I1231 09:59:37.203480 3750 solver.cpp:228] Iteration 20, loss = 2.87205
- I1231 09:59:37.203502 3750 solver.cpp:244] Train net output #0: loss = 2.87205 (* 1 = 2.87205 loss)
- I1231 09:59:37.203507 3750 sgd_solver.cpp:106] Iteration 20, lr = 0.01
- I1231 09:59:39.827961 3750 solver.cpp:228] Iteration 40, loss = 2.44488
- I1231 09:59:39.827983 3750 solver.cpp:244] Train net output #0: loss = 2.44488 (* 1 = 2.44488 loss)
- I1231 09:59:39.827987 3750 sgd_solver.cpp:106] Iteration 40, lr = 0.01
- I1231 09:59:42.452934 3750 solver.cpp:228] Iteration 60, loss = 2.30728
- I1231 09:59:42.452955 3750 solver.cpp:244] Train net output #0: loss = 2.30728 (* 1 = 2.30728 loss)
- I1231 09:59:42.452960 3750 sgd_solver.cpp:106] Iteration 60, lr = 0.01
- I1231 09:59:45.078403 3750 solver.cpp:228] Iteration 80, loss = 2.37298
- I1231 09:59:45.078423 3750 solver.cpp:244] Train net output #0: loss = 2.37298 (* 1 = 2.37298 loss)
- I1231 09:59:45.078426 3750 sgd_solver.cpp:106] Iteration 80, lr = 0.01
- I1231 09:59:47.703523 3750 solver.cpp:228] Iteration 100, loss = 2.35641
- I1231 09:59:47.703547 3750 solver.cpp:244] Train net output #0: loss = 2.35641 (* 1 = 2.35641 loss)
- I1231 09:59:47.703552 3750 sgd_solver.cpp:106] Iteration 100, lr = 0.01
- I1231 09:59:50.335674 3750 solver.cpp:228] Iteration 120, loss = 2.37395
- I1231 09:59:50.335695 3750 solver.cpp:244] Train net output #0: loss = 2.37395 (* 1 = 2.37395 loss)
- I1231 09:59:50.335700 3750 sgd_solver.cpp:106] Iteration 120, lr = 0.01
- I1231 09:59:52.969204 3750 solver.cpp:228] Iteration 140, loss = 2.40437
- I1231 09:59:52.969228 3750 solver.cpp:244] Train net output #0: loss = 2.40437 (* 1 = 2.40437 loss)
- I1231 09:59:52.969233 3750 sgd_solver.cpp:106] Iteration 140, lr = 0.01
- I1231 09:59:55.607323 3750 solver.cpp:228] Iteration 160, loss = 2.38931
- I1231 09:59:55.607345 3750 solver.cpp:244] Train net output #0: loss = 2.38931 (* 1 = 2.38931 loss)
- I1231 09:59:55.607350 3750 sgd_solver.cpp:106] Iteration 160, lr = 0.01
- I1231 09:59:58.231901 3750 solver.cpp:228] Iteration 180, loss = 2.37691
- I1231 09:59:58.231922 3750 solver.cpp:244] Train net output #0: loss = 2.37691 (* 1 = 2.37691 loss)
- I1231 09:59:58.231925 3750 sgd_solver.cpp:106] Iteration 180, lr = 0.01
- I1231 10:00:00.858958 3750 solver.cpp:228] Iteration 200, loss = 2.30506
- I1231 10:00:00.858978 3750 solver.cpp:244] Train net output #0: loss = 2.30506 (* 1 = 2.30506 loss)
- I1231 10:00:00.858984 3750 sgd_solver.cpp:106] Iteration 200, lr = 0.01
- I1231 10:00:03.483427 3750 solver.cpp:228] Iteration 220, loss = 2.31115
- I1231 10:00:03.483449 3750 solver.cpp:244] Train net output #0: loss = 2.31115 (* 1 = 2.31115 loss)
- I1231 10:00:03.483453 3750 sgd_solver.cpp:106] Iteration 220, lr = 0.01
- I1231 10:00:06.110460 3750 solver.cpp:228] Iteration 240, loss = 2.30202
- I1231 10:00:06.110561 3750 solver.cpp:244] Train net output #0: loss = 2.30202 (* 1 = 2.30202 loss)
- I1231 10:00:06.110566 3750 sgd_solver.cpp:106] Iteration 240, lr = 0.01
- I1231 10:00:08.747812 3750 solver.cpp:228] Iteration 260, loss = 2.31446
- I1231 10:00:08.747835 3750 solver.cpp:244] Train net output #0: loss = 2.31446 (* 1 = 2.31446 loss)
- I1231 10:00:08.747841 3750 sgd_solver.cpp:106] Iteration 260, lr = 0.01
- I1231 10:00:11.374794 3750 solver.cpp:228] Iteration 280, loss = 2.31768
- I1231 10:00:11.374815 3750 solver.cpp:244] Train net output #0: loss = 2.31768 (* 1 = 2.31768 loss)
- I1231 10:00:11.374820 3750 sgd_solver.cpp:106] Iteration 280, lr = 0.01
- I1231 10:00:14.000993 3750 solver.cpp:228] Iteration 300, loss = 2.35765
- I1231 10:00:14.001019 3750 solver.cpp:244] Train net output #0: loss = 2.35765 (* 1 = 2.35765 loss)
- I1231 10:00:14.001024 3750 sgd_solver.cpp:106] Iteration 300, lr = 0.01
- I1231 10:00:16.629060 3750 solver.cpp:228] Iteration 320, loss = 2.40547
- I1231 10:00:16.629081 3750 solver.cpp:244] Train net output #0: loss = 2.40547 (* 1 = 2.40547 loss)
- I1231 10:00:16.629086 3750 sgd_solver.cpp:106] Iteration 320, lr = 0.01
- I1231 10:00:19.260450 3750 solver.cpp:228] Iteration 340, loss = 2.35476
- I1231 10:00:19.260473 3750 solver.cpp:244] Train net output #0: loss = 2.35476 (* 1 = 2.35476 loss)
- I1231 10:00:19.260478 3750 sgd_solver.cpp:106] Iteration 340, lr = 0.01
- I1231 10:00:21.892527 3750 solver.cpp:228] Iteration 360, loss = 2.38624
- I1231 10:00:21.892549 3750 solver.cpp:244] Train net output #0: loss = 2.38624 (* 1 = 2.38624 loss)
- I1231 10:00:21.892554 3750 sgd_solver.cpp:106] Iteration 360, lr = 0.01
- I1231 10:00:24.523481 3750 solver.cpp:228] Iteration 380, loss = 2.28768
- I1231 10:00:24.523502 3750 solver.cpp:244] Train net output #0: loss = 2.28768 (* 1 = 2.28768 loss)
- I1231 10:00:24.523507 3750 sgd_solver.cpp:106] Iteration 380, lr = 0.01
- I1231 10:00:27.156440 3750 solver.cpp:228] Iteration 400, loss = 2.28866
- I1231 10:00:27.156461 3750 solver.cpp:244] Train net output #0: loss = 2.28866 (* 1 = 2.28866 loss)
- I1231 10:00:27.156466 3750 sgd_solver.cpp:106] Iteration 400, lr = 0.01
- I1231 10:00:29.784386 3750 solver.cpp:228] Iteration 420, loss = 2.34239
- I1231 10:00:29.784409 3750 solver.cpp:244] Train net output #0: loss = 2.34239 (* 1 = 2.34239 loss)
- I1231 10:00:29.784415 3750 sgd_solver.cpp:106] Iteration 420, lr = 0.01
- I1231 10:00:32.433159 3750 solver.cpp:228] Iteration 440, loss = 2.31338
- I1231 10:00:32.433181 3750 solver.cpp:244] Train net output #0: loss = 2.31338 (* 1 = 2.31338 loss)
- I1231 10:00:32.433185 3750 sgd_solver.cpp:106] Iteration 440, lr = 0.01
- I1231 10:00:35.096091 3750 solver.cpp:228] Iteration 460, loss = 2.33569
- I1231 10:00:35.096112 3750 solver.cpp:244] Train net output #0: loss = 2.33569 (* 1 = 2.33569 loss)
- I1231 10:00:35.096115 3750 sgd_solver.cpp:106] Iteration 460, lr = 0.01
- I1231 10:00:37.747421 3750 solver.cpp:228] Iteration 480, loss = 2.31552
- I1231 10:00:37.747535 3750 solver.cpp:244] Train net output #0: loss = 2.31552 (* 1 = 2.31552 loss)
- I1231 10:00:37.747541 3750 sgd_solver.cpp:106] Iteration 480, lr = 0.01
- I1231 10:00:40.380364 3750 solver.cpp:228] Iteration 500, loss = 2.34981
- I1231 10:00:40.380388 3750 solver.cpp:244] Train net output #0: loss = 2.34981 (* 1 = 2.34981 loss)
- I1231 10:00:40.380393 3750 sgd_solver.cpp:106] Iteration 500, lr = 0.01
- I1231 10:00:43.006512 3750 solver.cpp:228] Iteration 520, loss = 2.34311
- I1231 10:00:43.006536 3750 solver.cpp:244] Train net output #0: loss = 2.34311 (* 1 = 2.34311 loss)
- I1231 10:00:43.006541 3750 sgd_solver.cpp:106] Iteration 520, lr = 0.01
- I1231 10:00:45.633585 3750 solver.cpp:228] Iteration 540, loss = 2.35487
- I1231 10:00:45.633607 3750 solver.cpp:244] Train net output #0: loss = 2.35487 (* 1 = 2.35487 loss)
- I1231 10:00:45.633612 3750 sgd_solver.cpp:106] Iteration 540, lr = 0.01
- I1231 10:00:48.272758 3750 solver.cpp:228] Iteration 560, loss = 2.32368
- I1231 10:00:48.272781 3750 solver.cpp:244] Train net output #0: loss = 2.32368 (* 1 = 2.32368 loss)
- I1231 10:00:48.272785 3750 sgd_solver.cpp:106] Iteration 560, lr = 0.01
- I1231 10:00:50.903023 3750 solver.cpp:228] Iteration 580, loss = 2.35437
- I1231 10:00:50.903046 3750 solver.cpp:244] Train net output #0: loss = 2.35437 (* 1 = 2.35437 loss)
- I1231 10:00:50.903051 3750 sgd_solver.cpp:106] Iteration 580, lr = 0.01
- I1231 10:00:53.531841 3750 solver.cpp:228] Iteration 600, loss = 2.3758
- I1231 10:00:53.531873 3750 solver.cpp:244] Train net output #0: loss = 2.3758 (* 1 = 2.3758 loss)
- I1231 10:00:53.531880 3750 sgd_solver.cpp:106] Iteration 600, lr = 0.01
- I1231 10:00:56.163364 3750 solver.cpp:228] Iteration 620, loss = 2.33631
- I1231 10:00:56.163386 3750 solver.cpp:244] Train net output #0: loss = 2.33631 (* 1 = 2.33631 loss)
- I1231 10:00:56.163391 3750 sgd_solver.cpp:106] Iteration 620, lr = 0.01
- I1231 10:00:58.801231 3750 solver.cpp:228] Iteration 640, loss = 2.35923
- I1231 10:00:58.801256 3750 solver.cpp:244] Train net output #0: loss = 2.35923 (* 1 = 2.35923 loss)
- I1231 10:00:58.801261 3750 sgd_solver.cpp:106] Iteration 640, lr = 0.01
- I1231 10:01:01.437595 3750 solver.cpp:228] Iteration 660, loss = 2.35392
- I1231 10:01:01.437618 3750 solver.cpp:244] Train net output #0: loss = 2.35392 (* 1 = 2.35392 loss)
- I1231 10:01:01.437621 3750 sgd_solver.cpp:106] Iteration 660, lr = 0.01
- I1231 10:01:04.066489 3750 solver.cpp:228] Iteration 680, loss = 2.35177
- I1231 10:01:04.066514 3750 solver.cpp:244] Train net output #0: loss = 2.35177 (* 1 = 2.35177 loss)
- I1231 10:01:04.066519 3750 sgd_solver.cpp:106] Iteration 680, lr = 0.01
- I1231 10:01:06.700254 3750 solver.cpp:228] Iteration 700, loss = 2.38331
- I1231 10:01:06.700275 3750 solver.cpp:244] Train net output #0: loss = 2.38331 (* 1 = 2.38331 loss)
- I1231 10:01:06.700281 3750 sgd_solver.cpp:106] Iteration 700, lr = 0.01
- I1231 10:01:09.336349 3750 solver.cpp:228] Iteration 720, loss = 2.27814
- I1231 10:01:09.336441 3750 solver.cpp:244] Train net output #0: loss = 2.27814 (* 1 = 2.27814 loss)
- I1231 10:01:09.336447 3750 sgd_solver.cpp:106] Iteration 720, lr = 0.01
- I1231 10:01:11.968747 3750 solver.cpp:228] Iteration 740, loss = 2.33237
- I1231 10:01:11.968767 3750 solver.cpp:244] Train net output #0: loss = 2.33237 (* 1 = 2.33237 loss)
- I1231 10:01:11.968770 3750 sgd_solver.cpp:106] Iteration 740, lr = 0.01
- I1231 10:01:14.601928 3750 solver.cpp:228] Iteration 760, loss = 2.32255
- I1231 10:01:14.601950 3750 solver.cpp:244] Train net output #0: loss = 2.32255 (* 1 = 2.32255 loss)
- I1231 10:01:14.601955 3750 sgd_solver.cpp:106] Iteration 760, lr = 0.01
- I1231 10:01:17.236897 3750 solver.cpp:228] Iteration 780, loss = 2.39546
- I1231 10:01:17.236919 3750 solver.cpp:244] Train net output #0: loss = 2.39546 (* 1 = 2.39546 loss)
- I1231 10:01:17.236924 3750 sgd_solver.cpp:106] Iteration 780, lr = 0.01
- I1231 10:01:19.873028 3750 solver.cpp:228] Iteration 800, loss = 2.34442
- I1231 10:01:19.873051 3750 solver.cpp:244] Train net output #0: loss = 2.34442 (* 1 = 2.34442 loss)
- I1231 10:01:19.873056 3750 sgd_solver.cpp:106] Iteration 800, lr = 0.01
- I1231 10:01:22.513250 3750 solver.cpp:228] Iteration 820, loss = 2.35183
- I1231 10:01:22.513273 3750 solver.cpp:244] Train net output #0: loss = 2.35183 (* 1 = 2.35183 loss)
- I1231 10:01:22.513278 3750 sgd_solver.cpp:106] Iteration 820, lr = 0.01
- I1231 10:01:25.159651 3750 solver.cpp:228] Iteration 840, loss = 2.31314
- I1231 10:01:25.159672 3750 solver.cpp:244] Train net output #0: loss = 2.31314 (* 1 = 2.31314 loss)
- I1231 10:01:25.159677 3750 sgd_solver.cpp:106] Iteration 840, lr = 0.01
- I1231 10:01:27.799268 3750 solver.cpp:228] Iteration 860, loss = 2.34888
- I1231 10:01:27.799293 3750 solver.cpp:244] Train net output #0: loss = 2.34888 (* 1 = 2.34888 loss)
- I1231 10:01:27.799298 3750 sgd_solver.cpp:106] Iteration 860, lr = 0.01
- I1231 10:01:30.440199 3750 solver.cpp:228] Iteration 880, loss = 2.35703
- I1231 10:01:30.440222 3750 solver.cpp:244] Train net output #0: loss = 2.35703 (* 1 = 2.35703 loss)
- I1231 10:01:30.440224 3750 sgd_solver.cpp:106] Iteration 880, lr = 0.01
- I1231 10:01:33.077867 3750 solver.cpp:228] Iteration 900, loss = 2.27924
- I1231 10:01:33.077890 3750 solver.cpp:244] Train net output #0: loss = 2.27924 (* 1 = 2.27924 loss)
- I1231 10:01:33.077895 3750 sgd_solver.cpp:106] Iteration 900, lr = 0.01
- I1231 10:01:35.712937 3750 solver.cpp:228] Iteration 920, loss = 2.32514
- I1231 10:01:35.712960 3750 solver.cpp:244] Train net output #0: loss = 2.32514 (* 1 = 2.32514 loss)
- I1231 10:01:35.712965 3750 sgd_solver.cpp:106] Iteration 920, lr = 0.01
- I1231 10:01:38.350399 3750 solver.cpp:228] Iteration 940, loss = 2.29946
- I1231 10:01:38.350424 3750 solver.cpp:244] Train net output #0: loss = 2.29946 (* 1 = 2.29946 loss)
- I1231 10:01:38.350427 3750 sgd_solver.cpp:106] Iteration 940, lr = 0.01
- I1231 10:01:40.988335 3750 solver.cpp:228] Iteration 960, loss = 2.34165
- I1231 10:01:40.988406 3750 solver.cpp:244] Train net output #0: loss = 2.34165 (* 1 = 2.34165 loss)
- I1231 10:01:40.988411 3750 sgd_solver.cpp:106] Iteration 960, lr = 0.01
- I1231 10:01:43.627944 3750 solver.cpp:228] Iteration 980, loss = 2.33231
- I1231 10:01:43.627971 3750 solver.cpp:244] Train net output #0: loss = 2.33231 (* 1 = 2.33231 loss)
- I1231 10:01:43.627979 3750 sgd_solver.cpp:106] Iteration 980, lr = 0.01
- I1231 10:01:46.134196 3750 solver.cpp:454] Snapshotting to binary proto file models/bvlc_alexnet/caffe_alexnet_train_iter_1000.caffemodel
- I1231 10:01:47.144084 3750 sgd_solver.cpp:273] Snapshotting solver state to binary proto file models/bvlc_alexnet/caffe_alexnet_train_iter_1000.solverstate
- I1231 10:01:47.574143 3750 solver.cpp:337] Iteration 1000, Testing net (#0)
- I1231 10:02:56.518460 3750 solver.cpp:404] Test net output #0: accuracy = 0.1028
- I1231 10:02:56.518558 3750 solver.cpp:404] Test net output #1: loss = 2.31395 (* 1 = 2.31395 loss)
- I1231 10:02:56.601646 3750 solver.cpp:228] Iteration 1000, loss = 2.34483
- I1231 10:02:56.601668 3750 solver.cpp:244] Train net output #0: loss = 2.34483 (* 1 = 2.34483 loss)
- I1231 10:02:56.601673 3750 sgd_solver.cpp:106] Iteration 1000, lr = 0.001
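The lr drop from 0.01 to 0.001 at iteration 1000 follows from the solver settings shown at the top of the log (lr_policy: "step", base_lr: 0.01, gamma: 0.1, stepsize: 1000). A minimal sketch of that schedule, assuming Caffe's documented step rule lr = base_lr * gamma^(floor(iter / stepsize)):

```python
# Sketch of the "step" learning-rate policy with the solver's parameters
# (base_lr=0.01, gamma=0.1, stepsize=1000): lr decays by gamma every
# stepsize iterations.
def step_lr(iteration, base_lr=0.01, gamma=0.1, stepsize=1000):
    return base_lr * gamma ** (iteration // stepsize)

print(step_lr(0))     # 0.01, as logged for iterations 0-980
print(step_lr(1000))  # 0.001, as logged from iteration 1000 on
```

Note that the test accuracy of 0.1028 at this snapshot is essentially chance on 10 classes, so the lr drop here is not rescuing a converging run.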
- I1231 10:02:59.287474 3750 solver.cpp:228] Iteration 1020, loss = 2.37257
- I1231 10:02:59.287495 3750 solver.cpp:244] Train net output #0: loss = 2.37257 (* 1 = 2.37257 loss)
- I1231 10:02:59.287499 3750 sgd_solver.cpp:106] Iteration 1020, lr = 0.001
- I1231 10:03:01.971499 3750 solver.cpp:228] Iteration 1040, loss = 2.3016
- I1231 10:03:01.971521 3750 solver.cpp:244] Train net output #0: loss = 2.3016 (* 1 = 2.3016 loss)
- I1231 10:03:01.971525 3750 sgd_solver.cpp:106] Iteration 1040, lr = 0.001
- I1231 10:03:04.606308 3750 solver.cpp:228] Iteration 1060, loss = 2.354
- I1231 10:03:04.606333 3750 solver.cpp:244] Train net output #0: loss = 2.354 (* 1 = 2.354 loss)
- I1231 10:03:04.606336 3750 sgd_solver.cpp:106] Iteration 1060, lr = 0.001
- I1231 10:03:07.306514 3750 solver.cpp:228] Iteration 1080, loss = 2.34248
- I1231 10:03:07.306541 3750 solver.cpp:244] Train net output #0: loss = 2.34248 (* 1 = 2.34248 loss)
- I1231 10:03:07.306547 3750 sgd_solver.cpp:106] Iteration 1080, lr = 0.001
- I1231 10:03:10.036783 3750 solver.cpp:228] Iteration 1100, loss = 2.30048
- I1231 10:03:10.036805 3750 solver.cpp:244] Train net output #0: loss = 2.30048 (* 1 = 2.30048 loss)
- I1231 10:03:10.036809 3750 sgd_solver.cpp:106] Iteration 1100, lr = 0.001
- I1231 10:03:12.784339 3750 solver.cpp:228] Iteration 1120, loss = 2.34276
- I1231 10:03:12.784364 3750 solver.cpp:244] Train net output #0: loss = 2.34276 (* 1 = 2.34276 loss)
- I1231 10:03:12.784370 3750 sgd_solver.cpp:106] Iteration 1120, lr = 0.001
- I1231 10:03:15.536545 3750 solver.cpp:228] Iteration 1140, loss = 2.31621
- I1231 10:03:15.536567 3750 solver.cpp:244] Train net output #0: loss = 2.31621 (* 1 = 2.31621 loss)
- I1231 10:03:15.536572 3750 sgd_solver.cpp:106] Iteration 1140, lr = 0.001
- I1231 10:03:18.267773 3750 solver.cpp:228] Iteration 1160, loss = 2.31255
- I1231 10:03:18.267798 3750 solver.cpp:244] Train net output #0: loss = 2.31255 (* 1 = 2.31255 loss)
- I1231 10:03:18.267802 3750 sgd_solver.cpp:106] Iteration 1160, lr = 0.001
- I1231 10:03:21.012823 3750 solver.cpp:228] Iteration 1180, loss = 2.30417
- I1231 10:03:21.012845 3750 solver.cpp:244] Train net output #0: loss = 2.30417 (* 1 = 2.30417 loss)
- I1231 10:03:21.012851 3750 sgd_solver.cpp:106] Iteration 1180, lr = 0.001
- I1231 10:03:23.759482 3750 solver.cpp:228] Iteration 1200, loss = 2.36573
- I1231 10:03:23.759505 3750 solver.cpp:244] Train net output #0: loss = 2.36573 (* 1 = 2.36573 loss)
- I1231 10:03:23.759510 3750 sgd_solver.cpp:106] Iteration 1200, lr = 0.001
- I1231 10:03:26.480738 3750 solver.cpp:228] Iteration 1220, loss = 2.33764
- I1231 10:03:26.480762 3750 solver.cpp:244] Train net output #0: loss = 2.33764 (* 1 = 2.33764 loss)
- I1231 10:03:26.480765 3750 sgd_solver.cpp:106] Iteration 1220, lr = 0.001
- I1231 10:03:29.201123 3750 solver.cpp:228] Iteration 1240, loss = 2.30219
- I1231 10:03:29.201222 3750 solver.cpp:244] Train net output #0: loss = 2.30219 (* 1 = 2.30219 loss)
- I1231 10:03:29.201227 3750 sgd_solver.cpp:106] Iteration 1240, lr = 0.001
- I1231 10:03:31.956243 3750 solver.cpp:228] Iteration 1260, loss = 2.28197
- I1231 10:03:31.956265 3750 solver.cpp:244] Train net output #0: loss = 2.28197 (* 1 = 2.28197 loss)
- I1231 10:03:31.956269 3750 sgd_solver.cpp:106] Iteration 1260, lr = 0.001
- I1231 10:03:34.732066 3750 solver.cpp:228] Iteration 1280, loss = 2.28547
- I1231 10:03:34.732089 3750 solver.cpp:244] Train net output #0: loss = 2.28547 (* 1 = 2.28547 loss)
- I1231 10:03:34.732092 3750 sgd_solver.cpp:106] Iteration 1280, lr = 0.001
- I1231 10:03:37.533429 3750 solver.cpp:228] Iteration 1300, loss = 2.30908
- I1231 10:03:37.533452 3750 solver.cpp:244] Train net output #0: loss = 2.30908 (* 1 = 2.30908 loss)
- I1231 10:03:37.533457 3750 sgd_solver.cpp:106] Iteration 1300, lr = 0.001
- I1231 10:03:40.328743 3750 solver.cpp:228] Iteration 1320, loss = 2.3286
- I1231 10:03:40.328765 3750 solver.cpp:244] Train net output #0: loss = 2.3286 (* 1 = 2.3286 loss)
- I1231 10:03:40.328770 3750 sgd_solver.cpp:106] Iteration 1320, lr = 0.001
- I1231 10:03:43.134333 3750 solver.cpp:228] Iteration 1340, loss = 2.30516
- I1231 10:03:43.134354 3750 solver.cpp:244] Train net output #0: loss = 2.30516 (* 1 = 2.30516 loss)
- I1231 10:03:43.134359 3750 sgd_solver.cpp:106] Iteration 1340, lr = 0.001
- I1231 10:03:45.942070 3750 solver.cpp:228] Iteration 1360, loss = 2.3061
- I1231 10:03:45.942091 3750 solver.cpp:244] Train net output #0: loss = 2.3061 (* 1 = 2.3061 loss)
- I1231 10:03:45.942096 3750 sgd_solver.cpp:106] Iteration 1360, lr = 0.001
- I1231 10:03:48.709724 3750 solver.cpp:228] Iteration 1380, loss = 2.28462
- I1231 10:03:48.709746 3750 solver.cpp:244] Train net output #0: loss = 2.28462 (* 1 = 2.28462 loss)
- I1231 10:03:48.709750 3750 sgd_solver.cpp:106] Iteration 1380, lr = 0.001
- I1231 10:03:51.422637 3750 solver.cpp:228] Iteration 1400, loss = 2.35386
- I1231 10:03:51.422660 3750 solver.cpp:244] Train net output #0: loss = 2.35386 (* 1 = 2.35386 loss)
- I1231 10:03:51.422664 3750 sgd_solver.cpp:106] Iteration 1400, lr = 0.001
- I1231 10:03:54.144119 3750 solver.cpp:228] Iteration 1420, loss = 2.31562
- I1231 10:03:54.144141 3750 solver.cpp:244] Train net output #0: loss = 2.31562 (* 1 = 2.31562 loss)
- I1231 10:03:54.144145 3750 sgd_solver.cpp:106] Iteration 1420, lr = 0.001
- I1231 10:03:56.881331 3750 solver.cpp:228] Iteration 1440, loss = 2.33559
- I1231 10:03:56.881353 3750 solver.cpp:244] Train net output #0: loss = 2.33559 (* 1 = 2.33559 loss)
- I1231 10:03:56.881358 3750 sgd_solver.cpp:106] Iteration 1440, lr = 0.001
- I1231 10:03:59.589541 3750 solver.cpp:228] Iteration 1460, loss = 2.28929
- I1231 10:03:59.589658 3750 solver.cpp:244] Train net output #0: loss = 2.28929 (* 1 = 2.28929 loss)
- I1231 10:03:59.589663 3750 sgd_solver.cpp:106] Iteration 1460, lr = 0.001
- I1231 10:04:02.302563 3750 solver.cpp:228] Iteration 1480, loss = 2.30321
- I1231 10:04:02.302585 3750 solver.cpp:244] Train net output #0: loss = 2.30321 (* 1 = 2.30321 loss)
- I1231 10:04:02.302588 3750 sgd_solver.cpp:106] Iteration 1480, lr = 0.001
- I1231 10:04:04.999634 3750 solver.cpp:228] Iteration 1500, loss = 2.31914
- I1231 10:04:04.999655 3750 solver.cpp:244] Train net output #0: loss = 2.31914 (* 1 = 2.31914 loss)
- I1231 10:04:04.999660 3750 sgd_solver.cpp:106] Iteration 1500, lr = 0.001
- I1231 10:04:07.734257 3750 solver.cpp:228] Iteration 1520, loss = 2.36144
- I1231 10:04:07.734282 3750 solver.cpp:244] Train net output #0: loss = 2.36144 (* 1 = 2.36144 loss)
- I1231 10:04:07.734287 3750 sgd_solver.cpp:106] Iteration 1520, lr = 0.001
- I1231 10:04:10.445163 3750 solver.cpp:228] Iteration 1540, loss = 2.36364
- I1231 10:04:10.445186 3750 solver.cpp:244] Train net output #0: loss = 2.36364 (* 1 = 2.36364 loss)
- I1231 10:04:10.445190 3750 sgd_solver.cpp:106] Iteration 1540, lr = 0.001
- I1231 10:04:13.147294 3750 solver.cpp:228] Iteration 1560, loss = 2.30884
- I1231 10:04:13.147315 3750 solver.cpp:244] Train net output #0: loss = 2.30884 (* 1 = 2.30884 loss)
- I1231 10:04:13.147320 3750 sgd_solver.cpp:106] Iteration 1560, lr = 0.001
- I1231 10:04:15.847362 3750 solver.cpp:228] Iteration 1580, loss = 2.33459
- I1231 10:04:15.847383 3750 solver.cpp:244] Train net output #0: loss = 2.33459 (* 1 = 2.33459 loss)
- I1231 10:04:15.847386 3750 sgd_solver.cpp:106] Iteration 1580, lr = 0.001
- I1231 10:04:18.543267 3750 solver.cpp:228] Iteration 1600, loss = 2.31264
- I1231 10:04:18.543288 3750 solver.cpp:244] Train net output #0: loss = 2.31264 (* 1 = 2.31264 loss)
- I1231 10:04:18.543293 3750 sgd_solver.cpp:106] Iteration 1600, lr = 0.001
- I1231 10:04:21.242331 3750 solver.cpp:228] Iteration 1620, loss = 2.35213
- I1231 10:04:21.242353 3750 solver.cpp:244] Train net output #0: loss = 2.35213 (* 1 = 2.35213 loss)
- I1231 10:04:21.242357 3750 sgd_solver.cpp:106] Iteration 1620, lr = 0.001
- I1231 10:04:23.938932 3750 solver.cpp:228] Iteration 1640, loss = 2.30634
- I1231 10:04:23.938957 3750 solver.cpp:244] Train net output #0: loss = 2.30634 (* 1 = 2.30634 loss)
- I1231 10:04:23.938963 3750 sgd_solver.cpp:106] Iteration 1640, lr = 0.001
- I1231 10:04:26.635504 3750 solver.cpp:228] Iteration 1660, loss = 2.2856
- I1231 10:04:26.635525 3750 solver.cpp:244] Train net output #0: loss = 2.2856 (* 1 = 2.2856 loss)
- I1231 10:04:26.635529 3750 sgd_solver.cpp:106] Iteration 1660, lr = 0.001
- I1231 10:04:29.332518 3750 solver.cpp:228] Iteration 1680, loss = 2.27774
- I1231 10:04:29.332541 3750 solver.cpp:244] Train net output #0: loss = 2.27774 (* 1 = 2.27774 loss)
- I1231 10:04:29.332545 3750 sgd_solver.cpp:106] Iteration 1680, lr = 0.001
- I1231 10:04:32.028904 3750 solver.cpp:228] Iteration 1700, loss = 2.34021
- I1231 10:04:32.029019 3750 solver.cpp:244] Train net output #0: loss = 2.34021 (* 1 = 2.34021 loss)
- I1231 10:04:32.029024 3750 sgd_solver.cpp:106] Iteration 1700, lr = 0.001
- I1231 10:04:34.727354 3750 solver.cpp:228] Iteration 1720, loss = 2.34049
- I1231 10:04:34.727377 3750 solver.cpp:244] Train net output #0: loss = 2.34049 (* 1 = 2.34049 loss)
- I1231 10:04:34.727382 3750 sgd_solver.cpp:106] Iteration 1720, lr = 0.001
- I1231 10:04:37.425602 3750 solver.cpp:228] Iteration 1740, loss = 2.32911
- I1231 10:04:37.425624 3750 solver.cpp:244] Train net output #0: loss = 2.32911 (* 1 = 2.32911 loss)
- I1231 10:04:37.425629 3750 sgd_solver.cpp:106] Iteration 1740, lr = 0.001
- I1231 10:04:40.125151 3750 solver.cpp:228] Iteration 1760, loss = 2.31598
- I1231 10:04:40.125174 3750 solver.cpp:244] Train net output #0: loss = 2.31598 (* 1 = 2.31598 loss)
- I1231 10:04:40.125177 3750 sgd_solver.cpp:106] Iteration 1760, lr = 0.001
- I1231 10:04:42.824857 3750 solver.cpp:228] Iteration 1780, loss = 2.33156
- I1231 10:04:42.824879 3750 solver.cpp:244] Train net output #0: loss = 2.33156 (* 1 = 2.33156 loss)
- I1231 10:04:42.824883 3750 sgd_solver.cpp:106] Iteration 1780, lr = 0.001
- I1231 10:04:45.522495 3750 solver.cpp:228] Iteration 1800, loss = 2.34056
- I1231 10:04:45.522517 3750 solver.cpp:244] Train net output #0: loss = 2.34056 (* 1 = 2.34056 loss)
- I1231 10:04:45.522521 3750 sgd_solver.cpp:106] Iteration 1800, lr = 0.001
- I1231 10:04:48.254500 3750 solver.cpp:228] Iteration 1820, loss = 2.30518
- I1231 10:04:48.254523 3750 solver.cpp:244] Train net output #0: loss = 2.30518 (* 1 = 2.30518 loss)
- I1231 10:04:48.254528 3750 sgd_solver.cpp:106] Iteration 1820, lr = 0.001
- I1231 10:04:50.992200 3750 solver.cpp:228] Iteration 1840, loss = 2.32255
- I1231 10:04:50.992223 3750 solver.cpp:244] Train net output #0: loss = 2.32255 (* 1 = 2.32255 loss)
- I1231 10:04:50.992226 3750 sgd_solver.cpp:106] Iteration 1840, lr = 0.001
- I1231 10:04:53.703897 3750 solver.cpp:228] Iteration 1860, loss = 2.30668
- I1231 10:04:53.703946 3750 solver.cpp:244] Train net output #0: loss = 2.30668 (* 1 = 2.30668 loss)
- I1231 10:04:53.703953 3750 sgd_solver.cpp:106] Iteration 1860, lr = 0.001
- I1231 10:04:56.405320 3750 solver.cpp:228] Iteration 1880, loss = 2.29324
- I1231 10:04:56.405341 3750 solver.cpp:244] Train net output #0: loss = 2.29324 (* 1 = 2.29324 loss)
- I1231 10:04:56.405346 3750 sgd_solver.cpp:106] Iteration 1880, lr = 0.001
- I1231 10:04:59.104897 3750 solver.cpp:228] Iteration 1900, loss = 2.29406
- I1231 10:04:59.104918 3750 solver.cpp:244] Train net output #0: loss = 2.29406 (* 1 = 2.29406 loss)
- I1231 10:04:59.104923 3750 sgd_solver.cpp:106] Iteration 1900, lr = 0.001
- I1231 10:05:01.806949 3750 solver.cpp:228] Iteration 1920, loss = 2.30289
- I1231 10:05:01.806970 3750 solver.cpp:244] Train net output #0: loss = 2.30289 (* 1 = 2.30289 loss)
- I1231 10:05:01.806974 3750 sgd_solver.cpp:106] Iteration 1920, lr = 0.001
- I1231 10:05:04.504257 3750 solver.cpp:228] Iteration 1940, loss = 2.31311
- I1231 10:05:04.504365 3750 solver.cpp:244] Train net output #0: loss = 2.31311 (* 1 = 2.31311 loss)
- I1231 10:05:04.504372 3750 sgd_solver.cpp:106] Iteration 1940, lr = 0.001
- I1231 10:05:07.205976 3750 solver.cpp:228] Iteration 1960, loss = 2.30893
- I1231 10:05:07.205998 3750 solver.cpp:244] Train net output #0: loss = 2.30893 (* 1 = 2.30893 loss)
- I1231 10:05:07.206003 3750 sgd_solver.cpp:106] Iteration 1960, lr = 0.001
- I1231 10:05:09.903568 3750 solver.cpp:228] Iteration 1980, loss = 2.2799
- I1231 10:05:09.903589 3750 solver.cpp:244] Train net output #0: loss = 2.2799 (* 1 = 2.2799 loss)
- I1231 10:05:09.903594 3750 sgd_solver.cpp:106] Iteration 1980, lr = 0.001
- I1231 10:05:12.469777 3750 solver.cpp:454] Snapshotting to binary proto file models/bvlc_alexnet/caffe_alexnet_train_iter_2000.caffemodel
- I1231 10:05:13.320540 3750 sgd_solver.cpp:273] Snapshotting solver state to binary proto file models/bvlc_alexnet/caffe_alexnet_train_iter_2000.solverstate
- I1231 10:05:13.880272 3750 solver.cpp:317] Iteration 2000, loss = 2.34475
- I1231 10:05:13.880293 3750 solver.cpp:337] Iteration 2000, Testing net (#0)
- I1231 10:06:23.795788 3750 solver.cpp:404] Test net output #0: accuracy = 0.1135
- I1231 10:06:23.795878 3750 solver.cpp:404] Test net output #1: loss = 2.3047 (* 1 = 2.3047 loss)
- I1231 10:06:23.795884 3750 solver.cpp:322] Optimization Done.
- I1231 10:06:23.795887 3750 caffe.cpp:254] Optimization Done.
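Two quick sanity checks make this log easier to interpret (a sketch added for explanation, not part of the original run): the `lr = 0.001` lines follow from the solver's `step` policy (`base_lr: 0.01`, `gamma: 0.1`, `stepsize: 1000`), and the final numbers show the net never learned: a softmax loss stuck near 2.30 is ln(10), the loss of a uniform guess over MNIST's 10 classes, and 0.1135 accuracy is essentially the 1/10 chance level.

```python
import math

def step_lr(base_lr, gamma, stepsize, iteration):
    """Caffe's "step" lr_policy: base_lr * gamma^(floor(iter / stepsize))."""
    return base_lr * gamma ** (iteration // stepsize)

# Solver settings from the log above.
lr = step_lr(base_lr=0.01, gamma=0.1, stepsize=1000, iteration=1700)
print(lr)  # ≈ 0.001, matching the "lr = 0.001" lines past iteration 1000

# Uniform-guess cross-entropy on 10 classes: ln(10) ≈ 2.3026,
# which is where the training loss sits for the whole run.
print(math.log(10))
```

Given those numbers, the run likely diverged early (common for AlexNet-sized nets fed 28×28 MNIST with this base_lr and the 3×3/stride-1 conv1 shown above); lowering `base_lr` or using the stock LeNet solver would be the usual next step.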