I1227 01:35:41.192610 3821 caffe.cpp:217] Using GPUs 0
I1227 01:35:41.256325 3821 caffe.cpp:222] GPU 0: GeForce GTX 1070
I1227 01:35:41.766284 3821 solver.cpp:48] Initializing solver from parameters:
test_iter: 1000
test_interval: 1000
base_lr: 0.01
display: 20
max_iter: 450000
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 100000
snapshot: 10000
snapshot_prefix: "models/bvlc_alexnet/caffe_alexnet_train"
solver_mode: GPU
device_id: 0
net: "models/bvlc_alexnet/train_valmnist.prototxt"
train_state {
  level: 0
  stage: ""
}
I1227 01:35:41.766728 3821 solver.cpp:91] Creating training net from net file: models/bvlc_alexnet/train_valmnist.prototxt
I1227 01:35:41.768218 3821 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I1227 01:35:41.768262 3821 net.cpp:322] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I1227 01:35:41.768586 3821 net.cpp:58] Initializing net from parameters:
name: "AlexNet"
state {
  phase: TRAIN
  level: 0
  stage: ""
}
layer {
  name: "data"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/mnist/mnist_train_lmdb"
    batch_size: 64
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "norm1"
  type: "LRN"
  bottom: "conv1"
  top: "norm1"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "norm1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu2"
  type: "ReLU"
  bottom: "conv2"
  top: "conv2"
}
layer {
  name: "norm2"
  type: "LRN"
  bottom: "conv2"
  top: "norm2"
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "norm2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "conv3"
  type: "Convolution"
  bottom: "pool2"
  top: "conv3"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "relu3"
  type: "ReLU"
  bottom: "conv3"
  top: "conv3"
}
layer {
  name: "conv4"
  type: "Convolution"
  bottom: "conv3"
  top: "conv4"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu4"
  type: "ReLU"
  bottom: "conv4"
  top: "conv4"
}
layer {
  name: "conv5"
  type: "Convolution"
  bottom: "conv4"
  top: "conv5"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu5"
  type: "ReLU"
  bottom: "conv5"
  top: "conv5"
}
layer {
  name: "pool5"
  type: "Pooling"
  bottom: "conv5"
  top: "pool5"
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layer {
  name: "fc6"
  type: "InnerProduct"
  bottom: "pool5"
  top: "fc6"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu6"
  type: "ReLU"
  bottom: "fc6"
  top: "fc6"
}
layer {
  name: "drop6"
  type: "Dropout"
  bottom: "fc6"
  top: "fc6"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc7"
  type: "InnerProduct"
  bottom: "fc6"
  top: "fc7"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 0.1
    }
  }
}
layer {
  name: "relu7"
  type: "ReLU"
  bottom: "fc7"
  top: "fc7"
}
layer {
  name: "drop7"
  type: "Dropout"
  bottom: "fc7"
  top: "fc7"
  dropout_param {
    dropout_ratio: 0.5
  }
}
layer {
  name: "fc8"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8"
  param {
    lr_mult: 1
    decay_mult: 1
  }
  param {
    lr_mult: 2
    decay_mult: 0
  }
  inner_product_param {
    num_output: 1000
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc8"
  bottom: "label"
  top: "loss"
}
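The net above is the stock AlexNet topology, but its data layer feeds it 28x28 MNIST images rather than the roughly 227x227 crops that AlexNet's kernel sizes and strides assume. The shrinkage can be traced with Caffe's output-size arithmetic (convolution rounds down, pooling rounds up); this plain-Python sketch (helper names are mine, not Caffe API) reproduces the per-layer spatial sizes the setup log reports:

```python
import math

def conv_out(h, k, s=1, p=0):
    # Caffe convolution output size: floor((h + 2p - k) / s) + 1
    return (h + 2 * p - k) // s + 1

def pool_out(h, k, s=1, p=0):
    # Caffe pooling output size: ceil((h + 2p - k) / s) + 1
    return math.ceil((h + 2 * p - k) / s) + 1

h = 28                      # MNIST input is 28x28, not AlexNet's 227x227
h = conv_out(h, k=11, s=4)  # conv1 -> 5
h = pool_out(h, k=3, s=2)   # pool1 -> 2
h = conv_out(h, k=5, p=2)   # conv2 -> 2
h = pool_out(h, k=3, s=2)   # pool2 -> 1
h = conv_out(h, k=3, p=1)   # conv3, conv4, conv5 all keep 1
h = pool_out(h, k=3, s=2)   # pool5 -> 0
print(h)  # 0
```

By pool5 the spatial extent is already zero, which is exactly what the layer-setup log below records before the crash.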
I1227 01:35:41.769074 3821 layer_factory.hpp:77] Creating layer data
I1227 01:35:41.770401 3821 net.cpp:100] Creating Layer data
I1227 01:35:41.770426 3821 net.cpp:408] data -> data
I1227 01:35:41.770462 3821 net.cpp:408] data -> label
I1227 01:35:41.772739 3830 db_lmdb.cpp:35] Opened lmdb examples/mnist/mnist_train_lmdb
I1227 01:35:41.854802 3821 data_layer.cpp:41] output data size: 64,1,28,28
I1227 01:35:41.856761 3821 net.cpp:150] Setting up data
I1227 01:35:41.856784 3821 net.cpp:157] Top shape: 64 1 28 28 (50176)
I1227 01:35:41.856791 3821 net.cpp:157] Top shape: 64 (64)
I1227 01:35:41.856796 3821 net.cpp:165] Memory required for data: 200960
I1227 01:35:41.856808 3821 layer_factory.hpp:77] Creating layer conv1
I1227 01:35:41.856842 3821 net.cpp:100] Creating Layer conv1
I1227 01:35:41.856850 3821 net.cpp:434] conv1 <- data
I1227 01:35:41.856869 3821 net.cpp:408] conv1 -> conv1
I1227 01:35:41.858649 3821 net.cpp:150] Setting up conv1
I1227 01:35:41.858671 3821 net.cpp:157] Top shape: 64 96 5 5 (153600)
I1227 01:35:41.858676 3821 net.cpp:165] Memory required for data: 815360
I1227 01:35:41.858700 3821 layer_factory.hpp:77] Creating layer relu1
I1227 01:35:41.858716 3821 net.cpp:100] Creating Layer relu1
I1227 01:35:41.858721 3821 net.cpp:434] relu1 <- conv1
I1227 01:35:41.858731 3821 net.cpp:395] relu1 -> conv1 (in-place)
I1227 01:35:41.858742 3821 net.cpp:150] Setting up relu1
I1227 01:35:41.858752 3821 net.cpp:157] Top shape: 64 96 5 5 (153600)
I1227 01:35:41.858757 3821 net.cpp:165] Memory required for data: 1429760
I1227 01:35:41.858763 3821 layer_factory.hpp:77] Creating layer norm1
I1227 01:35:41.858772 3821 net.cpp:100] Creating Layer norm1
I1227 01:35:41.858783 3821 net.cpp:434] norm1 <- conv1
I1227 01:35:41.858791 3821 net.cpp:408] norm1 -> norm1
I1227 01:35:41.858846 3821 net.cpp:150] Setting up norm1
I1227 01:35:41.858860 3821 net.cpp:157] Top shape: 64 96 5 5 (153600)
I1227 01:35:41.858866 3821 net.cpp:165] Memory required for data: 2044160
I1227 01:35:41.858872 3821 layer_factory.hpp:77] Creating layer pool1
I1227 01:35:41.858882 3821 net.cpp:100] Creating Layer pool1
I1227 01:35:41.858888 3821 net.cpp:434] pool1 <- norm1
I1227 01:35:41.858897 3821 net.cpp:408] pool1 -> pool1
I1227 01:35:41.858959 3821 net.cpp:150] Setting up pool1
I1227 01:35:41.858969 3821 net.cpp:157] Top shape: 64 96 2 2 (24576)
I1227 01:35:41.858999 3821 net.cpp:165] Memory required for data: 2142464
I1227 01:35:41.859007 3821 layer_factory.hpp:77] Creating layer conv2
I1227 01:35:41.859025 3821 net.cpp:100] Creating Layer conv2
I1227 01:35:41.859030 3821 net.cpp:434] conv2 <- pool1
I1227 01:35:41.859041 3821 net.cpp:408] conv2 -> conv2
I1227 01:35:41.864717 3821 net.cpp:150] Setting up conv2
I1227 01:35:41.864732 3821 net.cpp:157] Top shape: 64 256 2 2 (65536)
I1227 01:35:41.864738 3821 net.cpp:165] Memory required for data: 2404608
I1227 01:35:41.864751 3821 layer_factory.hpp:77] Creating layer relu2
I1227 01:35:41.864761 3821 net.cpp:100] Creating Layer relu2
I1227 01:35:41.864766 3821 net.cpp:434] relu2 <- conv2
I1227 01:35:41.864773 3821 net.cpp:395] relu2 -> conv2 (in-place)
I1227 01:35:41.864783 3821 net.cpp:150] Setting up relu2
I1227 01:35:41.864789 3821 net.cpp:157] Top shape: 64 256 2 2 (65536)
I1227 01:35:41.864794 3821 net.cpp:165] Memory required for data: 2666752
I1227 01:35:41.864799 3821 layer_factory.hpp:77] Creating layer norm2
I1227 01:35:41.864804 3821 net.cpp:100] Creating Layer norm2
I1227 01:35:41.864809 3821 net.cpp:434] norm2 <- conv2
I1227 01:35:41.864817 3821 net.cpp:408] norm2 -> norm2
I1227 01:35:41.864857 3821 net.cpp:150] Setting up norm2
I1227 01:35:41.864864 3821 net.cpp:157] Top shape: 64 256 2 2 (65536)
I1227 01:35:41.864869 3821 net.cpp:165] Memory required for data: 2928896
I1227 01:35:41.864874 3821 layer_factory.hpp:77] Creating layer pool2
I1227 01:35:41.864883 3821 net.cpp:100] Creating Layer pool2
I1227 01:35:41.864888 3821 net.cpp:434] pool2 <- norm2
I1227 01:35:41.864895 3821 net.cpp:408] pool2 -> pool2
I1227 01:35:41.864935 3821 net.cpp:150] Setting up pool2
I1227 01:35:41.864943 3821 net.cpp:157] Top shape: 64 256 1 1 (16384)
I1227 01:35:41.864948 3821 net.cpp:165] Memory required for data: 2994432
I1227 01:35:41.864953 3821 layer_factory.hpp:77] Creating layer conv3
I1227 01:35:41.864964 3821 net.cpp:100] Creating Layer conv3
I1227 01:35:41.864969 3821 net.cpp:434] conv3 <- pool2
I1227 01:35:41.864975 3821 net.cpp:408] conv3 -> conv3
I1227 01:35:41.876174 3821 net.cpp:150] Setting up conv3
I1227 01:35:41.876195 3821 net.cpp:157] Top shape: 64 384 1 1 (24576)
I1227 01:35:41.876199 3821 net.cpp:165] Memory required for data: 3092736
I1227 01:35:41.876211 3821 layer_factory.hpp:77] Creating layer relu3
I1227 01:35:41.876220 3821 net.cpp:100] Creating Layer relu3
I1227 01:35:41.876224 3821 net.cpp:434] relu3 <- conv3
I1227 01:35:41.876230 3821 net.cpp:395] relu3 -> conv3 (in-place)
I1227 01:35:41.876240 3821 net.cpp:150] Setting up relu3
I1227 01:35:41.876245 3821 net.cpp:157] Top shape: 64 384 1 1 (24576)
I1227 01:35:41.876250 3821 net.cpp:165] Memory required for data: 3191040
I1227 01:35:41.876253 3821 layer_factory.hpp:77] Creating layer conv4
I1227 01:35:41.876262 3821 net.cpp:100] Creating Layer conv4
I1227 01:35:41.876267 3821 net.cpp:434] conv4 <- conv3
I1227 01:35:41.876273 3821 net.cpp:408] conv4 -> conv4
I1227 01:35:41.882702 3821 net.cpp:150] Setting up conv4
I1227 01:35:41.882714 3821 net.cpp:157] Top shape: 64 384 1 1 (24576)
I1227 01:35:41.882716 3821 net.cpp:165] Memory required for data: 3289344
I1227 01:35:41.882722 3821 layer_factory.hpp:77] Creating layer relu4
I1227 01:35:41.882728 3821 net.cpp:100] Creating Layer relu4
I1227 01:35:41.882731 3821 net.cpp:434] relu4 <- conv4
I1227 01:35:41.882736 3821 net.cpp:395] relu4 -> conv4 (in-place)
I1227 01:35:41.882742 3821 net.cpp:150] Setting up relu4
I1227 01:35:41.882745 3821 net.cpp:157] Top shape: 64 384 1 1 (24576)
I1227 01:35:41.882748 3821 net.cpp:165] Memory required for data: 3387648
I1227 01:35:41.882752 3821 layer_factory.hpp:77] Creating layer conv5
I1227 01:35:41.882758 3821 net.cpp:100] Creating Layer conv5
I1227 01:35:41.882761 3821 net.cpp:434] conv5 <- conv4
I1227 01:35:41.882766 3821 net.cpp:408] conv5 -> conv5
I1227 01:35:41.886646 3821 net.cpp:150] Setting up conv5
I1227 01:35:41.886656 3821 net.cpp:157] Top shape: 64 256 1 1 (16384)
I1227 01:35:41.886677 3821 net.cpp:165] Memory required for data: 3453184
I1227 01:35:41.886687 3821 layer_factory.hpp:77] Creating layer relu5
I1227 01:35:41.886694 3821 net.cpp:100] Creating Layer relu5
I1227 01:35:41.886698 3821 net.cpp:434] relu5 <- conv5
I1227 01:35:41.886703 3821 net.cpp:395] relu5 -> conv5 (in-place)
I1227 01:35:41.886708 3821 net.cpp:150] Setting up relu5
I1227 01:35:41.886713 3821 net.cpp:157] Top shape: 64 256 1 1 (16384)
I1227 01:35:41.886714 3821 net.cpp:165] Memory required for data: 3518720
I1227 01:35:41.886718 3821 layer_factory.hpp:77] Creating layer pool5
I1227 01:35:41.886721 3821 net.cpp:100] Creating Layer pool5
I1227 01:35:41.886724 3821 net.cpp:434] pool5 <- conv5
I1227 01:35:41.886729 3821 net.cpp:408] pool5 -> pool5
I1227 01:35:41.886754 3821 net.cpp:150] Setting up pool5
I1227 01:35:41.886759 3821 net.cpp:157] Top shape: 64 256 0 0 (0)
I1227 01:35:41.886761 3821 net.cpp:165] Memory required for data: 3518720
I1227 01:35:41.886765 3821 layer_factory.hpp:77] Creating layer fc6
I1227 01:35:41.886772 3821 net.cpp:100] Creating Layer fc6
I1227 01:35:41.886775 3821 net.cpp:434] fc6 <- pool5
I1227 01:35:41.886780 3821 net.cpp:408] fc6 -> fc6
F1227 01:35:41.886826 3821 blob.cpp:115] Check failed: data_
*** Check failure stack trace: ***
    @ 0x7f366ba695cd  google::LogMessage::Fail()
    @ 0x7f366ba6b433  google::LogMessage::SendToLog()
    @ 0x7f366ba6915b  google::LogMessage::Flush()
    @ 0x7f366ba6be1e  google::LogMessageFatal::~LogMessageFatal()
    @ 0x7f366c217d3b  caffe::Blob<>::mutable_cpu_data()
    @ 0x7f366c0f633a  caffe::GaussianFiller<>::Fill()
    @ 0x7f366c0f6e27  caffe::InnerProductLayer<>::LayerSetUp()
    @ 0x7f366c0c17e2  caffe::Net<>::Init()
    @ 0x7f366c0c3071  caffe::Net<>::Net()
    @ 0x7f366c0d44aa  caffe::Solver<>::InitTrainNet()
    @ 0x7f366c0d5817  caffe::Solver<>::Init()
    @ 0x7f366c0d5bba  caffe::Solver<>::Solver()
    @ 0x7f366c20bc83  caffe::Creator_SGDSolver<>()
    @ 0x40afb9  train()
    @ 0x4077c8  main
    @ 0x7f366a200830  __libc_start_main
    @ 0x408099  _start
    @ (nil)  (unknown)
Aborted (core dumped)
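The fatal `Check failed: data_` is a direct consequence of the shape collapse recorded above: pool5's top shape is `64 256 0 0`, so fc6's weight blob would hold 4096 x 0 elements, and `Blob::mutable_cpu_data()` aborts when `GaussianFiller` tries to fill that empty blob during `InnerProductLayer::LayerSetUp()` (visible in the stack trace). AlexNet's kernels and strides assume inputs around 227x227, and 28x28 MNIST digits shrink to nothing by pool5. One common remedy, sketched here as a suggestion rather than the definitive fix, is to train the MNIST-sized LeNet that ships with Caffe instead of pointing the AlexNet solver at the MNIST LMDB:

```
# Solver sketch for MNIST (stock Caffe example path):
net: "examples/mnist/lenet_train_test.prototxt"
```

Alternatively, keep AlexNet but regenerate the LMDB with images upsampled to a size the architecture can actually reduce without reaching zero.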