- WARNING: Logging before InitGoogleLogging() is written to STDERR
- E0331 20:43:48.882966 3129 upgrade_proto.cpp:618] Attempting to upgrade input file specified using deprecated V1LayerParameter: ./model-defs/rcnn_batch_256_output_fc7.prototxt
- I0331 20:43:48.883144 3129 upgrade_proto.cpp:626] Successfully upgraded file specified using deprecated V1LayerParameter
- I0331 20:43:48.883358 3129 net.cpp:42] Initializing net from parameters:
- input: "data"
- input_dim: 256
- input_dim: 3
- input_dim: 227
- input_dim: 227
- state {
- phase: TEST
- }
- layer {
- name: "conv1"
- type: "Convolution"
- bottom: "data"
- top: "conv1"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 96
- kernel_size: 11
- stride: 4
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu1"
- type: "ReLU"
- bottom: "conv1"
- top: "conv1"
- }
- layer {
- name: "pool1"
- type: "Pooling"
- bottom: "conv1"
- top: "pool1"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "norm1"
- type: "LRN"
- bottom: "pool1"
- top: "norm1"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "conv2"
- type: "Convolution"
- bottom: "norm1"
- top: "conv2"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 2
- kernel_size: 5
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 1
- }
- }
- }
- layer {
- name: "relu2"
- type: "ReLU"
- bottom: "conv2"
- top: "conv2"
- }
- layer {
- name: "pool2"
- type: "Pooling"
- bottom: "conv2"
- top: "pool2"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "norm2"
- type: "LRN"
- bottom: "pool2"
- top: "norm2"
- lrn_param {
- local_size: 5
- alpha: 0.0001
- beta: 0.75
- }
- }
- layer {
- name: "conv3"
- type: "Convolution"
- bottom: "norm2"
- top: "conv3"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 0
- }
- }
- }
- layer {
- name: "relu3"
- type: "ReLU"
- bottom: "conv3"
- top: "conv3"
- }
- layer {
- name: "conv4"
- type: "Convolution"
- bottom: "conv3"
- top: "conv4"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 384
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 1
- }
- }
- }
- layer {
- name: "relu4"
- type: "ReLU"
- bottom: "conv4"
- top: "conv4"
- }
- layer {
- name: "conv5"
- type: "Convolution"
- bottom: "conv4"
- top: "conv5"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- convolution_param {
- num_output: 256
- pad: 1
- kernel_size: 3
- group: 2
- weight_filler {
- type: "gaussian"
- std: 0.01
- }
- bias_filler {
- type: "constant"
- value: 1
- }
- }
- }
- layer {
- name: "relu5"
- type: "ReLU"
- bottom: "conv5"
- top: "conv5"
- }
- layer {
- name: "pool5"
- type: "Pooling"
- bottom: "conv5"
- top: "pool5"
- pooling_param {
- pool: MAX
- kernel_size: 3
- stride: 2
- }
- }
- layer {
- name: "fc6"
- type: "InnerProduct"
- bottom: "pool5"
- top: "fc6"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 1
- }
- }
- }
- layer {
- name: "relu6"
- type: "ReLU"
- bottom: "fc6"
- top: "fc6"
- }
- layer {
- name: "fc7"
- type: "InnerProduct"
- bottom: "fc6"
- top: "fc7"
- param {
- lr_mult: 1
- decay_mult: 1
- }
- param {
- lr_mult: 2
- decay_mult: 0
- }
- inner_product_param {
- num_output: 4096
- weight_filler {
- type: "gaussian"
- std: 0.005
- }
- bias_filler {
- type: "constant"
- value: 1
- }
- }
- }
- layer {
- name: "relu7"
- type: "ReLU"
- bottom: "fc7"
- top: "fc7"
- }
- I0331 20:43:48.884116 3129 net.cpp:340] Input 0 -> data
- I0331 20:43:48.891572 3129 layer_factory.hpp:74] Creating layer conv1
- I0331 20:43:48.891630 3129 net.cpp:84] Creating Layer conv1
- I0331 20:43:48.891639 3129 net.cpp:380] conv1 <- data
- I0331 20:43:48.891652 3129 net.cpp:338] conv1 -> conv1
- I0331 20:43:48.891667 3129 net.cpp:113] Setting up conv1
- I0331 20:43:48.910981 3129 net.cpp:120] Top shape: 256 96 55 55 (74342400)
- I0331 20:43:48.911026 3129 layer_factory.hpp:74] Creating layer relu1
- I0331 20:43:48.911041 3129 net.cpp:84] Creating Layer relu1
- I0331 20:43:48.911051 3129 net.cpp:380] relu1 <- conv1
- I0331 20:43:48.911062 3129 net.cpp:327] relu1 -> conv1 (in-place)
- I0331 20:43:48.911073 3129 net.cpp:113] Setting up relu1
- I0331 20:43:48.911084 3129 net.cpp:120] Top shape: 256 96 55 55 (74342400)
- I0331 20:43:48.911093 3129 layer_factory.hpp:74] Creating layer pool1
- I0331 20:43:48.911104 3129 net.cpp:84] Creating Layer pool1
- I0331 20:43:48.911113 3129 net.cpp:380] pool1 <- conv1
- I0331 20:43:48.911121 3129 net.cpp:338] pool1 -> pool1
- I0331 20:43:48.911133 3129 net.cpp:113] Setting up pool1
- I0331 20:43:48.911149 3129 net.cpp:120] Top shape: 256 96 27 27 (17915904)
- I0331 20:43:48.911157 3129 layer_factory.hpp:74] Creating layer norm1
- I0331 20:43:48.911684 3129 net.cpp:84] Creating Layer norm1
- I0331 20:43:48.911717 3129 net.cpp:380] norm1 <- pool1
- I0331 20:43:48.911736 3129 net.cpp:338] norm1 -> norm1
- I0331 20:43:48.911757 3129 net.cpp:113] Setting up norm1
- I0331 20:43:48.911787 3129 net.cpp:120] Top shape: 256 96 27 27 (17915904)
- I0331 20:43:48.911806 3129 layer_factory.hpp:74] Creating layer conv2
- I0331 20:43:48.911836 3129 net.cpp:84] Creating Layer conv2
- I0331 20:43:48.911852 3129 net.cpp:380] conv2 <- norm1
- I0331 20:43:48.911870 3129 net.cpp:338] conv2 -> conv2
- I0331 20:43:48.911905 3129 net.cpp:113] Setting up conv2
- I0331 20:43:48.930404 3129 net.cpp:120] Top shape: 256 256 27 27 (47775744)
- I0331 20:43:48.930447 3129 layer_factory.hpp:74] Creating layer relu2
- I0331 20:43:48.930462 3129 net.cpp:84] Creating Layer relu2
- I0331 20:43:48.930472 3129 net.cpp:380] relu2 <- conv2
- I0331 20:43:48.930482 3129 net.cpp:327] relu2 -> conv2 (in-place)
- I0331 20:43:48.930493 3129 net.cpp:113] Setting up relu2
- I0331 20:43:48.930508 3129 net.cpp:120] Top shape: 256 256 27 27 (47775744)
- I0331 20:43:48.930517 3129 layer_factory.hpp:74] Creating layer pool2
- I0331 20:43:48.930527 3129 net.cpp:84] Creating Layer pool2
- I0331 20:43:48.930536 3129 net.cpp:380] pool2 <- conv2
- I0331 20:43:48.930544 3129 net.cpp:338] pool2 -> pool2
- I0331 20:43:48.930554 3129 net.cpp:113] Setting up pool2
- I0331 20:43:48.930568 3129 net.cpp:120] Top shape: 256 256 13 13 (11075584)
- I0331 20:43:48.930579 3129 layer_factory.hpp:74] Creating layer norm2
- I0331 20:43:48.930590 3129 net.cpp:84] Creating Layer norm2
- I0331 20:43:48.930600 3129 net.cpp:380] norm2 <- pool2
- I0331 20:43:48.930609 3129 net.cpp:338] norm2 -> norm2
- I0331 20:43:48.930619 3129 net.cpp:113] Setting up norm2
- I0331 20:43:48.930630 3129 net.cpp:120] Top shape: 256 256 13 13 (11075584)
- I0331 20:43:48.930639 3129 layer_factory.hpp:74] Creating layer conv3
- I0331 20:43:48.930651 3129 net.cpp:84] Creating Layer conv3
- I0331 20:43:48.930660 3129 net.cpp:380] conv3 <- norm2
- I0331 20:43:48.930670 3129 net.cpp:338] conv3 -> conv3
- I0331 20:43:48.930680 3129 net.cpp:113] Setting up conv3
- I0331 20:43:48.957094 3129 net.cpp:120] Top shape: 256 384 13 13 (16613376)
- I0331 20:43:48.957142 3129 layer_factory.hpp:74] Creating layer relu3
- I0331 20:43:48.957159 3129 net.cpp:84] Creating Layer relu3
- I0331 20:43:48.957167 3129 net.cpp:380] relu3 <- conv3
- I0331 20:43:48.957190 3129 net.cpp:327] relu3 -> conv3 (in-place)
- I0331 20:43:48.957217 3129 net.cpp:113] Setting up relu3
- I0331 20:43:48.957228 3129 net.cpp:120] Top shape: 256 384 13 13 (16613376)
- I0331 20:43:48.957237 3129 layer_factory.hpp:74] Creating layer conv4
- I0331 20:43:48.957248 3129 net.cpp:84] Creating Layer conv4
- I0331 20:43:48.957262 3129 net.cpp:380] conv4 <- conv3
- I0331 20:43:48.957273 3129 net.cpp:338] conv4 -> conv4
- I0331 20:43:48.957289 3129 net.cpp:113] Setting up conv4
- I0331 20:43:48.977625 3129 net.cpp:120] Top shape: 256 384 13 13 (16613376)
- I0331 20:43:48.977668 3129 layer_factory.hpp:74] Creating layer relu4
- I0331 20:43:48.977682 3129 net.cpp:84] Creating Layer relu4
- I0331 20:43:48.977691 3129 net.cpp:380] relu4 <- conv4
- I0331 20:43:48.977701 3129 net.cpp:327] relu4 -> conv4 (in-place)
- I0331 20:43:48.977713 3129 net.cpp:113] Setting up relu4
- I0331 20:43:48.977723 3129 net.cpp:120] Top shape: 256 384 13 13 (16613376)
- I0331 20:43:48.977731 3129 layer_factory.hpp:74] Creating layer conv5
- I0331 20:43:48.977743 3129 net.cpp:84] Creating Layer conv5
- I0331 20:43:48.977751 3129 net.cpp:380] conv5 <- conv4
- I0331 20:43:48.977764 3129 net.cpp:338] conv5 -> conv5
- I0331 20:43:48.977777 3129 net.cpp:113] Setting up conv5
- I0331 20:43:48.991324 3129 net.cpp:120] Top shape: 256 256 13 13 (11075584)
- I0331 20:43:48.991360 3129 layer_factory.hpp:74] Creating layer relu5
- I0331 20:43:48.991375 3129 net.cpp:84] Creating Layer relu5
- I0331 20:43:48.991384 3129 net.cpp:380] relu5 <- conv5
- I0331 20:43:48.991395 3129 net.cpp:327] relu5 -> conv5 (in-place)
- I0331 20:43:48.991405 3129 net.cpp:113] Setting up relu5
- I0331 20:43:48.991415 3129 net.cpp:120] Top shape: 256 256 13 13 (11075584)
- I0331 20:43:48.991423 3129 layer_factory.hpp:74] Creating layer pool5
- I0331 20:43:48.991433 3129 net.cpp:84] Creating Layer pool5
- I0331 20:43:48.991447 3129 net.cpp:380] pool5 <- conv5
- I0331 20:43:48.991458 3129 net.cpp:338] pool5 -> pool5
- I0331 20:43:48.991471 3129 net.cpp:113] Setting up pool5
- I0331 20:43:48.991483 3129 net.cpp:120] Top shape: 256 256 6 6 (2359296)
- I0331 20:43:48.991492 3129 layer_factory.hpp:74] Creating layer fc6
- I0331 20:43:48.991508 3129 net.cpp:84] Creating Layer fc6
- I0331 20:43:48.991518 3129 net.cpp:380] fc6 <- pool5
- I0331 20:43:48.991529 3129 net.cpp:338] fc6 -> fc6
- I0331 20:43:48.991540 3129 net.cpp:113] Setting up fc6
- I0331 20:43:50.034123 3129 net.cpp:120] Top shape: 256 4096 (1048576)
- I0331 20:43:50.034180 3129 layer_factory.hpp:74] Creating layer relu6
- I0331 20:43:50.034198 3129 net.cpp:84] Creating Layer relu6
- I0331 20:43:50.034207 3129 net.cpp:380] relu6 <- fc6
- I0331 20:43:50.034219 3129 net.cpp:327] relu6 -> fc6 (in-place)
- I0331 20:43:50.034230 3129 net.cpp:113] Setting up relu6
- I0331 20:43:50.034241 3129 net.cpp:120] Top shape: 256 4096 (1048576)
- I0331 20:43:50.034250 3129 layer_factory.hpp:74] Creating layer fc7
- I0331 20:43:50.034265 3129 net.cpp:84] Creating Layer fc7
- I0331 20:43:50.034278 3129 net.cpp:380] fc7 <- fc6
- I0331 20:43:50.034288 3129 net.cpp:338] fc7 -> fc7
- I0331 20:43:50.034299 3129 net.cpp:113] Setting up fc7
- I0331 20:43:50.497591 3129 net.cpp:120] Top shape: 256 4096 (1048576)
- I0331 20:43:50.497642 3129 layer_factory.hpp:74] Creating layer relu7
- I0331 20:43:50.497656 3129 net.cpp:84] Creating Layer relu7
- I0331 20:43:50.497666 3129 net.cpp:380] relu7 <- fc7
- I0331 20:43:50.497680 3129 net.cpp:327] relu7 -> fc7 (in-place)
- I0331 20:43:50.497692 3129 net.cpp:113] Setting up relu7
- I0331 20:43:50.497704 3129 net.cpp:120] Top shape: 256 4096 (1048576)
- I0331 20:43:50.497711 3129 net.cpp:169] relu7 does not need backward computation.
- I0331 20:43:50.497720 3129 net.cpp:169] fc7 does not need backward computation.
- I0331 20:43:50.497727 3129 net.cpp:169] relu6 does not need backward computation.
- I0331 20:43:50.497735 3129 net.cpp:169] fc6 does not need backward computation.
- I0331 20:43:50.497743 3129 net.cpp:169] pool5 does not need backward computation.
- I0331 20:43:50.497751 3129 net.cpp:169] relu5 does not need backward computation.
- I0331 20:43:50.497758 3129 net.cpp:169] conv5 does not need backward computation.
- I0331 20:43:50.497766 3129 net.cpp:169] relu4 does not need backward computation.
- I0331 20:43:50.497774 3129 net.cpp:169] conv4 does not need backward computation.
- I0331 20:43:50.497782 3129 net.cpp:169] relu3 does not need backward computation.
- I0331 20:43:50.497791 3129 net.cpp:169] conv3 does not need backward computation.
- I0331 20:43:50.497798 3129 net.cpp:169] norm2 does not need backward computation.
- I0331 20:43:50.497807 3129 net.cpp:169] pool2 does not need backward computation.
- I0331 20:43:50.497814 3129 net.cpp:169] relu2 does not need backward computation.
- I0331 20:43:50.497822 3129 net.cpp:169] conv2 does not need backward computation.
- I0331 20:43:50.497830 3129 net.cpp:169] norm1 does not need backward computation.
- I0331 20:43:50.497838 3129 net.cpp:169] pool1 does not need backward computation.
- I0331 20:43:50.497846 3129 net.cpp:169] relu1 does not need backward computation.
- I0331 20:43:50.497854 3129 net.cpp:169] conv1 does not need backward computation.
- I0331 20:43:50.497861 3129 net.cpp:205] This network produces output fc7
- I0331 20:43:50.497879 3129 net.cpp:447] Collecting Learning Rate and Weight Decay.
- I0331 20:43:50.497895 3129 net.cpp:217] Network initialization done.
- I0331 20:43:50.497905 3129 net.cpp:218] Memory required for data: 1589510144
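The "Memory required for data" figure can be cross-checked against the `Top shape` lines above: it is the sum of every top blob's element count (in-place ReLU tops included, the input blob excluded) times 4 bytes per float32. A quick sketch of that arithmetic, with counts copied from the log:

```python
# Element counts of every "Top shape" logged above (float32, 4 bytes each).
# In-place ReLU tops are counted too, matching Caffe's accounting here.
top_counts = (
    [74342400] * 2    # conv1, relu1
    + [17915904] * 2  # pool1, norm1
    + [47775744] * 2  # conv2, relu2
    + [11075584] * 2  # pool2, norm2
    + [16613376] * 4  # conv3, relu3, conv4, relu4
    + [11075584] * 2  # conv5, relu5
    + [2359296]       # pool5
    + [1048576] * 4   # fc6, relu6, fc7, relu7
)

total_bytes = sum(top_counts) * 4  # sizeof(float32)
print(total_bytes)  # 1589510144, matching the net.cpp:218 line above
```

Roughly 1.5 GB of host memory for activations alone, before model weights are loaded, which is relevant to the crash that follows.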
- E0331 20:43:53.882091 3129 upgrade_proto.cpp:609] Attempting to upgrade input file specified using deprecated transformation parameters: ./data/caffe_nets/finetune_voc_2012_train_iter_70k
- I0331 20:43:53.882133 3129 upgrade_proto.cpp:612] Successfully upgraded file specified using deprecated data transformation parameters.
- E0331 20:43:53.882143 3129 upgrade_proto.cpp:614] Note that future Caffe releases will only support transform_param messages for transformation fields.
- E0331 20:43:53.882150 3129 upgrade_proto.cpp:618] Attempting to upgrade input file specified using deprecated V1LayerParameter: ./data/caffe_nets/finetune_voc_2012_train_iter_70k
- I0331 20:43:54.077909 3129 upgrade_proto.cpp:626] Successfully upgraded file specified using deprecated V1LayerParameter
- F0331 20:44:24.422291 3129 syncedmem.hpp:27] Check failed: *ptr host allocation of size 44302336 failed
- *** Check failure stack trace: ***
- ./matlab: line 1: 2990 Killed LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libfreetype.so.6 matlab
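The run ends with a fatal `syncedmem` check: a host (CPU) allocation of 44,302,336 bytes failed, after which the MATLAB wrapper process was killed, the usual signature of exhausting system memory. Note that 44302336 = 256 × 256 × 13 × 13 × 4, i.e. exactly one float32 blob of shape `256 256 13 13` from the log above, and the activations alone already need about 1.5 GB on top of the ~200 MB of fc6/fc7 weights. A common workaround, assuming the full batch of 256 crops is not required at once, is to shrink the batch dimension in the prototxt and feed the crops in smaller chunks. A hypothetical edit to `./model-defs/rcnn_batch_256_output_fc7.prototxt`:

```
# Reduce the batch dimension so activation memory drops proportionally
# (64 instead of 256 cuts the ~1.5 GB activation footprint to ~400 MB).
input: "data"
input_dim: 64   # was 256
input_dim: 3
input_dim: 227
input_dim: 227
```

The caller must then submit at most 64 windows per forward pass; freeing memory on the machine (or adding swap) is the alternative if the batch size is fixed by the surrounding code.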