- ERROR - 19:53:02 - Cannot create Cublas handle. Cublas won't be available.
- ERROR - 19:53:02 - Cannot create Cusparse handle. Cusparse won't be available.
- WARNING: Logging before InitGoogleLogging() is written to STDERR
- I0318 19:53:02.194797 9291 caffelib.cc:132] instantiating model template mlp
- I0318 19:53:02.194895 9291 caffelib.cc:136] source=../templates/caffe//mlp/
- I0318 19:53:02.194933 9291 caffelib.cc:137] dest=models/n20/mlp.prototxt
- ERROR - 19:53:02 - Cannot create Curand generator. Curand won't be available.
- INFO - 19:53:02 - Sun Mar 18 19:53:02 2018 UTC - 127.0.0.1 "PUT /services/n20" 201 588
- INFO - 19:53:13 - Sun Mar 18 19:53:13 2018 UTC - 127.0.0.1 "POST /train" 201 0
- I0318 19:53:13.997033 9367 txtinputfileconn.cc:74] txtinputfileconn: list subdirs size=20
- I0318 19:53:15.462141 9367 txtinputfileconn.cc:231] vocabulary size=18488
- data split test size=3770 / remaining data size=15078
- vocab size=18488
- I0318 19:53:16.722906 9367 caffelib.cc:2203] user batch_size=300 / inputc batch_size=15078
- I0318 19:53:16.722939 9367 caffelib.cc:2248] batch_size=359 / test_batch_size=290 / test_iter=13
- ERROR - 19:53:16 - Cannot create Cublas handle. Cublas won't be available.
- ERROR - 19:53:16 - Cannot create Cusparse handle. Cusparse won't be available.
- ERROR - 19:53:16 - Cannot create Curand generator. Curand won't be available.
- INFO - 19:53:16 - Initializing solver from parameters:
- ERROR - 19:53:16 - Curand not available. Skipping setting the curand seed.
- INFO - 19:53:16 - Creating training net specified in net_param.
- INFO - 19:53:16 - The NetState phase (0) differed from the phase (1) specified by a rule in layer inputl
- INFO - 19:53:16 - The NetState phase (0) differed from the phase (1) specified by a rule in layer probt
- INFO - 19:53:16 - Initializing net from parameters:
- INFO - 19:53:16 - Creating layer / name=inputl / type=MemoryData
- INFO - 19:53:16 - Creating Layer inputl
- INFO - 19:53:16 - inputl -> data
- INFO - 19:53:16 - inputl -> label
- INFO - 19:53:16 - Setting up inputl
- INFO - 19:53:16 - Top shape: 359 18488 1 1 (6637192)
- INFO - 19:53:16 - Top shape: 359 (359)
- INFO - 19:53:16 - Memory required for data: 26550204
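The "Memory required for data" counter is a running total over all top blobs, and the figures are consistent with 4-byte float32 elements: the inputl data top (359 × 18488 × 1 × 1 = 6637192 elements) plus the label top (359 elements) gives 26550204 bytes, and each subsequent layer's top adds its own element count × 4. A quick arithmetic check against the log:

```python
FLOAT32_BYTES = 4  # assumption: blobs are single-precision floats

data_elems  = 359 * 18488 * 1 * 1  # inputl "data" top: 6637192 elements
label_elems = 359                  # inputl "label" top
mem = (data_elems + label_elems) * FLOAT32_BYTES  # 26550204, as logged

# fc_data then adds its 359 x 200 output blob:
mem += 359 * 200 * FLOAT32_BYTES  # 26837404, matching the next counter
```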
- INFO - 19:53:16 - Creating layer / name=fc_data / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_data
- INFO - 19:53:16 - fc_data <- data
- INFO - 19:53:16 - fc_data -> ip0
- INFO - 19:53:16 - Setting up fc_data
- INFO - 19:53:16 - Top shape: 359 200 (71800)
- INFO - 19:53:16 - Memory required for data: 26837404
- INFO - 19:53:16 - Creating layer / name=act_ReLU_ip0 / type=ReLU
- INFO - 19:53:16 - Creating Layer act_ReLU_ip0
- INFO - 19:53:16 - act_ReLU_ip0 <- ip0
- INFO - 19:53:16 - act_ReLU_ip0 -> ip0 (in-place)
- INFO - 19:53:16 - Setting up act_ReLU_ip0
- INFO - 19:53:16 - Top shape: 359 200 (71800)
- INFO - 19:53:16 - Memory required for data: 27124604
- INFO - 19:53:16 - Creating layer / name=fc_ip0 / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_ip0
- INFO - 19:53:16 - fc_ip0 <- ip0
- INFO - 19:53:16 - fc_ip0 -> ip1
- INFO - 19:53:16 - Setting up fc_ip0
- INFO - 19:53:16 - Top shape: 359 200 (71800)
- INFO - 19:53:16 - Memory required for data: 27411804
- INFO - 19:53:16 - Creating layer / name=act_ReLU_ip1 / type=ReLU
- INFO - 19:53:16 - Creating Layer act_ReLU_ip1
- INFO - 19:53:16 - act_ReLU_ip1 <- ip1
- INFO - 19:53:16 - act_ReLU_ip1 -> ip1 (in-place)
- INFO - 19:53:16 - Setting up act_ReLU_ip1
- INFO - 19:53:16 - Top shape: 359 200 (71800)
- INFO - 19:53:16 - Memory required for data: 27699004
- INFO - 19:53:16 - Creating layer / name=fc_ip1 / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_ip1
- INFO - 19:53:16 - fc_ip1 <- ip1
- INFO - 19:53:16 - fc_ip1 -> ip_losst
- INFO - 19:53:16 - Setting up fc_ip1
- INFO - 19:53:16 - Top shape: 359 20 (7180)
- INFO - 19:53:16 - Memory required for data: 27727724
- INFO - 19:53:16 - Creating layer / name=prob / type=SoftmaxWithLoss
- INFO - 19:53:16 - Creating Layer prob
- INFO - 19:53:16 - prob <- ip_losst
- INFO - 19:53:16 - prob <- label
- INFO - 19:53:16 - prob -> losst
- INFO - 19:53:16 - Creating layer / name=prob / type=Softmax
- INFO - 19:53:16 - Setting up prob
- INFO - 19:53:16 - Top shape: (1)
- INFO - 19:53:16 - with loss weight 1
- INFO - 19:53:16 - Memory required for data: 27727728
- INFO - 19:53:16 - prob needs backward computation.
- INFO - 19:53:16 - fc_ip1 needs backward computation.
- INFO - 19:53:16 - act_ReLU_ip1 needs backward computation.
- INFO - 19:53:16 - fc_ip0 needs backward computation.
- INFO - 19:53:16 - act_ReLU_ip0 needs backward computation.
- INFO - 19:53:16 - fc_data needs backward computation.
- INFO - 19:53:16 - inputl does not need backward computation.
- INFO - 19:53:16 - This network produces output losst
- INFO - 19:53:16 - Network initialization done.
- INFO - 19:53:16 - Creating test net (#0) specified by net_param
- INFO - 19:53:16 - The NetState phase (1) differed from the phase (0) specified by a rule in layer inputl
- INFO - 19:53:16 - The NetState phase (1) differed from the phase (0) specified by a rule in layer prob
- INFO - 19:53:16 - Initializing net from parameters:
- INFO - 19:53:16 - Creating layer / name=inputl / type=MemoryData
- INFO - 19:53:16 - Creating Layer inputl
- INFO - 19:53:16 - inputl -> data
- INFO - 19:53:16 - inputl -> label
- INFO - 19:53:16 - Setting up inputl
- INFO - 19:53:16 - Top shape: 290 18488 1 1 (5361520)
- INFO - 19:53:16 - Top shape: 290 (290)
- INFO - 19:53:16 - Memory required for data: 21447240
- INFO - 19:53:16 - Creating layer / name=fc_data / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_data
- INFO - 19:53:16 - fc_data <- data
- INFO - 19:53:16 - fc_data -> ip0
- INFO - 19:53:16 - Setting up fc_data
- INFO - 19:53:16 - Top shape: 290 200 (58000)
- INFO - 19:53:16 - Memory required for data: 21679240
- INFO - 19:53:16 - Creating layer / name=act_ReLU_ip0 / type=ReLU
- INFO - 19:53:16 - Creating Layer act_ReLU_ip0
- INFO - 19:53:16 - act_ReLU_ip0 <- ip0
- INFO - 19:53:16 - act_ReLU_ip0 -> ip0 (in-place)
- INFO - 19:53:16 - Setting up act_ReLU_ip0
- INFO - 19:53:16 - Top shape: 290 200 (58000)
- INFO - 19:53:16 - Memory required for data: 21911240
- INFO - 19:53:16 - Creating layer / name=fc_ip0 / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_ip0
- INFO - 19:53:16 - fc_ip0 <- ip0
- INFO - 19:53:16 - fc_ip0 -> ip1
- INFO - 19:53:16 - Setting up fc_ip0
- INFO - 19:53:16 - Top shape: 290 200 (58000)
- INFO - 19:53:16 - Memory required for data: 22143240
- INFO - 19:53:16 - Creating layer / name=act_ReLU_ip1 / type=ReLU
- INFO - 19:53:16 - Creating Layer act_ReLU_ip1
- INFO - 19:53:16 - act_ReLU_ip1 <- ip1
- INFO - 19:53:16 - act_ReLU_ip1 -> ip1 (in-place)
- INFO - 19:53:16 - Setting up act_ReLU_ip1
- INFO - 19:53:16 - Top shape: 290 200 (58000)
- INFO - 19:53:16 - Memory required for data: 22375240
- INFO - 19:53:16 - Creating layer / name=fc_ip1 / type=InnerProduct
- INFO - 19:53:16 - Creating Layer fc_ip1
- INFO - 19:53:16 - fc_ip1 <- ip1
- INFO - 19:53:16 - fc_ip1 -> ip_losst
- INFO - 19:53:16 - Setting up fc_ip1
- INFO - 19:53:16 - Top shape: 290 20 (5800)
- INFO - 19:53:16 - Memory required for data: 22398440
- INFO - 19:53:16 - Creating layer / name=probt / type=Softmax
- INFO - 19:53:16 - Creating Layer probt
- INFO - 19:53:16 - probt <- ip_losst
- INFO - 19:53:16 - probt -> losst
- INFO - 19:53:16 - Setting up probt
- INFO - 19:53:16 - Top shape: 290 20 (5800)
- INFO - 19:53:16 - Memory required for data: 22421640
- INFO - 19:53:16 - probt does not need backward computation.
- INFO - 19:53:16 - fc_ip1 does not need backward computation.
- INFO - 19:53:16 - act_ReLU_ip1 does not need backward computation.
- INFO - 19:53:16 - fc_ip0 does not need backward computation.
- INFO - 19:53:16 - act_ReLU_ip0 does not need backward computation.
- INFO - 19:53:16 - fc_data does not need backward computation.
- INFO - 19:53:16 - inputl does not need backward computation.
- INFO - 19:53:16 - This network produces output label
- INFO - 19:53:16 - This network produces output losst
- INFO - 19:53:16 - Network initialization done.
- I0318 19:53:16.788844 9367 caffelib.cc:750] filling up net prior to training
- INFO - 19:53:16 - Solver scaffolding done.
- ERROR - 19:53:19 - service n20 training status call failed
- ERROR - 19:53:19 - {"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"src/caffe/syncedmem.cpp:56 / Check failed (custom): (error) == (cudaSuccess)"}
- INFO - 19:53:19 - Sun Mar 18 19:53:19 2018 UTC - 127.0.0.1 "GET /train?service=n20&job=1" 200 0
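The training job itself dies on a CUDA check (`syncedmem.cpp:56 / (error) == (cudaSuccess)`), consistent with the earlier Cublas/Cusparse/Curand handle failures: no usable GPU. The status call returns the cause as a JSON body; a minimal sketch of pulling out the backend-specific fields, using the exact payload from the log above:

```python
import json

# Error body returned by the training status call (copied from the log)
body = ('{"code":500,"msg":"InternalError","dd_code":1007,'
        '"dd_msg":"src/caffe/syncedmem.cpp:56 / Check failed (custom): '
        '(error) == (cudaSuccess)"}')

err = json.loads(body)
if err.get("code", 200) >= 500:
    # dd_code / dd_msg carry the backend cause; here a CUDA memory
    # check failed inside Caffe, i.e. the GPU could not be used.
    print(err["dd_code"], err["dd_msg"])
```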