(minerva_venv) patryk.miziula@pascal01:/mnt/ml-team/homes/patryk.miziula/Minerva/minerva$ CUDA_VISIBLE_DEVICES=2 python run_minerva.py -- dry_run --problem fashion_mnist
2018-01-30 17-10-26 minerva >>> starting experiment...
Using TensorFlow backend.
2018-01-30 17-10-27 minerva >>> running: None
neptune: Executing in Offline Mode.
2018-01-30 17-10-27 minerva >>> Saving graph to output/path_to_your_solution/class_predictions_graph.json
2018-01-30 17-10-28 minerva >>> step input unpacking inputs
2018-01-30 17-10-28 minerva >>> step input loading...
2018-01-30 17-10-28 minerva >>> step input transforming...
2018-01-30 17-10-28 minerva >>> step keras_model unpacking inputs
Epoch 1/200
2018-01-30 17:10:28.435787: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-30 17:10:28.435811: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-30 17:10:28.435816: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2018-01-30 17:10:28.435821: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2018-01-30 17:10:28.435826: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
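The cpu_feature_guard warnings above are purely informational: the prebuilt TensorFlow wheel was not compiled with SSE4.1/SSE4.2/AVX/AVX2/FMA support, so the CPU path is slower than it could be, but the run itself is unaffected (the heavy lifting happens on the GPU anyway). If the noise is unwanted, a common workaround, sketched below and not something Minerva does by itself, is to raise TensorFlow's native log level before the first import:

    import os
    os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'  # 1 hides INFO, 2 also hides these WARNING lines, 3 leaves only errors
    import tensorflow as tf                   # must come after the variable is set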
2018-01-30 17:10:31.195282: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 0 with properties:
name: GeForce GTX 1070
major: 6 minor: 1 memoryClockRate (GHz) 1.721
pciBusID 0000:09:00.0
Total memory: 7.92GiB
Free memory: 7.83GiB
2018-01-30 17:10:31.195309: I tensorflow/core/common_runtime/gpu/gpu_device.cc:961] DMA: 0
2018-01-30 17:10:31.195315: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0:   Y
2018-01-30 17:10:31.195322: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1030] Creating TensorFlow device (/gpu:0) -> (device: 0, name: GeForce GTX 1070, pci bus id: 0000:09:00.0)
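Because the shell exported CUDA_VISIBLE_DEVICES=2, only that physical GPU is visible to the process and TensorFlow re-enumerates it as device 0, which is why the GTX 1070 shows up as /gpu:0 here. A quick way to confirm what a TF 1.x process can actually see (this snippet is illustrative only and not part of run_minerva.py):

    import os
    os.environ['CUDA_VISIBLE_DEVICES'] = '2'   # same masking as on the command line above
    from tensorflow.python.client import device_lib

    # Prints the CPU plus every GPU that survives the CUDA_VISIBLE_DEVICES mask.
    print([d.name for d in device_lib.list_local_devices()])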
188/188 [==============================] - 11s 57ms/step - loss: 1.7692 - acc: 0.3435 - val_loss: 0.9480 - val_acc: 0.6308
Epoch 2/200
188/188 [==============================] - 7s 38ms/step - loss: 0.9336 - acc: 0.6457 - val_loss: 0.6975 - val_acc: 0.7243
Epoch 3/200
188/188 [==============================] - 7s 37ms/step - loss: 0.7753 - acc: 0.7008 - val_loss: 0.6136 - val_acc: 0.7528
Epoch 4/200
188/188 [==============================] - 6s 35ms/step - loss: 0.7086 - acc: 0.7254 - val_loss: 0.5709 - val_acc: 0.7702
Epoch 5/200
188/188 [==============================] - 7s 36ms/step - loss: 0.6619 - acc: 0.7417 - val_loss: 0.5489 - val_acc: 0.7899
Epoch 6/200
188/188 [==============================] - 7s 36ms/step - loss: 0.6254 - acc: 0.7585 - val_loss: 0.5142 - val_acc: 0.7982
Epoch 7/200
188/188 [==============================] - 7s 36ms/step - loss: 0.6018 - acc: 0.7690 - val_loss: 0.4885 - val_acc: 0.8122
Epoch 8/200
188/188 [==============================] - 7s 36ms/step - loss: 0.5769 - acc: 0.7788 - val_loss: 0.4948 - val_acc: 0.8119
Epoch 9/200
188/188 [==============================] - 7s 36ms/step - loss: 0.5581 - acc: 0.7849 - val_loss: 0.4590 - val_acc: 0.8235
Epoch 10/200
188/188 [==============================] - 7s 36ms/step - loss: 0.5459 - acc: 0.7922 - val_loss: 0.4569 - val_acc: 0.8238
Epoch 11/200
188/188 [==============================] - 7s 38ms/step - loss: 0.5315 - acc: 0.7969 - val_loss: 0.4402 - val_acc: 0.8330
Epoch 12/200
188/188 [==============================] - 7s 36ms/step - loss: 0.5167 - acc: 0.8037 - val_loss: 0.4322 - val_acc: 0.8368
Epoch 13/200
188/188 [==============================] - 7s 35ms/step - loss: 0.5012 - acc: 0.8111 - val_loss: 0.4068 - val_acc: 0.8466
Epoch 14/200
188/188 [==============================] - 7s 35ms/step - loss: 0.4870 - acc: 0.8169 - val_loss: 0.4149 - val_acc: 0.8425
Epoch 15/200
188/188 [==============================] - 7s 36ms/step - loss: 0.4864 - acc: 0.8185 - val_loss: 0.3933 - val_acc: 0.8515
Epoch 16/200
188/188 [==============================] - 7s 37ms/step - loss: 0.4695 - acc: 0.8233 - val_loss: 0.3903 - val_acc: 0.8564
Epoch 17/200
188/188 [==============================] - 7s 36ms/step - loss: 0.4577 - acc: 0.8268 - val_loss: 0.3909 - val_acc: 0.8532
Epoch 18/200
188/188 [==============================] - 7s 37ms/step - loss: 0.4509 - acc: 0.8325 - val_loss: 0.3758 - val_acc: 0.8586
Epoch 19/200
188/188 [==============================] - 7s 37ms/step - loss: 0.4418 - acc: 0.8334 - val_loss: 0.3662 - val_acc: 0.8627
Epoch 20/200
188/188 [==============================] - 7s 38ms/step - loss: 0.4341 - acc: 0.8382 - val_loss: 0.3587 - val_acc: 0.8660
Epoch 21/200
188/188 [==============================] - 7s 36ms/step - loss: 0.4288 - acc: 0.8393 - val_loss: 0.3530 - val_acc: 0.8659
Epoch 22/200
188/188 [==============================] - 7s 37ms/step - loss: 0.4222 - acc: 0.8427 - val_loss: 0.3490 - val_acc: 0.8698
Epoch 23/200
188/188 [==============================] - 7s 36ms/step - loss: 0.4179 - acc: 0.8468 - val_loss: 0.3451 - val_acc: 0.8716
Epoch 24/200
188/188 [==============================] - 7s 39ms/step - loss: 0.4129 - acc: 0.8465 - val_loss: 0.3434 - val_acc: 0.8720
Epoch 25/200
188/188 [==============================] - 7s 35ms/step - loss: 0.4054 - acc: 0.8510 - val_loss: 0.3393 - val_acc: 0.8697
Epoch 26/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3990 - acc: 0.8528 - val_loss: 0.3281 - val_acc: 0.8763
Epoch 27/200
188/188 [==============================] - 7s 38ms/step - loss: 0.3954 - acc: 0.8538 - val_loss: 0.3241 - val_acc: 0.8787
Epoch 28/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3899 - acc: 0.8547 - val_loss: 0.3219 - val_acc: 0.8788
Epoch 29/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3861 - acc: 0.8560 - val_loss: 0.3100 - val_acc: 0.8847
Epoch 30/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3844 - acc: 0.8586 - val_loss: 0.3066 - val_acc: 0.8857
Epoch 31/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3782 - acc: 0.8605 - val_loss: 0.3103 - val_acc: 0.8839
Epoch 32/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3746 - acc: 0.8611 - val_loss: 0.3055 - val_acc: 0.8883
Epoch 33/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3680 - acc: 0.8644 - val_loss: 0.3192 - val_acc: 0.8808
Epoch 34/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3643 - acc: 0.8662 - val_loss: 0.3020 - val_acc: 0.8877
Epoch 35/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3603 - acc: 0.8660 - val_loss: 0.2974 - val_acc: 0.8890
Epoch 36/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3595 - acc: 0.8667 - val_loss: 0.3025 - val_acc: 0.8922
Epoch 37/200
188/188 [==============================] - 7s 38ms/step - loss: 0.3561 - acc: 0.8690 - val_loss: 0.3020 - val_acc: 0.8896
Epoch 38/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3503 - acc: 0.8710 - val_loss: 0.2977 - val_acc: 0.8889
Epoch 39/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3495 - acc: 0.8709 - val_loss: 0.2985 - val_acc: 0.8900
Epoch 40/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3476 - acc: 0.8731 - val_loss: 0.2979 - val_acc: 0.8893
Epoch 41/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3464 - acc: 0.8717 - val_loss: 0.2962 - val_acc: 0.8886
Epoch 42/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3396 - acc: 0.8747 - val_loss: 0.2969 - val_acc: 0.8891
Epoch 43/200
188/188 [==============================] - 7s 38ms/step - loss: 0.3360 - acc: 0.8757 - val_loss: 0.2813 - val_acc: 0.8958
Epoch 44/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3365 - acc: 0.8757 - val_loss: 0.2960 - val_acc: 0.8870
Epoch 45/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3345 - acc: 0.8759 - val_loss: 0.2781 - val_acc: 0.8962
Epoch 46/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3324 - acc: 0.8771 - val_loss: 0.2826 - val_acc: 0.8960
Epoch 47/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3331 - acc: 0.8776 - val_loss: 0.2857 - val_acc: 0.8932
Epoch 48/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3237 - acc: 0.8817 - val_loss: 0.2714 - val_acc: 0.9000
Epoch 49/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3264 - acc: 0.8799 - val_loss: 0.2814 - val_acc: 0.8963
Epoch 50/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3188 - acc: 0.8837 - val_loss: 0.2725 - val_acc: 0.8987
Epoch 51/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3186 - acc: 0.8814 - val_loss: 0.2694 - val_acc: 0.9007
Epoch 52/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3165 - acc: 0.8842 - val_loss: 0.2686 - val_acc: 0.9016
Epoch 53/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3173 - acc: 0.8821 - val_loss: 0.2633 - val_acc: 0.9022
Epoch 54/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3135 - acc: 0.8850 - val_loss: 0.2794 - val_acc: 0.8973
Epoch 55/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3108 - acc: 0.8862 - val_loss: 0.2597 - val_acc: 0.9042
Epoch 56/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3118 - acc: 0.8839 - val_loss: 0.2621 - val_acc: 0.9039
Epoch 57/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3074 - acc: 0.8884 - val_loss: 0.2611 - val_acc: 0.9045
Epoch 58/200
188/188 [==============================] - 7s 36ms/step - loss: 0.3077 - acc: 0.8866 - val_loss: 0.2611 - val_acc: 0.9025
Epoch 59/200
188/188 [==============================] - 7s 38ms/step - loss: 0.3058 - acc: 0.8863 - val_loss: 0.2581 - val_acc: 0.9038
Epoch 60/200
188/188 [==============================] - 7s 35ms/step - loss: 0.3051 - acc: 0.8881 - val_loss: 0.2678 - val_acc: 0.8989
Epoch 61/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3007 - acc: 0.8891 - val_loss: 0.2560 - val_acc: 0.9077
Epoch 62/200
188/188 [==============================] - 7s 37ms/step - loss: 0.3009 - acc: 0.8885 - val_loss: 0.2591 - val_acc: 0.9048
Epoch 63/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2993 - acc: 0.8887 - val_loss: 0.2525 - val_acc: 0.9080
Epoch 64/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2950 - acc: 0.8907 - val_loss: 0.2619 - val_acc: 0.9020
Epoch 65/200
188/188 [==============================] - 7s 38ms/step - loss: 0.2945 - acc: 0.8931 - val_loss: 0.2547 - val_acc: 0.9074
Epoch 66/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2958 - acc: 0.8910 - val_loss: 0.2525 - val_acc: 0.9053
Epoch 67/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2943 - acc: 0.8919 - val_loss: 0.2551 - val_acc: 0.9066
Epoch 68/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2909 - acc: 0.8932 - val_loss: 0.2546 - val_acc: 0.9061
Epoch 69/200
188/188 [==============================] - 6s 34ms/step - loss: 0.2901 - acc: 0.8947 - val_loss: 0.2448 - val_acc: 0.9111
Epoch 70/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2885 - acc: 0.8932 - val_loss: 0.2491 - val_acc: 0.9098
Epoch 71/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2883 - acc: 0.8946 - val_loss: 0.2464 - val_acc: 0.9092
Epoch 72/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2865 - acc: 0.8936 - val_loss: 0.2515 - val_acc: 0.9066
Epoch 73/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2843 - acc: 0.8962 - val_loss: 0.2480 - val_acc: 0.9090
Epoch 74/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2847 - acc: 0.8948 - val_loss: 0.2466 - val_acc: 0.9101
Epoch 75/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2808 - acc: 0.8969 - val_loss: 0.2543 - val_acc: 0.9062
Epoch 76/200
188/188 [==============================] - 6s 35ms/step - loss: 0.2824 - acc: 0.8968 - val_loss: 0.2571 - val_acc: 0.9047
Epoch 77/200
188/188 [==============================] - 7s 38ms/step - loss: 0.2797 - acc: 0.8966 - val_loss: 0.2460 - val_acc: 0.9064
Epoch 78/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2788 - acc: 0.8977 - val_loss: 0.2445 - val_acc: 0.9097
Epoch 79/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2778 - acc: 0.8978 - val_loss: 0.2477 - val_acc: 0.9092
Epoch 80/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2745 - acc: 0.8993 - val_loss: 0.2395 - val_acc: 0.9119
Epoch 81/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2752 - acc: 0.8985 - val_loss: 0.2451 - val_acc: 0.9092
Epoch 82/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2779 - acc: 0.8983 - val_loss: 0.2378 - val_acc: 0.9116
Epoch 83/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2742 - acc: 0.8992 - val_loss: 0.2395 - val_acc: 0.9115
Epoch 84/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2709 - acc: 0.9002 - val_loss: 0.2406 - val_acc: 0.9102
Epoch 85/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2686 - acc: 0.9009 - val_loss: 0.2385 - val_acc: 0.9147
Epoch 86/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2678 - acc: 0.9012 - val_loss: 0.2377 - val_acc: 0.9105
Epoch 87/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2691 - acc: 0.9015 - val_loss: 0.2423 - val_acc: 0.9093
Epoch 88/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2666 - acc: 0.9017 - val_loss: 0.2320 - val_acc: 0.9136
Epoch 89/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2683 - acc: 0.9019 - val_loss: 0.2362 - val_acc: 0.9125
Epoch 90/200
188/188 [==============================] - 7s 38ms/step - loss: 0.2631 - acc: 0.9038 - val_loss: 0.2301 - val_acc: 0.9161
Epoch 91/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2623 - acc: 0.9035 - val_loss: 0.2337 - val_acc: 0.9147
Epoch 92/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2642 - acc: 0.9026 - val_loss: 0.2348 - val_acc: 0.9127
Epoch 93/200
188/188 [==============================] - 6s 34ms/step - loss: 0.2610 - acc: 0.9030 - val_loss: 0.2347 - val_acc: 0.9120
Epoch 94/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2606 - acc: 0.9047 - val_loss: 0.2295 - val_acc: 0.9175
Epoch 95/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2622 - acc: 0.9040 - val_loss: 0.2393 - val_acc: 0.9128
Epoch 96/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2616 - acc: 0.9045 - val_loss: 0.2367 - val_acc: 0.9122
Epoch 97/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2570 - acc: 0.9049 - val_loss: 0.2385 - val_acc: 0.9127
Epoch 98/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2545 - acc: 0.9075 - val_loss: 0.2428 - val_acc: 0.9109
Epoch 99/200
188/188 [==============================] - 7s 38ms/step - loss: 0.2569 - acc: 0.9059 - val_loss: 0.2325 - val_acc: 0.9143
Epoch 100/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2559 - acc: 0.9045 - val_loss: 0.2292 - val_acc: 0.9146
Epoch 101/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2551 - acc: 0.9059 - val_loss: 0.2260 - val_acc: 0.9170
Epoch 102/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2521 - acc: 0.9071 - val_loss: 0.2380 - val_acc: 0.9132
Epoch 103/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2519 - acc: 0.9066 - val_loss: 0.2305 - val_acc: 0.9164
Epoch 104/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2528 - acc: 0.9076 - val_loss: 0.2306 - val_acc: 0.9162
Epoch 105/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2502 - acc: 0.9067 - val_loss: 0.2290 - val_acc: 0.9157
Epoch 106/200
188/188 [==============================] - 7s 39ms/step - loss: 0.2478 - acc: 0.9102 - val_loss: 0.2234 - val_acc: 0.9179
Epoch 107/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2469 - acc: 0.9097 - val_loss: 0.2316 - val_acc: 0.9155
Epoch 108/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2477 - acc: 0.9082 - val_loss: 0.2321 - val_acc: 0.9148
Epoch 109/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2464 - acc: 0.9088 - val_loss: 0.2260 - val_acc: 0.9166
Epoch 110/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2454 - acc: 0.9089 - val_loss: 0.2246 - val_acc: 0.9184
Epoch 111/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2453 - acc: 0.9101 - val_loss: 0.2300 - val_acc: 0.9152
Epoch 112/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2456 - acc: 0.9097 - val_loss: 0.2271 - val_acc: 0.9177
Epoch 113/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2402 - acc: 0.9115 - val_loss: 0.2252 - val_acc: 0.9164
Epoch 114/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2401 - acc: 0.9108 - val_loss: 0.2239 - val_acc: 0.9177
Epoch 115/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2422 - acc: 0.9115 - val_loss: 0.2191 - val_acc: 0.9173
Epoch 116/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2395 - acc: 0.9113 - val_loss: 0.2253 - val_acc: 0.9172
Epoch 117/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2404 - acc: 0.9117 - val_loss: 0.2239 - val_acc: 0.9164
Epoch 118/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2406 - acc: 0.9111 - val_loss: 0.2234 - val_acc: 0.9191
Epoch 119/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2381 - acc: 0.9114 - val_loss: 0.2309 - val_acc: 0.9171
Epoch 120/200
188/188 [==============================] - 6s 34ms/step - loss: 0.2360 - acc: 0.9126 - val_loss: 0.2184 - val_acc: 0.9207
Epoch 121/200
188/188 [==============================] - 7s 38ms/step - loss: 0.2359 - acc: 0.9131 - val_loss: 0.2213 - val_acc: 0.9175
Epoch 122/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2346 - acc: 0.9121 - val_loss: 0.2267 - val_acc: 0.9170
Epoch 123/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2331 - acc: 0.9135 - val_loss: 0.2177 - val_acc: 0.9191
Epoch 124/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2323 - acc: 0.9147 - val_loss: 0.2204 - val_acc: 0.9208
Epoch 125/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2324 - acc: 0.9146 - val_loss: 0.2259 - val_acc: 0.9161
Epoch 126/200
188/188 [==============================] - 7s 37ms/step - loss: 0.2321 - acc: 0.9143 - val_loss: 0.2247 - val_acc: 0.9153
Epoch 127/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2294 - acc: 0.9150 - val_loss: 0.2237 - val_acc: 0.9196
Epoch 128/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2273 - acc: 0.9154 - val_loss: 0.2240 - val_acc: 0.9171
Epoch 129/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2312 - acc: 0.9159 - val_loss: 0.2189 - val_acc: 0.9223
Epoch 130/200
188/188 [==============================] - 7s 39ms/step - loss: 0.2271 - acc: 0.9158 - val_loss: 0.2205 - val_acc: 0.9194
Epoch 131/200
188/188 [==============================] - 7s 35ms/step - loss: 0.2291 - acc: 0.9152 - val_loss: 0.2238 - val_acc: 0.9188
Epoch 132/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2279 - acc: 0.9172 - val_loss: 0.2282 - val_acc: 0.9172
Epoch 133/200
188/188 [==============================] - 7s 36ms/step - loss: 0.2286 - acc: 0.9161 - val_loss: 0.2273 - val_acc: 0.9177
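Training was configured for 200 epochs but stops after epoch 133. That pattern is consistent with a patience-based EarlyStopping callback monitoring val_loss, which bottoms out around 0.2177 near epoch 123 and does not improve over the following ten epochs; the actual callback setup is not shown in this log, so the Keras snippet below is purely an illustrative sketch:

    from keras.callbacks import EarlyStopping, ModelCheckpoint

    callbacks = [
        # Stop once val_loss has not improved for 10 consecutive epochs.
        EarlyStopping(monitor='val_loss', patience=10, verbose=1),
        # Keep the best weights on disk; like model.save(), this needs h5py (see the ImportError below).
        ModelCheckpoint('best_model.h5', monitor='val_loss', save_best_only=True),
    ]
    # model.fit_generator(train_gen, steps_per_epoch=188, epochs=200,
    #                     validation_data=val_gen, callbacks=callbacks)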
188/188 [==============================] - 6s 31ms/step
2018-01-30 17-25-42 minerva >>> step keras_model saving transformer...
Traceback (most recent call last):
  File "run_minerva.py", line 46, in <module>
    action()
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "run_minerva.py", line 27, in dry_run
    pm.dry_run(sub_problem, train_mode, dev_mode, cloud_mode)
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/fashion_mnist/problem_manager.py", line 16, in dry_run
    trainer.train()
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/fashion_mnist/trainer.py", line 23, in train
    'inference': False}})
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 68, in fit_transform
    step_inputs[input_step.name] = input_step.fit_transform(data)
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 74, in fit_transform
    step_output_data = self._cached_fit_transform(step_inputs)
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 86, in _cached_fit_transform
    self.transformer.save(self.cache_filepath_step_transformer)
  File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/models/keras/models_keras.py", line 50, in save
    self.model.save(filepath)
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/keras/engine/topology.py", line 2573, in save
    save_model(self, filepath, overwrite, include_optimizer)
  File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/keras/models.py", line 60, in save_model
    raise ImportError('`save_model` requires h5py.')
ImportError: `save_model` requires h5py.
Sentry is attempting to send 1 pending error messages
Waiting up to 10 seconds
Press Ctrl-C to quit
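The model did train end to end; the crash happens only at the very last step, when Minerva's keras_model step calls self.model.save(filepath) and Keras finds that the h5py package (its HDF5 backend for save_model) is missing from the virtualenv. Installing it into the same minerva_venv with pip install h5py and rerunning the dry run should let the transformer be saved. A minimal standalone check, with a hypothetical file name and model that are not part of run_minerva.py, assuming Keras 2.x:

    import h5py                                # raises ImportError immediately if the package is still missing
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential([Dense(10, input_shape=(784,), activation='softmax')])
    model.compile(optimizer='adam', loss='categorical_crossentropy')
    model.save('h5py_smoke_test.h5')           # the same call that raised the ImportError above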