- (minerva_venv) patryk.miziula@pascal01:/mnt/ml-team/homes/patryk.miziula/Minerva/minerva$ CUDA_VISIBLE_DEVICES=2 python run_minerva.py -- dry_run --problem fashion_mnist
- 2018-01-30 17-10-26 minerva >>> starting experiment...
- Using TensorFlow backend.
- 2018-01-30 17-10-27 minerva >>> running: None
- neptune: Executing in Offline Mode.
- 2018-01-30 17-10-27 minerva >>> Saving graph to output/path_to_your_solution/class_predictions_graph.json
- 2018-01-30 17-10-28 minerva >>> step input unpacking inputs
- 2018-01-30 17-10-28 minerva >>> step input loading...
- 2018-01-30 17-10-28 minerva >>> step input transforming...
- 2018-01-30 17-10-28 minerva >>> step keras_model unpacking inputs
- Epoch 1/200
- 2018-01-30 17:10:28.435787: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
- 2018-01-30 17:10:28.435811: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
- 2018-01-30 17:10:28.435816: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
- 2018-01-30 17:10:28.435821: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
- 2018-01-30 17:10:28.435826: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
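The SSE4.x/AVX/FMA warnings above only mean the prebuilt TensorFlow pip wheel was not compiled with those CPU instruction sets; since this run executes on the GPU they are harmless. A common way to silence them in this TensorFlow 1.x line (a sketch, assuming the standard `TF_CPP_MIN_LOG_LEVEL` environment variable) is to raise the C++ log level before TensorFlow is imported:

```python
import os

# Must be set before TensorFlow is imported: 0 = all messages,
# 1 = hide INFO, 2 = hide INFO and WARNING, 3 = errors only.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"
```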
- 2018-01-30 17:10:31.195282: I tensorflow/core/common_runtime/gpu/gpu_device.cc:940] Found device 0 with properties:
- name: GeForce GTX 1070
- major: 6 minor: 1 memoryClockRate (GHz) 1.721
- pciBusID 0000:09:00.0
- Total memory: 7.92GiB
- Free memory: 7.83GiB
- 2018-01-30 17:10:31.195309: I tensorflow/core/common_runtime/gpu/gpu_device.cc:961] DMA: 0
- 2018-01-30 17:10:31.195315: I tensorflow/core/common_runtime/gpu/gpu_device.cc:971] 0: Y
- 2018-01-30 17:10:31.195322: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1030] Creating TensorFlow device (/gpu:0) -> (device: 0, name: GeForce GTX 1070, pci bus id: 0000:09:00.0)
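Note that TensorFlow reports the card as "device 0" even though the job was launched with `CUDA_VISIBLE_DEVICES=2`: the variable hides all other GPUs from the process, and the CUDA runtime renumbers the visible devices from zero. A minimal sketch of that masking logic:

```python
import os

# CUDA_VISIBLE_DEVICES lists the physical GPU indices the process may see;
# the visible devices are renumbered from 0, which is why a job pinned to
# physical GPU 2 still logs "Creating TensorFlow device (/gpu:0)".
os.environ["CUDA_VISIBLE_DEVICES"] = "2"
visible = os.environ["CUDA_VISIBLE_DEVICES"].split(",")
logical_index_of_physical_gpu_2 = visible.index("2")
```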
- 188/188 [==============================] - 11s 57ms/step - loss: 1.7692 - acc: 0.3435 - val_loss: 0.9480 - val_acc: 0.6308
- Epoch 2/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.9336 - acc: 0.6457 - val_loss: 0.6975 - val_acc: 0.7243
- Epoch 3/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.7753 - acc: 0.7008 - val_loss: 0.6136 - val_acc: 0.7528
- Epoch 4/200
- 188/188 [==============================] - 6s 35ms/step - loss: 0.7086 - acc: 0.7254 - val_loss: 0.5709 - val_acc: 0.7702
- Epoch 5/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.6619 - acc: 0.7417 - val_loss: 0.5489 - val_acc: 0.7899
- Epoch 6/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.6254 - acc: 0.7585 - val_loss: 0.5142 - val_acc: 0.7982
- Epoch 7/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.6018 - acc: 0.7690 - val_loss: 0.4885 - val_acc: 0.8122
- Epoch 8/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.5769 - acc: 0.7788 - val_loss: 0.4948 - val_acc: 0.8119
- Epoch 9/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.5581 - acc: 0.7849 - val_loss: 0.4590 - val_acc: 0.8235
- Epoch 10/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.5459 - acc: 0.7922 - val_loss: 0.4569 - val_acc: 0.8238
- Epoch 11/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.5315 - acc: 0.7969 - val_loss: 0.4402 - val_acc: 0.8330
- Epoch 12/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.5167 - acc: 0.8037 - val_loss: 0.4322 - val_acc: 0.8368
- Epoch 13/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.5012 - acc: 0.8111 - val_loss: 0.4068 - val_acc: 0.8466
- Epoch 14/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.4870 - acc: 0.8169 - val_loss: 0.4149 - val_acc: 0.8425
- Epoch 15/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.4864 - acc: 0.8185 - val_loss: 0.3933 - val_acc: 0.8515
- Epoch 16/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.4695 - acc: 0.8233 - val_loss: 0.3903 - val_acc: 0.8564
- Epoch 17/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.4577 - acc: 0.8268 - val_loss: 0.3909 - val_acc: 0.8532
- Epoch 18/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.4509 - acc: 0.8325 - val_loss: 0.3758 - val_acc: 0.8586
- Epoch 19/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.4418 - acc: 0.8334 - val_loss: 0.3662 - val_acc: 0.8627
- Epoch 20/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.4341 - acc: 0.8382 - val_loss: 0.3587 - val_acc: 0.8660
- Epoch 21/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.4288 - acc: 0.8393 - val_loss: 0.3530 - val_acc: 0.8659
- Epoch 22/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.4222 - acc: 0.8427 - val_loss: 0.3490 - val_acc: 0.8698
- Epoch 23/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.4179 - acc: 0.8468 - val_loss: 0.3451 - val_acc: 0.8716
- Epoch 24/200
- 188/188 [==============================] - 7s 39ms/step - loss: 0.4129 - acc: 0.8465 - val_loss: 0.3434 - val_acc: 0.8720
- Epoch 25/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.4054 - acc: 0.8510 - val_loss: 0.3393 - val_acc: 0.8697
- Epoch 26/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3990 - acc: 0.8528 - val_loss: 0.3281 - val_acc: 0.8763
- Epoch 27/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.3954 - acc: 0.8538 - val_loss: 0.3241 - val_acc: 0.8787
- Epoch 28/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3899 - acc: 0.8547 - val_loss: 0.3219 - val_acc: 0.8788
- Epoch 29/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3861 - acc: 0.8560 - val_loss: 0.3100 - val_acc: 0.8847
- Epoch 30/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3844 - acc: 0.8586 - val_loss: 0.3066 - val_acc: 0.8857
- Epoch 31/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3782 - acc: 0.8605 - val_loss: 0.3103 - val_acc: 0.8839
- Epoch 32/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3746 - acc: 0.8611 - val_loss: 0.3055 - val_acc: 0.8883
- Epoch 33/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3680 - acc: 0.8644 - val_loss: 0.3192 - val_acc: 0.8808
- Epoch 34/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3643 - acc: 0.8662 - val_loss: 0.3020 - val_acc: 0.8877
- Epoch 35/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3603 - acc: 0.8660 - val_loss: 0.2974 - val_acc: 0.8890
- Epoch 36/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3595 - acc: 0.8667 - val_loss: 0.3025 - val_acc: 0.8922
- Epoch 37/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.3561 - acc: 0.8690 - val_loss: 0.3020 - val_acc: 0.8896
- Epoch 38/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3503 - acc: 0.8710 - val_loss: 0.2977 - val_acc: 0.8889
- Epoch 39/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3495 - acc: 0.8709 - val_loss: 0.2985 - val_acc: 0.8900
- Epoch 40/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3476 - acc: 0.8731 - val_loss: 0.2979 - val_acc: 0.8893
- Epoch 41/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3464 - acc: 0.8717 - val_loss: 0.2962 - val_acc: 0.8886
- Epoch 42/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3396 - acc: 0.8747 - val_loss: 0.2969 - val_acc: 0.8891
- Epoch 43/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.3360 - acc: 0.8757 - val_loss: 0.2813 - val_acc: 0.8958
- Epoch 44/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3365 - acc: 0.8757 - val_loss: 0.2960 - val_acc: 0.8870
- Epoch 45/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3345 - acc: 0.8759 - val_loss: 0.2781 - val_acc: 0.8962
- Epoch 46/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3324 - acc: 0.8771 - val_loss: 0.2826 - val_acc: 0.8960
- Epoch 47/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3331 - acc: 0.8776 - val_loss: 0.2857 - val_acc: 0.8932
- Epoch 48/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3237 - acc: 0.8817 - val_loss: 0.2714 - val_acc: 0.9000
- Epoch 49/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3264 - acc: 0.8799 - val_loss: 0.2814 - val_acc: 0.8963
- Epoch 50/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3188 - acc: 0.8837 - val_loss: 0.2725 - val_acc: 0.8987
- Epoch 51/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3186 - acc: 0.8814 - val_loss: 0.2694 - val_acc: 0.9007
- Epoch 52/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3165 - acc: 0.8842 - val_loss: 0.2686 - val_acc: 0.9016
- Epoch 53/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3173 - acc: 0.8821 - val_loss: 0.2633 - val_acc: 0.9022
- Epoch 54/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3135 - acc: 0.8850 - val_loss: 0.2794 - val_acc: 0.8973
- Epoch 55/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3108 - acc: 0.8862 - val_loss: 0.2597 - val_acc: 0.9042
- Epoch 56/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3118 - acc: 0.8839 - val_loss: 0.2621 - val_acc: 0.9039
- Epoch 57/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3074 - acc: 0.8884 - val_loss: 0.2611 - val_acc: 0.9045
- Epoch 58/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.3077 - acc: 0.8866 - val_loss: 0.2611 - val_acc: 0.9025
- Epoch 59/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.3058 - acc: 0.8863 - val_loss: 0.2581 - val_acc: 0.9038
- Epoch 60/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.3051 - acc: 0.8881 - val_loss: 0.2678 - val_acc: 0.8989
- Epoch 61/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3007 - acc: 0.8891 - val_loss: 0.2560 - val_acc: 0.9077
- Epoch 62/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.3009 - acc: 0.8885 - val_loss: 0.2591 - val_acc: 0.9048
- Epoch 63/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2993 - acc: 0.8887 - val_loss: 0.2525 - val_acc: 0.9080
- Epoch 64/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2950 - acc: 0.8907 - val_loss: 0.2619 - val_acc: 0.9020
- Epoch 65/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.2945 - acc: 0.8931 - val_loss: 0.2547 - val_acc: 0.9074
- Epoch 66/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2958 - acc: 0.8910 - val_loss: 0.2525 - val_acc: 0.9053
- Epoch 67/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2943 - acc: 0.8919 - val_loss: 0.2551 - val_acc: 0.9066
- Epoch 68/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2909 - acc: 0.8932 - val_loss: 0.2546 - val_acc: 0.9061
- Epoch 69/200
- 188/188 [==============================] - 6s 34ms/step - loss: 0.2901 - acc: 0.8947 - val_loss: 0.2448 - val_acc: 0.9111
- Epoch 70/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2885 - acc: 0.8932 - val_loss: 0.2491 - val_acc: 0.9098
- Epoch 71/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2883 - acc: 0.8946 - val_loss: 0.2464 - val_acc: 0.9092
- Epoch 72/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2865 - acc: 0.8936 - val_loss: 0.2515 - val_acc: 0.9066
- Epoch 73/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2843 - acc: 0.8962 - val_loss: 0.2480 - val_acc: 0.9090
- Epoch 74/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2847 - acc: 0.8948 - val_loss: 0.2466 - val_acc: 0.9101
- Epoch 75/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2808 - acc: 0.8969 - val_loss: 0.2543 - val_acc: 0.9062
- Epoch 76/200
- 188/188 [==============================] - 6s 35ms/step - loss: 0.2824 - acc: 0.8968 - val_loss: 0.2571 - val_acc: 0.9047
- Epoch 77/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.2797 - acc: 0.8966 - val_loss: 0.2460 - val_acc: 0.9064
- Epoch 78/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2788 - acc: 0.8977 - val_loss: 0.2445 - val_acc: 0.9097
- Epoch 79/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2778 - acc: 0.8978 - val_loss: 0.2477 - val_acc: 0.9092
- Epoch 80/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2745 - acc: 0.8993 - val_loss: 0.2395 - val_acc: 0.9119
- Epoch 81/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2752 - acc: 0.8985 - val_loss: 0.2451 - val_acc: 0.9092
- Epoch 82/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2779 - acc: 0.8983 - val_loss: 0.2378 - val_acc: 0.9116
- Epoch 83/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2742 - acc: 0.8992 - val_loss: 0.2395 - val_acc: 0.9115
- Epoch 84/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2709 - acc: 0.9002 - val_loss: 0.2406 - val_acc: 0.9102
- Epoch 85/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2686 - acc: 0.9009 - val_loss: 0.2385 - val_acc: 0.9147
- Epoch 86/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2678 - acc: 0.9012 - val_loss: 0.2377 - val_acc: 0.9105
- Epoch 87/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2691 - acc: 0.9015 - val_loss: 0.2423 - val_acc: 0.9093
- Epoch 88/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2666 - acc: 0.9017 - val_loss: 0.2320 - val_acc: 0.9136
- Epoch 89/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2683 - acc: 0.9019 - val_loss: 0.2362 - val_acc: 0.9125
- Epoch 90/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.2631 - acc: 0.9038 - val_loss: 0.2301 - val_acc: 0.9161
- Epoch 91/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2623 - acc: 0.9035 - val_loss: 0.2337 - val_acc: 0.9147
- Epoch 92/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2642 - acc: 0.9026 - val_loss: 0.2348 - val_acc: 0.9127
- Epoch 93/200
- 188/188 [==============================] - 6s 34ms/step - loss: 0.2610 - acc: 0.9030 - val_loss: 0.2347 - val_acc: 0.9120
- Epoch 94/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2606 - acc: 0.9047 - val_loss: 0.2295 - val_acc: 0.9175
- Epoch 95/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2622 - acc: 0.9040 - val_loss: 0.2393 - val_acc: 0.9128
- Epoch 96/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2616 - acc: 0.9045 - val_loss: 0.2367 - val_acc: 0.9122
- Epoch 97/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2570 - acc: 0.9049 - val_loss: 0.2385 - val_acc: 0.9127
- Epoch 98/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2545 - acc: 0.9075 - val_loss: 0.2428 - val_acc: 0.9109
- Epoch 99/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.2569 - acc: 0.9059 - val_loss: 0.2325 - val_acc: 0.9143
- Epoch 100/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2559 - acc: 0.9045 - val_loss: 0.2292 - val_acc: 0.9146
- Epoch 101/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2551 - acc: 0.9059 - val_loss: 0.2260 - val_acc: 0.9170
- Epoch 102/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2521 - acc: 0.9071 - val_loss: 0.2380 - val_acc: 0.9132
- Epoch 103/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2519 - acc: 0.9066 - val_loss: 0.2305 - val_acc: 0.9164
- Epoch 104/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2528 - acc: 0.9076 - val_loss: 0.2306 - val_acc: 0.9162
- Epoch 105/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2502 - acc: 0.9067 - val_loss: 0.2290 - val_acc: 0.9157
- Epoch 106/200
- 188/188 [==============================] - 7s 39ms/step - loss: 0.2478 - acc: 0.9102 - val_loss: 0.2234 - val_acc: 0.9179
- Epoch 107/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2469 - acc: 0.9097 - val_loss: 0.2316 - val_acc: 0.9155
- Epoch 108/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2477 - acc: 0.9082 - val_loss: 0.2321 - val_acc: 0.9148
- Epoch 109/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2464 - acc: 0.9088 - val_loss: 0.2260 - val_acc: 0.9166
- Epoch 110/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2454 - acc: 0.9089 - val_loss: 0.2246 - val_acc: 0.9184
- Epoch 111/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2453 - acc: 0.9101 - val_loss: 0.2300 - val_acc: 0.9152
- Epoch 112/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2456 - acc: 0.9097 - val_loss: 0.2271 - val_acc: 0.9177
- Epoch 113/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2402 - acc: 0.9115 - val_loss: 0.2252 - val_acc: 0.9164
- Epoch 114/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2401 - acc: 0.9108 - val_loss: 0.2239 - val_acc: 0.9177
- Epoch 115/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2422 - acc: 0.9115 - val_loss: 0.2191 - val_acc: 0.9173
- Epoch 116/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2395 - acc: 0.9113 - val_loss: 0.2253 - val_acc: 0.9172
- Epoch 117/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2404 - acc: 0.9117 - val_loss: 0.2239 - val_acc: 0.9164
- Epoch 118/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2406 - acc: 0.9111 - val_loss: 0.2234 - val_acc: 0.9191
- Epoch 119/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2381 - acc: 0.9114 - val_loss: 0.2309 - val_acc: 0.9171
- Epoch 120/200
- 188/188 [==============================] - 6s 34ms/step - loss: 0.2360 - acc: 0.9126 - val_loss: 0.2184 - val_acc: 0.9207
- Epoch 121/200
- 188/188 [==============================] - 7s 38ms/step - loss: 0.2359 - acc: 0.9131 - val_loss: 0.2213 - val_acc: 0.9175
- Epoch 122/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2346 - acc: 0.9121 - val_loss: 0.2267 - val_acc: 0.9170
- Epoch 123/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2331 - acc: 0.9135 - val_loss: 0.2177 - val_acc: 0.9191
- Epoch 124/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2323 - acc: 0.9147 - val_loss: 0.2204 - val_acc: 0.9208
- Epoch 125/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2324 - acc: 0.9146 - val_loss: 0.2259 - val_acc: 0.9161
- Epoch 126/200
- 188/188 [==============================] - 7s 37ms/step - loss: 0.2321 - acc: 0.9143 - val_loss: 0.2247 - val_acc: 0.9153
- Epoch 127/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2294 - acc: 0.9150 - val_loss: 0.2237 - val_acc: 0.9196
- Epoch 128/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2273 - acc: 0.9154 - val_loss: 0.2240 - val_acc: 0.9171
- Epoch 129/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2312 - acc: 0.9159 - val_loss: 0.2189 - val_acc: 0.9223
- Epoch 130/200
- 188/188 [==============================] - 7s 39ms/step - loss: 0.2271 - acc: 0.9158 - val_loss: 0.2205 - val_acc: 0.9194
- Epoch 131/200
- 188/188 [==============================] - 7s 35ms/step - loss: 0.2291 - acc: 0.9152 - val_loss: 0.2238 - val_acc: 0.9188
- Epoch 132/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2279 - acc: 0.9172 - val_loss: 0.2282 - val_acc: 0.9172
- Epoch 133/200
- 188/188 [==============================] - 7s 36ms/step - loss: 0.2286 - acc: 0.9161 - val_loss: 0.2273 - val_acc: 0.9177
- 188/188 [==============================] - 6s 31ms/step
- 2018-01-30 17-25-42 minerva >>> step keras_model saving transformer...
- Traceback (most recent call last):
-   File "run_minerva.py", line 46, in <module>
-     action()
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 722, in __call__
-     return self.main(*args, **kwargs)
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 697, in main
-     rv = self.invoke(ctx)
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 1066, in invoke
-     return _process_result(sub_ctx.command.invoke(sub_ctx))
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 895, in invoke
-     return ctx.invoke(self.callback, **ctx.params)
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/click/core.py", line 535, in invoke
-     return callback(*args, **kwargs)
-   File "run_minerva.py", line 27, in dry_run
-     pm.dry_run(sub_problem, train_mode, dev_mode, cloud_mode)
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/fashion_mnist/problem_manager.py", line 16, in dry_run
-     trainer.train()
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/fashion_mnist/trainer.py", line 23, in train
-     'inference': False}})
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 68, in fit_transform
-     step_inputs[input_step.name] = input_step.fit_transform(data)
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 74, in fit_transform
-     step_output_data = self._cached_fit_transform(step_inputs)
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/base.py", line 86, in _cached_fit_transform
-     self.transformer.save(self.cache_filepath_step_transformer)
-   File "/mnt/ml-team/homes/patryk.miziula/Minerva/minerva/minerva/backend/models/keras/models_keras.py", line 50, in save
-     self.model.save(filepath)
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/keras/engine/topology.py", line 2573, in save
-     save_model(self, filepath, overwrite, include_optimizer)
-   File "/home/patryk.miziula/Minerva/minerva_venv/lib/python3.5/site-packages/keras/models.py", line 60, in save_model
-     raise ImportError('`save_model` requires h5py.')
- ImportError: `save_model` requires h5py.
- Sentry is attempting to send 1 pending error messages
- Waiting up to 10 seconds
- Press Ctrl-C to quit
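The run trains for roughly fifteen minutes and then crashes while saving the model, because `h5py`, which Keras needs for `model.save()`, is missing from the virtualenv; `pip install h5py` fixes it. To avoid losing a long run to a missing optional dependency, one option is to fail fast before training starts. The helper below is hypothetical (not part of Minerva or Keras), sketched on the assumption that a pre-flight check runs before `trainer.train()`:

```python
import importlib.util

def require_module(name: str) -> None:
    """Raise early if an optional dependency needed later (e.g. at
    checkpoint time) is missing, instead of failing after training."""
    if importlib.util.find_spec(name) is None:
        raise ImportError(
            f"`model.save()` requires {name}; run `pip install {name}` first."
        )

# Example pre-flight check, called before training starts:
# require_module("h5py")
```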