- WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
- Instructions for updating:
- Colocations handled automatically by placer.
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:25: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:30: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:35: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:41: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
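The four `DeprecationWarning`s above come from indexing a gensim `Word2Vec` model directly (`model[word]`); since gensim 1.0 the vectors live on the `model.wv` attribute, and the direct `__getitem__` path was removed in gensim 4.0. A minimal sketch of the accessor change, using a placeholder `model` since the paste's own code is not shown:

```python
def get_vector(model, word):
    """Look up a word vector without triggering the deprecated
    Word2Vec.__getitem__ path (removed in gensim 4.0)."""
    # gensim >= 1.0 stores the KeyedVectors on .wv; fall back to the
    # object itself if there is no such attribute.
    wv = getattr(model, "wv", model)
    return wv[word]
```

In other words, each `model['word']` in the notebook becomes `model.wv['word']`.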
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:15: UserWarning: Update your `LSTM` call to the Keras 2 API: `LSTM(100, batch_input_shape=[None, 20,..., return_sequences=False, activation="tanh", kernel_initializer="glorot_normal", recurrent_initializer="glorot_normal")`
- from ipykernel import kernelapp as app
- _________________________________________________________________
- Layer (type) Output Shape Param #
- =================================================================
- lstm_1 (LSTM) (None, 100) 80400
- =================================================================
- Total params: 80,400
- Trainable params: 80,400
- Non-trainable params: 0
- _________________________________________________________________
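The 80,400-parameter figure in the summary is consistent with an LSTM of 100 units fed 100-dimensional inputs (plausibly the 100-d word vectors from the gensim warnings, though the input size is an inference from the count, not something the log states). Each of the four gates has an input kernel, a recurrent kernel, and a bias:

```python
def lstm_param_count(units, input_dim):
    # 4 gates (input, forget, cell candidate, output), each with:
    #   input kernel:      input_dim * units
    #   recurrent kernel:  units * units
    #   bias:              units
    return 4 * (units * (input_dim + units) + units)

print(lstm_param_count(units=100, input_dim=100))  # 80400, as in the summary
```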
- WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
- Instructions for updating:
- Use tf.cast instead.
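The fix this warning asks for is mechanical: replace the deprecated `tf.to_int32(x)` with `tf.cast(x, tf.int32)`. A guarded sketch, since TensorFlow may not be installed wherever this is re-run:

```python
try:
    import tensorflow as tf

    x = tf.constant([1.7, 2.2, 3.9])
    # Deprecated:  y = tf.to_int32(x)
    # Replacement suggested by the warning:
    y = tf.cast(x, tf.int32)
except ImportError:
    y = None  # TensorFlow unavailable; nothing to demonstrate
```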
- /usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:19: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
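The `nb_epoch` warning is the Keras 1 to Keras 2 rename: `model.fit(..., nb_epoch=1000)` becomes `model.fit(..., epochs=1000)`. A tiny shim illustrating the mapping without needing Keras itself (the helper name is mine, not from the paste):

```python
def normalize_fit_kwargs(**kwargs):
    """Rename the Keras 1 `nb_epoch` keyword to the Keras 2 `epochs`."""
    if "nb_epoch" in kwargs:
        kwargs["epochs"] = kwargs.pop("nb_epoch")
    return kwargs

print(normalize_fit_kwargs(nb_epoch=1000, batch_size=32))
# {'batch_size': 32, 'epochs': 1000}
```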
- Train on 525 samples, validate on 132 samples
- Epoch 1/1000
- 525/525 [==============================] - 1s 2ms/step - loss: -0.1752 - acc: 0.0152 - val_loss: -0.2922 - val_acc: 0.0379
- Epoch 2/1000
- 525/525 [==============================] - 0s 809us/step - loss: -0.3396 - acc: 0.0305 - val_loss: -0.3316 - val_acc: 0.0152
- Epoch 3/1000
- 525/525 [==============================] - 0s 865us/step - loss: -0.3962 - acc: 0.0419 - val_loss: -0.3498 - val_acc: 0.0303
- Epoch 4/1000
- 525/525 [==============================] - 0s 823us/step - loss: -0.4471 - acc: 0.0552 - val_loss: -0.3619 - val_acc: 0.0303
- Epoch 5/1000
- 525/525 [==============================] - 0s 861us/step - loss: -0.5016 - acc: 0.0838 - val_loss: -0.3770 - val_acc: 0.0455
- Epoch 6/1000
- 525/525 [==============================] - 0s 812us/step - loss: -0.5497 - acc: 0.1219 - val_loss: -0.3861 - val_acc: 0.0606
- Epoch 7/1000
- 525/525 [==============================] - 0s 817us/step - loss: -0.5891 - acc: 0.1486 - val_loss: -0.4022 - val_acc: 0.0530
- Epoch 8/1000
- 525/525 [==============================] - 0s 834us/step - loss: -0.6213 - acc: 0.1638 - val_loss: -0.4150 - val_acc: 0.0909
- Epoch 9/1000
- 525/525 [==============================] - 0s 819us/step - loss: -0.6480 - acc: 0.1943 - val_loss: -0.4160 - val_acc: 0.1061
- Epoch 10/1000
- 525/525 [==============================] - 0s 823us/step - loss: -0.6703 - acc: 0.1905 - val_loss: -0.4312 - val_acc: 0.0833
- Epoch 11/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.6908 - acc: 0.1962 - val_loss: -0.4343 - val_acc: 0.1061
- Epoch 12/1000
- 525/525 [==============================] - 0s 821us/step - loss: -0.7066 - acc: 0.2095 - val_loss: -0.4369 - val_acc: 0.0909
- Epoch 13/1000
- 525/525 [==============================] - 0s 820us/step - loss: -0.7223 - acc: 0.2267 - val_loss: -0.4420 - val_acc: 0.0985
- Epoch 14/1000
- 525/525 [==============================] - 0s 845us/step - loss: -0.7356 - acc: 0.2286 - val_loss: -0.4470 - val_acc: 0.1212
- Epoch 15/1000
- 525/525 [==============================] - 0s 850us/step - loss: -0.7505 - acc: 0.2743 - val_loss: -0.4537 - val_acc: 0.1061
- Epoch 16/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.7649 - acc: 0.2819 - val_loss: -0.4545 - val_acc: 0.1136
- Epoch 17/1000
- 525/525 [==============================] - 0s 827us/step - loss: -0.7760 - acc: 0.2933 - val_loss: -0.4621 - val_acc: 0.1439
- Epoch 18/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.7861 - acc: 0.3086 - val_loss: -0.4680 - val_acc: 0.1364
- Epoch 19/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.7956 - acc: 0.3219 - val_loss: -0.4667 - val_acc: 0.1818
- Epoch 20/1000
- 525/525 [==============================] - 0s 824us/step - loss: -0.8034 - acc: 0.3257 - val_loss: -0.4669 - val_acc: 0.1515
- Epoch 21/1000
- 525/525 [==============================] - 0s 816us/step - loss: -0.8100 - acc: 0.3086 - val_loss: -0.4720 - val_acc: 0.1591
- Epoch 22/1000
- 525/525 [==============================] - 0s 825us/step - loss: -0.8169 - acc: 0.3219 - val_loss: -0.4731 - val_acc: 0.1742
- Epoch 23/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.8239 - acc: 0.3257 - val_loss: -0.4749 - val_acc: 0.1894
- Epoch 24/1000
- 525/525 [==============================] - 0s 838us/step - loss: -0.8297 - acc: 0.3486 - val_loss: -0.4812 - val_acc: 0.1818
- Epoch 25/1000
- 525/525 [==============================] - 0s 807us/step - loss: -0.8371 - acc: 0.3867 - val_loss: -0.4849 - val_acc: 0.2121
- Epoch 26/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.8420 - acc: 0.3695 - val_loss: -0.4840 - val_acc: 0.1894
- Epoch 27/1000
- 525/525 [==============================] - 0s 815us/step - loss: -0.8468 - acc: 0.3886 - val_loss: -0.4840 - val_acc: 0.1818
- Epoch 28/1000
- 525/525 [==============================] - 0s 805us/step - loss: -0.8525 - acc: 0.3848 - val_loss: -0.4874 - val_acc: 0.1742
- Epoch 29/1000
- 525/525 [==============================] - 0s 881us/step - loss: -0.8564 - acc: 0.3981 - val_loss: -0.4845 - val_acc: 0.1894
- Epoch 30/1000
- 525/525 [==============================] - 0s 820us/step - loss: -0.8618 - acc: 0.3981 - val_loss: -0.4905 - val_acc: 0.1970
- Epoch 31/1000
- 525/525 [==============================] - 0s 820us/step - loss: -0.8660 - acc: 0.4210 - val_loss: -0.4903 - val_acc: 0.1970
- Epoch 32/1000
- 525/525 [==============================] - 0s 806us/step - loss: -0.8693 - acc: 0.4133 - val_loss: -0.4900 - val_acc: 0.1818
- Epoch 33/1000
- 525/525 [==============================] - 0s 811us/step - loss: -0.8741 - acc: 0.4210 - val_loss: -0.4906 - val_acc: 0.2121
- Epoch 34/1000
- 525/525 [==============================] - 0s 817us/step - loss: -0.8780 - acc: 0.4267 - val_loss: -0.4880 - val_acc: 0.2045
- Epoch 35/1000
- 525/525 [==============================] - 0s 813us/step - loss: -0.8810 - acc: 0.4362 - val_loss: -0.4890 - val_acc: 0.1970
- Epoch 36/1000
- 525/525 [==============================] - 0s 852us/step - loss: -0.8838 - acc: 0.4419 - val_loss: -0.4950 - val_acc: 0.2045
- Epoch 37/1000
- 525/525 [==============================] - 0s 848us/step - loss: -0.8864 - acc: 0.4362 - val_loss: -0.4952 - val_acc: 0.2273
- Epoch 38/1000
- 525/525 [==============================] - 0s 826us/step - loss: -0.8896 - acc: 0.4438 - val_loss: -0.4927 - val_acc: 0.1818
- Epoch 39/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.8930 - acc: 0.4648 - val_loss: -0.4901 - val_acc: 0.1894
- Epoch 40/1000
- 525/525 [==============================] - 0s 830us/step - loss: -0.8962 - acc: 0.4686 - val_loss: -0.4978 - val_acc: 0.2197
- Epoch 41/1000
- 525/525 [==============================] - 0s 838us/step - loss: -0.8994 - acc: 0.4610 - val_loss: -0.4936 - val_acc: 0.2121
- Epoch 42/1000
- 525/525 [==============================] - 0s 806us/step - loss: -0.9024 - acc: 0.4667 - val_loss: -0.4930 - val_acc: 0.2121
- Epoch 43/1000
- 525/525 [==============================] - 0s 839us/step - loss: -0.9042 - acc: 0.4876 - val_loss: -0.5007 - val_acc: 0.2121
- Epoch 44/1000
- 525/525 [==============================] - 0s 830us/step - loss: -0.9059 - acc: 0.4781 - val_loss: -0.4916 - val_acc: 0.2045
- Epoch 45/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.9083 - acc: 0.4857 - val_loss: -0.4952 - val_acc: 0.1970
- Epoch 46/1000
- 525/525 [==============================] - 0s 811us/step - loss: -0.9108 - acc: 0.4914 - val_loss: -0.4914 - val_acc: 0.2197
- Epoch 47/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.9133 - acc: 0.5067 - val_loss: -0.4952 - val_acc: 0.2424
- Epoch 48/1000
- 525/525 [==============================] - 0s 801us/step - loss: -0.9151 - acc: 0.5200 - val_loss: -0.4952 - val_acc: 0.2121
- Epoch 49/1000
- 525/525 [==============================] - 0s 817us/step - loss: -0.9174 - acc: 0.5048 - val_loss: -0.4953 - val_acc: 0.2348
- Epoch 50/1000
- 525/525 [==============================] - 0s 809us/step - loss: -0.9193 - acc: 0.5067 - val_loss: -0.4946 - val_acc: 0.2500
- Epoch 51/1000
- 525/525 [==============================] - 0s 813us/step - loss: -0.9204 - acc: 0.5143 - val_loss: -0.4926 - val_acc: 0.2197
- Epoch 52/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.9216 - acc: 0.5181 - val_loss: -0.4962 - val_acc: 0.2348
- Epoch 53/1000
- 525/525 [==============================] - 0s 882us/step - loss: -0.9230 - acc: 0.5086 - val_loss: -0.4956 - val_acc: 0.1970
- Epoch 54/1000
- 525/525 [==============================] - 0s 834us/step - loss: -0.9250 - acc: 0.5619 - val_loss: -0.4955 - val_acc: 0.1970
- Epoch 55/1000
- 525/525 [==============================] - 0s 819us/step - loss: -0.9276 - acc: 0.5429 - val_loss: -0.4960 - val_acc: 0.2121
- Epoch 56/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.9296 - acc: 0.5486 - val_loss: -0.4982 - val_acc: 0.2121
- Epoch 57/1000
- 525/525 [==============================] - 0s 808us/step - loss: -0.9308 - acc: 0.5181 - val_loss: -0.4929 - val_acc: 0.2273
- Epoch 58/1000
- 525/525 [==============================] - 0s 825us/step - loss: -0.9314 - acc: 0.5371 - val_loss: -0.4970 - val_acc: 0.1970
- Epoch 59/1000
- 525/525 [==============================] - 0s 846us/step - loss: -0.9319 - acc: 0.5410 - val_loss: -0.4984 - val_acc: 0.1970
- Epoch 60/1000
- 525/525 [==============================] - 0s 816us/step - loss: -0.9331 - acc: 0.5867 - val_loss: -0.4995 - val_acc: 0.2348
- Epoch 61/1000
- 525/525 [==============================] - 0s 823us/step - loss: -0.9343 - acc: 0.5371 - val_loss: -0.4969 - val_acc: 0.2045
- Epoch 62/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.9355 - acc: 0.5905 - val_loss: -0.4987 - val_acc: 0.2273
- Epoch 63/1000
- 525/525 [==============================] - 0s 846us/step - loss: -0.9372 - acc: 0.5562 - val_loss: -0.4985 - val_acc: 0.1970
- Epoch 64/1000
- 525/525 [==============================] - 0s 806us/step - loss: -0.9384 - acc: 0.5638 - val_loss: -0.5022 - val_acc: 0.2424
- Epoch 65/1000
- 525/525 [==============================] - 0s 812us/step - loss: -0.9402 - acc: 0.5429 - val_loss: -0.5011 - val_acc: 0.2197
- Epoch 66/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.9412 - acc: 0.6000 - val_loss: -0.4994 - val_acc: 0.2197
- Epoch 67/1000
- 525/525 [==============================] - 0s 809us/step - loss: -0.9431 - acc: 0.5543 - val_loss: -0.4976 - val_acc: 0.2273
- Epoch 68/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.9447 - acc: 0.5943 - val_loss: -0.4995 - val_acc: 0.2273
- Epoch 69/1000
- 525/525 [==============================] - 0s 809us/step - loss: -0.9453 - acc: 0.5829 - val_loss: -0.5021 - val_acc: 0.2273
- Epoch 70/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.9455 - acc: 0.5848 - val_loss: -0.5001 - val_acc: 0.2348
- Epoch 71/1000
- 525/525 [==============================] - 0s 810us/step - loss: -0.9471 - acc: 0.6095 - val_loss: -0.5016 - val_acc: 0.2348
- Epoch 72/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.9481 - acc: 0.5924 - val_loss: -0.5008 - val_acc: 0.2273
- Epoch 73/1000
- 525/525 [==============================] - 0s 812us/step - loss: -0.9474 - acc: 0.5810 - val_loss: -0.5031 - val_acc: 0.2500
- Epoch 74/1000
- 525/525 [==============================] - 0s 816us/step - loss: -0.9473 - acc: 0.5924 - val_loss: -0.4984 - val_acc: 0.2045
- Epoch 75/1000
- 525/525 [==============================] - 0s 828us/step - loss: -0.9487 - acc: 0.6133 - val_loss: -0.5000 - val_acc: 0.2121
- Epoch 76/1000
- 525/525 [==============================] - 0s 855us/step - loss: -0.9500 - acc: 0.6190 - val_loss: -0.5031 - val_acc: 0.2045
- Epoch 77/1000
- 525/525 [==============================] - 0s 848us/step - loss: -0.9515 - acc: 0.6076 - val_loss: -0.5015 - val_acc: 0.2273
- Epoch 78/1000
- 525/525 [==============================] - 0s 883us/step - loss: -0.9529 - acc: 0.6057 - val_loss: -0.5018 - val_acc: 0.2273
- Epoch 79/1000
- 525/525 [==============================] - 0s 904us/step - loss: -0.9530 - acc: 0.6381 - val_loss: -0.5058 - val_acc: 0.2121
- Epoch 80/1000
- 525/525 [==============================] - 0s 902us/step - loss: -0.9538 - acc: 0.6267 - val_loss: -0.5034 - val_acc: 0.2424
- Epoch 81/1000
- 525/525 [==============================] - 0s 895us/step - loss: -0.9552 - acc: 0.6229 - val_loss: -0.5031 - val_acc: 0.2500
- Epoch 82/1000
- 525/525 [==============================] - 0s 900us/step - loss: -0.9558 - acc: 0.6229 - val_loss: -0.5052 - val_acc: 0.2424
- Epoch 83/1000
- 525/525 [==============================] - 0s 886us/step - loss: -0.9558 - acc: 0.6419 - val_loss: -0.5037 - val_acc: 0.2348
- Epoch 84/1000
- 525/525 [==============================] - 0s 895us/step - loss: -0.9557 - acc: 0.6229 - val_loss: -0.5028 - val_acc: 0.2348
- Epoch 85/1000
- 525/525 [==============================] - 0s 894us/step - loss: -0.9562 - acc: 0.6229 - val_loss: -0.5002 - val_acc: 0.2424
- Epoch 86/1000
- 525/525 [==============================] - 0s 888us/step - loss: -0.9576 - acc: 0.6305 - val_loss: -0.5025 - val_acc: 0.2424
- Epoch 87/1000
- 525/525 [==============================] - 0s 897us/step - loss: -0.9586 - acc: 0.6248 - val_loss: -0.5015 - val_acc: 0.2273
- Epoch 88/1000
- 525/525 [==============================] - 0s 892us/step - loss: -0.9589 - acc: 0.6400 - val_loss: -0.5023 - val_acc: 0.2424
- Epoch 89/1000
- 525/525 [==============================] - 0s 880us/step - loss: -0.9602 - acc: 0.6438 - val_loss: -0.5051 - val_acc: 0.2348
- Epoch 90/1000
- 525/525 [==============================] - 0s 891us/step - loss: -0.9606 - acc: 0.6305 - val_loss: -0.5023 - val_acc: 0.2273
- Epoch 91/1000
- 525/525 [==============================] - 0s 893us/step - loss: -0.9609 - acc: 0.6495 - val_loss: -0.5009 - val_acc: 0.2348
- Epoch 92/1000
- 525/525 [==============================] - 0s 892us/step - loss: -0.9620 - acc: 0.6381 - val_loss: -0.5049 - val_acc: 0.2273
- Epoch 93/1000
- 525/525 [==============================] - 0s 890us/step - loss: -0.9620 - acc: 0.6743 - val_loss: -0.5006 - val_acc: 0.2424
- Epoch 94/1000
- 525/525 [==============================] - 0s 900us/step - loss: -0.9621 - acc: 0.6514 - val_loss: -0.5047 - val_acc: 0.2576
- Epoch 95/1000
- 525/525 [==============================] - 0s 909us/step - loss: -0.9623 - acc: 0.6514 - val_loss: -0.5032 - val_acc: 0.2424
- Epoch 96/1000
- 525/525 [==============================] - 0s 901us/step - loss: -0.9629 - acc: 0.6476 - val_loss: -0.5035 - val_acc: 0.2197
- Epoch 97/1000
- 525/525 [==============================] - 0s 897us/step - loss: -0.9632 - acc: 0.6438 - val_loss: -0.5004 - val_acc: 0.2424
- Epoch 98/1000
- 525/525 [==============================] - 0s 917us/step - loss: -0.9630 - acc: 0.6648 - val_loss: -0.5013 - val_acc: 0.2348
- Epoch 99/1000
- 525/525 [==============================] - 1s 957us/step - loss: -0.9636 - acc: 0.6495 - val_loss: -0.5029 - val_acc: 0.2576
- Epoch 100/1000
- 525/525 [==============================] - 0s 839us/step - loss: -0.9641 - acc: 0.6629 - val_loss: -0.5036 - val_acc: 0.2500
- Epoch 101/1000
- 525/525 [==============================] - 0s 816us/step - loss: -0.9646 - acc: 0.6457 - val_loss: -0.5010 - val_acc: 0.2424
- ... (epochs 102-949 omitted) ...
- Epoch 950/1000
- 525/525 [==============================] - 0s 819us/step - loss: -0.9962 - acc: 0.9086 - val_loss: -0.4933 - val_acc: 0.2045
- Epoch 951/1000
- 525/525 [==============================] - 0s 823us/step - loss: -0.9962 - acc: 0.9067 - val_loss: -0.4929 - val_acc: 0.2197
- Epoch 952/1000
- 525/525 [==============================] - 0s 831us/step - loss: -0.9962 - acc: 0.9105 - val_loss: -0.4920 - val_acc: 0.1970
- Epoch 953/1000
- 525/525 [==============================] - 0s 825us/step - loss: -0.9961 - acc: 0.9257 - val_loss: -0.4938 - val_acc: 0.2045
- Epoch 954/1000
- 525/525 [==============================] - 0s 835us/step - loss: -0.9961 - acc: 0.8971 - val_loss: -0.4967 - val_acc: 0.2045
- Epoch 955/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.9958 - acc: 0.8971 - val_loss: -0.4952 - val_acc: 0.2197
- Epoch 956/1000
- 525/525 [==============================] - 0s 820us/step - loss: -0.9954 - acc: 0.8724 - val_loss: -0.4947 - val_acc: 0.1894
- Epoch 957/1000
- 525/525 [==============================] - 0s 840us/step - loss: -0.9955 - acc: 0.9010 - val_loss: -0.4921 - val_acc: 0.1970
- Epoch 958/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.9954 - acc: 0.8914 - val_loss: -0.4961 - val_acc: 0.2121
- Epoch 959/1000
- 525/525 [==============================] - 0s 824us/step - loss: -0.9954 - acc: 0.8895 - val_loss: -0.4964 - val_acc: 0.2197
- Epoch 960/1000
- 525/525 [==============================] - 0s 818us/step - loss: -0.9952 - acc: 0.8743 - val_loss: -0.4969 - val_acc: 0.2197
- Epoch 961/1000
- 525/525 [==============================] - 0s 830us/step - loss: -0.9955 - acc: 0.9124 - val_loss: -0.4980 - val_acc: 0.2045
- Epoch 962/1000
- 525/525 [==============================] - 0s 849us/step - loss: -0.9955 - acc: 0.8667 - val_loss: -0.4948 - val_acc: 0.2273
- Epoch 963/1000
- 525/525 [==============================] - 0s 824us/step - loss: -0.9957 - acc: 0.9029 - val_loss: -0.4941 - val_acc: 0.2348
- Epoch 964/1000
- 525/525 [==============================] - 0s 827us/step - loss: -0.9955 - acc: 0.8838 - val_loss: -0.4932 - val_acc: 0.2121
- Epoch 965/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.9956 - acc: 0.8952 - val_loss: -0.4937 - val_acc: 0.2121
- Epoch 966/1000
- 525/525 [==============================] - 0s 823us/step - loss: -0.9958 - acc: 0.8914 - val_loss: -0.4956 - val_acc: 0.2197
- Epoch 967/1000
- 525/525 [==============================] - 0s 807us/step - loss: -0.9958 - acc: 0.9029 - val_loss: -0.4967 - val_acc: 0.2045
- Epoch 968/1000
- 525/525 [==============================] - 0s 846us/step - loss: -0.9957 - acc: 0.8952 - val_loss: -0.4952 - val_acc: 0.1970
- Epoch 969/1000
- 525/525 [==============================] - 0s 842us/step - loss: -0.9957 - acc: 0.9048 - val_loss: -0.4970 - val_acc: 0.1970
- Epoch 970/1000
- 525/525 [==============================] - 0s 836us/step - loss: -0.9958 - acc: 0.8914 - val_loss: -0.4972 - val_acc: 0.2045
- Epoch 971/1000
- 525/525 [==============================] - 0s 915us/step - loss: -0.9958 - acc: 0.8952 - val_loss: -0.4960 - val_acc: 0.2045
- Epoch 972/1000
- 525/525 [==============================] - 0s 816us/step - loss: -0.9958 - acc: 0.9048 - val_loss: -0.4951 - val_acc: 0.2273
- Epoch 973/1000
- 525/525 [==============================] - 0s 870us/step - loss: -0.9957 - acc: 0.9124 - val_loss: -0.4985 - val_acc: 0.2121
- Epoch 974/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.9958 - acc: 0.9010 - val_loss: -0.4977 - val_acc: 0.2348
- Epoch 975/1000
- 525/525 [==============================] - 0s 838us/step - loss: -0.9959 - acc: 0.8990 - val_loss: -0.4998 - val_acc: 0.2197
- Epoch 976/1000
- 525/525 [==============================] - 0s 813us/step - loss: -0.9957 - acc: 0.9143 - val_loss: -0.4982 - val_acc: 0.2197
- Epoch 977/1000
- 525/525 [==============================] - 0s 845us/step - loss: -0.9954 - acc: 0.9086 - val_loss: -0.4983 - val_acc: 0.2121
- Epoch 978/1000
- 525/525 [==============================] - 0s 813us/step - loss: -0.9953 - acc: 0.8914 - val_loss: -0.4982 - val_acc: 0.2121
- Epoch 979/1000
- 525/525 [==============================] - 0s 832us/step - loss: -0.9954 - acc: 0.8838 - val_loss: -0.4958 - val_acc: 0.2121
- Epoch 980/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.9954 - acc: 0.8838 - val_loss: -0.4953 - val_acc: 0.1970
- Epoch 981/1000
- 525/525 [==============================] - 0s 812us/step - loss: -0.9954 - acc: 0.8819 - val_loss: -0.4939 - val_acc: 0.2045
- Epoch 982/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.9954 - acc: 0.8857 - val_loss: -0.4960 - val_acc: 0.2197
- Epoch 983/1000
- 525/525 [==============================] - 0s 814us/step - loss: -0.9953 - acc: 0.8781 - val_loss: -0.4950 - val_acc: 0.2121
- Epoch 984/1000
- 525/525 [==============================] - 0s 825us/step - loss: -0.9953 - acc: 0.9219 - val_loss: -0.4943 - val_acc: 0.1667
- Epoch 985/1000
- 525/525 [==============================] - 0s 843us/step - loss: -0.9954 - acc: 0.8590 - val_loss: -0.4965 - val_acc: 0.2045
- Epoch 986/1000
- 525/525 [==============================] - 0s 826us/step - loss: -0.9955 - acc: 0.8971 - val_loss: -0.4930 - val_acc: 0.1742
- Epoch 987/1000
- 525/525 [==============================] - 0s 813us/step - loss: -0.9956 - acc: 0.8819 - val_loss: -0.4941 - val_acc: 0.2348
- Epoch 988/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.9956 - acc: 0.8838 - val_loss: -0.4929 - val_acc: 0.2045
- Epoch 989/1000
- 525/525 [==============================] - 0s 810us/step - loss: -0.9957 - acc: 0.8705 - val_loss: -0.4955 - val_acc: 0.2273
- Epoch 990/1000
- 525/525 [==============================] - 0s 833us/step - loss: -0.9959 - acc: 0.9105 - val_loss: -0.4947 - val_acc: 0.1970
- Epoch 991/1000
- 525/525 [==============================] - 0s 843us/step - loss: -0.9959 - acc: 0.9029 - val_loss: -0.4964 - val_acc: 0.2121
- Epoch 992/1000
- 525/525 [==============================] - 0s 827us/step - loss: -0.9959 - acc: 0.8952 - val_loss: -0.4954 - val_acc: 0.1970
- Epoch 993/1000
- 525/525 [==============================] - 0s 836us/step - loss: -0.9959 - acc: 0.8819 - val_loss: -0.4941 - val_acc: 0.2121
- Epoch 994/1000
- 525/525 [==============================] - 0s 822us/step - loss: -0.9960 - acc: 0.9162 - val_loss: -0.4961 - val_acc: 0.2348
- Epoch 995/1000
- 525/525 [==============================] - 0s 842us/step - loss: -0.9963 - acc: 0.8933 - val_loss: -0.4968 - val_acc: 0.1970
- Epoch 996/1000
- 525/525 [==============================] - 0s 865us/step - loss: -0.9965 - acc: 0.9257 - val_loss: -0.4966 - val_acc: 0.2121
- Epoch 997/1000
- 525/525 [==============================] - 0s 853us/step - loss: -0.9965 - acc: 0.9124 - val_loss: -0.4942 - val_acc: 0.2348
- Epoch 998/1000
- 525/525 [==============================] - 0s 815us/step - loss: -0.9965 - acc: 0.8990 - val_loss: -0.4960 - val_acc: 0.2273
- Epoch 999/1000
- 525/525 [==============================] - 0s 826us/step - loss: -0.9961 - acc: 0.9143 - val_loss: -0.4948 - val_acc: 0.2348
- Epoch 1000/1000
- 525/525 [==============================] - 0s 832us/step - loss: -0.9952 - acc: 0.8686 - val_loss: -0.4946 - val_acc: 0.2273
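Two things in the final epochs are worth flagging. First, training accuracy (roughly 0.87-0.93) sits far above validation accuracy (roughly 0.17-0.23), a large generalization gap that on a split this small (525 train / 132 validation samples) usually signals overfitting. Second, the loss is negative and saturates near -1.0, which a correctly matched cross-entropy or MSE loss cannot produce; negative losses typically mean the loss function does not fit the targets (for example, `binary_crossentropy` applied to targets outside [0, 1]). The gap from the last line, using the log's own numbers:

```python
train_acc, val_acc = 0.8686, 0.2273  # final epoch, from the log above
gap = train_acc - val_acc
print(round(gap, 4))  # 0.6413
```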