Pasted by a guest on Jun 25th, 2019
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/op_def_library.py:263: colocate_with (from tensorflow.python.framework.ops) is deprecated and will be removed in a future version.
Instructions for updating:
Colocations handled automatically by placer.
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:25: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:30: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:35: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:41: DeprecationWarning: Call to deprecated `__getitem__` (Method will be removed in 4.0.0, use self.wv.__getitem__() instead).
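The four DeprecationWarnings above come from gensim: indexing a Word2Vec model directly (`model[word]`) was removed in gensim 4.0.0 in favor of going through the `.wv` (KeyedVectors) attribute, exactly as the warning text says. A minimal sketch of the migration (the `get_vector` helper and the `model` name are illustrative, not taken from the log):

```python
def get_vector(model, word):
    """Fetch a word vector via the KeyedVectors attribute.

    Old (deprecated, removed in gensim 4.0.0):  model[word]
    New:                                        model.wv[word]
    """
    return model.wv[word]
```

Updating the four call sites at ipykernel_launcher.py lines 25, 30, 35, and 41 this way silences all four warnings.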
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:15: UserWarning: Update your `LSTM` call to the Keras 2 API: `LSTM(100, batch_input_shape=[None, 20,..., return_sequences=False, activation="tanh", kernel_initializer="glorot_normal", recurrent_initializer="glorot_normal")`
  from ipykernel import kernelapp as app
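The UserWarning above spells out the Keras 2 form of the layer constructor. A sketch of the rename, keeping the warning's own truncation of `batch_input_shape` (the Keras 1 names `init`/`inner_init` below are an assumption about what the original code used; only the Keras 2 names appear in the log):

```python
# Keras 1 style (triggers the warning):
#   LSTM(100, batch_input_shape=[None, 20, ...], return_sequences=False,
#        activation="tanh", init="glorot_normal", inner_init="glorot_normal")
# Keras 2 style (the form the warning suggests):
#   LSTM(100, batch_input_shape=[None, 20, ...], return_sequences=False,
#        activation="tanh", kernel_initializer="glorot_normal",
#        recurrent_initializer="glorot_normal")
```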
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_1 (LSTM)                (None, 100)               80400
=================================================================
Total params: 80,400
Trainable params: 80,400
Non-trainable params: 0
_________________________________________________________________
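The 80,400 parameters in the summary are consistent with an LSTM of 100 units over 100-dimensional inputs: each of the 4 gates has an input kernel, a recurrent kernel, and a bias. The input dimension itself is cut off in the warning above, so the value below is inferred from the total rather than read from the log:

```python
# LSTM parameter count: 4 gates x (input kernel + recurrent kernel + bias)
units = 100
input_dim = 100  # assumption: inferred from the 80,400 total, not printed in the log
params = 4 * (input_dim * units + units * units + units)
print(params)  # 80400
```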
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/math_ops.py:3066: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.cast instead.
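The replacement the warning asks for, sketched on a placeholder tensor `x` (this particular call is made inside Keras's metric code here, so it goes away on upgrade; the pattern applies to any user code still on the old op):

```python
# Deprecated:
#   y = tf.to_int32(x)
# Replacement:
#   y = tf.cast(x, tf.int32)
```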
/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:19: UserWarning: The `nb_epoch` argument in `fit` has been renamed `epochs`.
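Likewise, the `nb_epoch` warning points at a Keras 1 argument name; renaming it silences the warning (`X`, `y`, and the remaining `fit` arguments are placeholders, since the original call is not shown in the log):

```python
# Keras 1 style (triggers the warning):
#   model.fit(X, y, nb_epoch=1000, ...)
# Keras 2 style:
#   model.fit(X, y, epochs=1000, ...)
```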
Train on 525 samples, validate on 132 samples
Epoch 1/1000
525/525 [==============================] - 1s 2ms/step - loss: -0.1752 - acc: 0.0152 - val_loss: -0.2922 - val_acc: 0.0379
Epoch 2/1000
525/525 [==============================] - 0s 809us/step - loss: -0.3396 - acc: 0.0305 - val_loss: -0.3316 - val_acc: 0.0152
Epoch 3/1000
525/525 [==============================] - 0s 865us/step - loss: -0.3962 - acc: 0.0419 - val_loss: -0.3498 - val_acc: 0.0303
Epoch 4/1000
525/525 [==============================] - 0s 823us/step - loss: -0.4471 - acc: 0.0552 - val_loss: -0.3619 - val_acc: 0.0303
Epoch 5/1000
525/525 [==============================] - 0s 861us/step - loss: -0.5016 - acc: 0.0838 - val_loss: -0.3770 - val_acc: 0.0455
Epoch 6/1000
525/525 [==============================] - 0s 812us/step - loss: -0.5497 - acc: 0.1219 - val_loss: -0.3861 - val_acc: 0.0606
Epoch 7/1000
525/525 [==============================] - 0s 817us/step - loss: -0.5891 - acc: 0.1486 - val_loss: -0.4022 - val_acc: 0.0530
Epoch 8/1000
525/525 [==============================] - 0s 834us/step - loss: -0.6213 - acc: 0.1638 - val_loss: -0.4150 - val_acc: 0.0909
Epoch 9/1000
525/525 [==============================] - 0s 819us/step - loss: -0.6480 - acc: 0.1943 - val_loss: -0.4160 - val_acc: 0.1061
Epoch 10/1000
525/525 [==============================] - 0s 823us/step - loss: -0.6703 - acc: 0.1905 - val_loss: -0.4312 - val_acc: 0.0833
Epoch 11/1000
525/525 [==============================] - 0s 822us/step - loss: -0.6908 - acc: 0.1962 - val_loss: -0.4343 - val_acc: 0.1061
Epoch 12/1000
525/525 [==============================] - 0s 821us/step - loss: -0.7066 - acc: 0.2095 - val_loss: -0.4369 - val_acc: 0.0909
Epoch 13/1000
525/525 [==============================] - 0s 820us/step - loss: -0.7223 - acc: 0.2267 - val_loss: -0.4420 - val_acc: 0.0985
Epoch 14/1000
525/525 [==============================] - 0s 845us/step - loss: -0.7356 - acc: 0.2286 - val_loss: -0.4470 - val_acc: 0.1212
Epoch 15/1000
525/525 [==============================] - 0s 850us/step - loss: -0.7505 - acc: 0.2743 - val_loss: -0.4537 - val_acc: 0.1061
Epoch 16/1000
525/525 [==============================] - 0s 814us/step - loss: -0.7649 - acc: 0.2819 - val_loss: -0.4545 - val_acc: 0.1136
Epoch 17/1000
525/525 [==============================] - 0s 827us/step - loss: -0.7760 - acc: 0.2933 - val_loss: -0.4621 - val_acc: 0.1439
Epoch 18/1000
525/525 [==============================] - 0s 828us/step - loss: -0.7861 - acc: 0.3086 - val_loss: -0.4680 - val_acc: 0.1364
Epoch 19/1000
525/525 [==============================] - 0s 818us/step - loss: -0.7956 - acc: 0.3219 - val_loss: -0.4667 - val_acc: 0.1818
Epoch 20/1000
525/525 [==============================] - 0s 824us/step - loss: -0.8034 - acc: 0.3257 - val_loss: -0.4669 - val_acc: 0.1515
Epoch 21/1000
525/525 [==============================] - 0s 816us/step - loss: -0.8100 - acc: 0.3086 - val_loss: -0.4720 - val_acc: 0.1591
Epoch 22/1000
525/525 [==============================] - 0s 825us/step - loss: -0.8169 - acc: 0.3219 - val_loss: -0.4731 - val_acc: 0.1742
Epoch 23/1000
525/525 [==============================] - 0s 818us/step - loss: -0.8239 - acc: 0.3257 - val_loss: -0.4749 - val_acc: 0.1894
Epoch 24/1000
525/525 [==============================] - 0s 838us/step - loss: -0.8297 - acc: 0.3486 - val_loss: -0.4812 - val_acc: 0.1818
Epoch 25/1000
525/525 [==============================] - 0s 807us/step - loss: -0.8371 - acc: 0.3867 - val_loss: -0.4849 - val_acc: 0.2121
Epoch 26/1000
525/525 [==============================] - 0s 814us/step - loss: -0.8420 - acc: 0.3695 - val_loss: -0.4840 - val_acc: 0.1894
Epoch 27/1000
525/525 [==============================] - 0s 815us/step - loss: -0.8468 - acc: 0.3886 - val_loss: -0.4840 - val_acc: 0.1818
Epoch 28/1000
525/525 [==============================] - 0s 805us/step - loss: -0.8525 - acc: 0.3848 - val_loss: -0.4874 - val_acc: 0.1742
Epoch 29/1000
525/525 [==============================] - 0s 881us/step - loss: -0.8564 - acc: 0.3981 - val_loss: -0.4845 - val_acc: 0.1894
Epoch 30/1000
525/525 [==============================] - 0s 820us/step - loss: -0.8618 - acc: 0.3981 - val_loss: -0.4905 - val_acc: 0.1970
Epoch 31/1000
525/525 [==============================] - 0s 820us/step - loss: -0.8660 - acc: 0.4210 - val_loss: -0.4903 - val_acc: 0.1970
Epoch 32/1000
525/525 [==============================] - 0s 806us/step - loss: -0.8693 - acc: 0.4133 - val_loss: -0.4900 - val_acc: 0.1818
Epoch 33/1000
525/525 [==============================] - 0s 811us/step - loss: -0.8741 - acc: 0.4210 - val_loss: -0.4906 - val_acc: 0.2121
Epoch 34/1000
525/525 [==============================] - 0s 817us/step - loss: -0.8780 - acc: 0.4267 - val_loss: -0.4880 - val_acc: 0.2045
Epoch 35/1000
525/525 [==============================] - 0s 813us/step - loss: -0.8810 - acc: 0.4362 - val_loss: -0.4890 - val_acc: 0.1970
Epoch 36/1000
525/525 [==============================] - 0s 852us/step - loss: -0.8838 - acc: 0.4419 - val_loss: -0.4950 - val_acc: 0.2045
Epoch 37/1000
525/525 [==============================] - 0s 848us/step - loss: -0.8864 - acc: 0.4362 - val_loss: -0.4952 - val_acc: 0.2273
Epoch 38/1000
525/525 [==============================] - 0s 826us/step - loss: -0.8896 - acc: 0.4438 - val_loss: -0.4927 - val_acc: 0.1818
Epoch 39/1000
525/525 [==============================] - 0s 822us/step - loss: -0.8930 - acc: 0.4648 - val_loss: -0.4901 - val_acc: 0.1894
Epoch 40/1000
525/525 [==============================] - 0s 830us/step - loss: -0.8962 - acc: 0.4686 - val_loss: -0.4978 - val_acc: 0.2197
Epoch 41/1000
525/525 [==============================] - 0s 838us/step - loss: -0.8994 - acc: 0.4610 - val_loss: -0.4936 - val_acc: 0.2121
Epoch 42/1000
525/525 [==============================] - 0s 806us/step - loss: -0.9024 - acc: 0.4667 - val_loss: -0.4930 - val_acc: 0.2121
Epoch 43/1000
525/525 [==============================] - 0s 839us/step - loss: -0.9042 - acc: 0.4876 - val_loss: -0.5007 - val_acc: 0.2121
Epoch 44/1000
525/525 [==============================] - 0s 830us/step - loss: -0.9059 - acc: 0.4781 - val_loss: -0.4916 - val_acc: 0.2045
Epoch 45/1000
525/525 [==============================] - 0s 814us/step - loss: -0.9083 - acc: 0.4857 - val_loss: -0.4952 - val_acc: 0.1970
Epoch 46/1000
525/525 [==============================] - 0s 811us/step - loss: -0.9108 - acc: 0.4914 - val_loss: -0.4914 - val_acc: 0.2197
Epoch 47/1000
525/525 [==============================] - 0s 818us/step - loss: -0.9133 - acc: 0.5067 - val_loss: -0.4952 - val_acc: 0.2424
Epoch 48/1000
525/525 [==============================] - 0s 801us/step - loss: -0.9151 - acc: 0.5200 - val_loss: -0.4952 - val_acc: 0.2121
Epoch 49/1000
525/525 [==============================] - 0s 817us/step - loss: -0.9174 - acc: 0.5048 - val_loss: -0.4953 - val_acc: 0.2348
Epoch 50/1000
525/525 [==============================] - 0s 809us/step - loss: -0.9193 - acc: 0.5067 - val_loss: -0.4946 - val_acc: 0.2500
Epoch 51/1000
525/525 [==============================] - 0s 813us/step - loss: -0.9204 - acc: 0.5143 - val_loss: -0.4926 - val_acc: 0.2197
Epoch 52/1000
525/525 [==============================] - 0s 828us/step - loss: -0.9216 - acc: 0.5181 - val_loss: -0.4962 - val_acc: 0.2348
Epoch 53/1000
525/525 [==============================] - 0s 882us/step - loss: -0.9230 - acc: 0.5086 - val_loss: -0.4956 - val_acc: 0.1970
Epoch 54/1000
525/525 [==============================] - 0s 834us/step - loss: -0.9250 - acc: 0.5619 - val_loss: -0.4955 - val_acc: 0.1970
Epoch 55/1000
525/525 [==============================] - 0s 819us/step - loss: -0.9276 - acc: 0.5429 - val_loss: -0.4960 - val_acc: 0.2121
Epoch 56/1000
525/525 [==============================] - 0s 828us/step - loss: -0.9296 - acc: 0.5486 - val_loss: -0.4982 - val_acc: 0.2121
Epoch 57/1000
525/525 [==============================] - 0s 808us/step - loss: -0.9308 - acc: 0.5181 - val_loss: -0.4929 - val_acc: 0.2273
Epoch 58/1000
525/525 [==============================] - 0s 825us/step - loss: -0.9314 - acc: 0.5371 - val_loss: -0.4970 - val_acc: 0.1970
Epoch 59/1000
525/525 [==============================] - 0s 846us/step - loss: -0.9319 - acc: 0.5410 - val_loss: -0.4984 - val_acc: 0.1970
Epoch 60/1000
525/525 [==============================] - 0s 816us/step - loss: -0.9331 - acc: 0.5867 - val_loss: -0.4995 - val_acc: 0.2348
Epoch 61/1000
525/525 [==============================] - 0s 823us/step - loss: -0.9343 - acc: 0.5371 - val_loss: -0.4969 - val_acc: 0.2045
Epoch 62/1000
525/525 [==============================] - 0s 828us/step - loss: -0.9355 - acc: 0.5905 - val_loss: -0.4987 - val_acc: 0.2273
Epoch 63/1000
525/525 [==============================] - 0s 846us/step - loss: -0.9372 - acc: 0.5562 - val_loss: -0.4985 - val_acc: 0.1970
Epoch 64/1000
525/525 [==============================] - 0s 806us/step - loss: -0.9384 - acc: 0.5638 - val_loss: -0.5022 - val_acc: 0.2424
Epoch 65/1000
525/525 [==============================] - 0s 812us/step - loss: -0.9402 - acc: 0.5429 - val_loss: -0.5011 - val_acc: 0.2197
Epoch 66/1000
525/525 [==============================] - 0s 818us/step - loss: -0.9412 - acc: 0.6000 - val_loss: -0.4994 - val_acc: 0.2197
Epoch 67/1000
525/525 [==============================] - 0s 809us/step - loss: -0.9431 - acc: 0.5543 - val_loss: -0.4976 - val_acc: 0.2273
Epoch 68/1000
525/525 [==============================] - 0s 818us/step - loss: -0.9447 - acc: 0.5943 - val_loss: -0.4995 - val_acc: 0.2273
Epoch 69/1000
525/525 [==============================] - 0s 809us/step - loss: -0.9453 - acc: 0.5829 - val_loss: -0.5021 - val_acc: 0.2273
Epoch 70/1000
525/525 [==============================] - 0s 828us/step - loss: -0.9455 - acc: 0.5848 - val_loss: -0.5001 - val_acc: 0.2348
Epoch 71/1000
525/525 [==============================] - 0s 810us/step - loss: -0.9471 - acc: 0.6095 - val_loss: -0.5016 - val_acc: 0.2348
Epoch 72/1000
525/525 [==============================] - 0s 814us/step - loss: -0.9481 - acc: 0.5924 - val_loss: -0.5008 - val_acc: 0.2273
Epoch 73/1000
525/525 [==============================] - 0s 812us/step - loss: -0.9474 - acc: 0.5810 - val_loss: -0.5031 - val_acc: 0.2500
Epoch 74/1000
525/525 [==============================] - 0s 816us/step - loss: -0.9473 - acc: 0.5924 - val_loss: -0.4984 - val_acc: 0.2045
Epoch 75/1000
525/525 [==============================] - 0s 828us/step - loss: -0.9487 - acc: 0.6133 - val_loss: -0.5000 - val_acc: 0.2121
Epoch 76/1000
525/525 [==============================] - 0s 855us/step - loss: -0.9500 - acc: 0.6190 - val_loss: -0.5031 - val_acc: 0.2045
Epoch 77/1000
525/525 [==============================] - 0s 848us/step - loss: -0.9515 - acc: 0.6076 - val_loss: -0.5015 - val_acc: 0.2273
Epoch 78/1000
525/525 [==============================] - 0s 883us/step - loss: -0.9529 - acc: 0.6057 - val_loss: -0.5018 - val_acc: 0.2273
Epoch 79/1000
525/525 [==============================] - 0s 904us/step - loss: -0.9530 - acc: 0.6381 - val_loss: -0.5058 - val_acc: 0.2121
Epoch 80/1000
525/525 [==============================] - 0s 902us/step - loss: -0.9538 - acc: 0.6267 - val_loss: -0.5034 - val_acc: 0.2424
Epoch 81/1000
525/525 [==============================] - 0s 895us/step - loss: -0.9552 - acc: 0.6229 - val_loss: -0.5031 - val_acc: 0.2500
Epoch 82/1000
525/525 [==============================] - 0s 900us/step - loss: -0.9558 - acc: 0.6229 - val_loss: -0.5052 - val_acc: 0.2424
Epoch 83/1000
525/525 [==============================] - 0s 886us/step - loss: -0.9558 - acc: 0.6419 - val_loss: -0.5037 - val_acc: 0.2348
Epoch 84/1000
525/525 [==============================] - 0s 895us/step - loss: -0.9557 - acc: 0.6229 - val_loss: -0.5028 - val_acc: 0.2348
Epoch 85/1000
525/525 [==============================] - 0s 894us/step - loss: -0.9562 - acc: 0.6229 - val_loss: -0.5002 - val_acc: 0.2424
Epoch 86/1000
525/525 [==============================] - 0s 888us/step - loss: -0.9576 - acc: 0.6305 - val_loss: -0.5025 - val_acc: 0.2424
Epoch 87/1000
525/525 [==============================] - 0s 897us/step - loss: -0.9586 - acc: 0.6248 - val_loss: -0.5015 - val_acc: 0.2273
Epoch 88/1000
525/525 [==============================] - 0s 892us/step - loss: -0.9589 - acc: 0.6400 - val_loss: -0.5023 - val_acc: 0.2424
Epoch 89/1000
525/525 [==============================] - 0s 880us/step - loss: -0.9602 - acc: 0.6438 - val_loss: -0.5051 - val_acc: 0.2348
Epoch 90/1000
525/525 [==============================] - 0s 891us/step - loss: -0.9606 - acc: 0.6305 - val_loss: -0.5023 - val_acc: 0.2273
Epoch 91/1000
525/525 [==============================] - 0s 893us/step - loss: -0.9609 - acc: 0.6495 - val_loss: -0.5009 - val_acc: 0.2348
Epoch 92/1000
525/525 [==============================] - 0s 892us/step - loss: -0.9620 - acc: 0.6381 - val_loss: -0.5049 - val_acc: 0.2273
Epoch 93/1000
525/525 [==============================] - 0s 890us/step - loss: -0.9620 - acc: 0.6743 - val_loss: -0.5006 - val_acc: 0.2424
Epoch 94/1000
525/525 [==============================] - 0s 900us/step - loss: -0.9621 - acc: 0.6514 - val_loss: -0.5047 - val_acc: 0.2576
Epoch 95/1000
525/525 [==============================] - 0s 909us/step - loss: -0.9623 - acc: 0.6514 - val_loss: -0.5032 - val_acc: 0.2424
Epoch 96/1000
525/525 [==============================] - 0s 901us/step - loss: -0.9629 - acc: 0.6476 - val_loss: -0.5035 - val_acc: 0.2197
Epoch 97/1000
525/525 [==============================] - 0s 897us/step - loss: -0.9632 - acc: 0.6438 - val_loss: -0.5004 - val_acc: 0.2424
Epoch 98/1000
525/525 [==============================] - 0s 917us/step - loss: -0.9630 - acc: 0.6648 - val_loss: -0.5013 - val_acc: 0.2348
Epoch 99/1000
525/525 [==============================] - 1s 957us/step - loss: -0.9636 - acc: 0.6495 - val_loss: -0.5029 - val_acc: 0.2576
Epoch 100/1000
525/525 [==============================] - 0s 839us/step - loss: -0.9641 - acc: 0.6629 - val_loss: -0.5036 - val_acc: 0.2500
Epoch 101/1000
525/525 [==============================] - 0s 816us/step - loss: -0.9646 - acc: 0.6457 - val_loss: -0.5010 - val_acc: 0.2424
...
...
...
...
Epoch 950/1000
525/525 [==============================] - 0s 819us/step - loss: -0.9962 - acc: 0.9086 - val_loss: -0.4933 - val_acc: 0.2045
Epoch 951/1000
525/525 [==============================] - 0s 823us/step - loss: -0.9962 - acc: 0.9067 - val_loss: -0.4929 - val_acc: 0.2197
Epoch 952/1000
525/525 [==============================] - 0s 831us/step - loss: -0.9962 - acc: 0.9105 - val_loss: -0.4920 - val_acc: 0.1970
Epoch 953/1000
525/525 [==============================] - 0s 825us/step - loss: -0.9961 - acc: 0.9257 - val_loss: -0.4938 - val_acc: 0.2045
Epoch 954/1000
525/525 [==============================] - 0s 835us/step - loss: -0.9961 - acc: 0.8971 - val_loss: -0.4967 - val_acc: 0.2045
Epoch 955/1000
525/525 [==============================] - 0s 818us/step - loss: -0.9958 - acc: 0.8971 - val_loss: -0.4952 - val_acc: 0.2197
Epoch 956/1000
525/525 [==============================] - 0s 820us/step - loss: -0.9954 - acc: 0.8724 - val_loss: -0.4947 - val_acc: 0.1894
Epoch 957/1000
525/525 [==============================] - 0s 840us/step - loss: -0.9955 - acc: 0.9010 - val_loss: -0.4921 - val_acc: 0.1970
Epoch 958/1000
525/525 [==============================] - 0s 814us/step - loss: -0.9954 - acc: 0.8914 - val_loss: -0.4961 - val_acc: 0.2121
Epoch 959/1000
525/525 [==============================] - 0s 824us/step - loss: -0.9954 - acc: 0.8895 - val_loss: -0.4964 - val_acc: 0.2197
Epoch 960/1000
525/525 [==============================] - 0s 818us/step - loss: -0.9952 - acc: 0.8743 - val_loss: -0.4969 - val_acc: 0.2197
Epoch 961/1000
525/525 [==============================] - 0s 830us/step - loss: -0.9955 - acc: 0.9124 - val_loss: -0.4980 - val_acc: 0.2045
Epoch 962/1000
525/525 [==============================] - 0s 849us/step - loss: -0.9955 - acc: 0.8667 - val_loss: -0.4948 - val_acc: 0.2273
Epoch 963/1000
525/525 [==============================] - 0s 824us/step - loss: -0.9957 - acc: 0.9029 - val_loss: -0.4941 - val_acc: 0.2348
Epoch 964/1000
525/525 [==============================] - 0s 827us/step - loss: -0.9955 - acc: 0.8838 - val_loss: -0.4932 - val_acc: 0.2121
Epoch 965/1000
525/525 [==============================] - 0s 814us/step - loss: -0.9956 - acc: 0.8952 - val_loss: -0.4937 - val_acc: 0.2121
Epoch 966/1000
525/525 [==============================] - 0s 823us/step - loss: -0.9958 - acc: 0.8914 - val_loss: -0.4956 - val_acc: 0.2197
Epoch 967/1000
525/525 [==============================] - 0s 807us/step - loss: -0.9958 - acc: 0.9029 - val_loss: -0.4967 - val_acc: 0.2045
Epoch 968/1000
525/525 [==============================] - 0s 846us/step - loss: -0.9957 - acc: 0.8952 - val_loss: -0.4952 - val_acc: 0.1970
Epoch 969/1000
525/525 [==============================] - 0s 842us/step - loss: -0.9957 - acc: 0.9048 - val_loss: -0.4970 - val_acc: 0.1970
Epoch 970/1000
525/525 [==============================] - 0s 836us/step - loss: -0.9958 - acc: 0.8914 - val_loss: -0.4972 - val_acc: 0.2045
Epoch 971/1000
525/525 [==============================] - 0s 915us/step - loss: -0.9958 - acc: 0.8952 - val_loss: -0.4960 - val_acc: 0.2045
Epoch 972/1000
525/525 [==============================] - 0s 816us/step - loss: -0.9958 - acc: 0.9048 - val_loss: -0.4951 - val_acc: 0.2273
Epoch 973/1000
525/525 [==============================] - 0s 870us/step - loss: -0.9957 - acc: 0.9124 - val_loss: -0.4985 - val_acc: 0.2121
Epoch 974/1000
525/525 [==============================] - 0s 822us/step - loss: -0.9958 - acc: 0.9010 - val_loss: -0.4977 - val_acc: 0.2348
Epoch 975/1000
525/525 [==============================] - 0s 838us/step - loss: -0.9959 - acc: 0.8990 - val_loss: -0.4998 - val_acc: 0.2197
Epoch 976/1000
525/525 [==============================] - 0s 813us/step - loss: -0.9957 - acc: 0.9143 - val_loss: -0.4982 - val_acc: 0.2197
Epoch 977/1000
525/525 [==============================] - 0s 845us/step - loss: -0.9954 - acc: 0.9086 - val_loss: -0.4983 - val_acc: 0.2121
Epoch 978/1000
525/525 [==============================] - 0s 813us/step - loss: -0.9953 - acc: 0.8914 - val_loss: -0.4982 - val_acc: 0.2121
Epoch 979/1000
525/525 [==============================] - 0s 832us/step - loss: -0.9954 - acc: 0.8838 - val_loss: -0.4958 - val_acc: 0.2121
Epoch 980/1000
525/525 [==============================] - 0s 822us/step - loss: -0.9954 - acc: 0.8838 - val_loss: -0.4953 - val_acc: 0.1970
Epoch 981/1000
525/525 [==============================] - 0s 812us/step - loss: -0.9954 - acc: 0.8819 - val_loss: -0.4939 - val_acc: 0.2045
Epoch 982/1000
525/525 [==============================] - 0s 822us/step - loss: -0.9954 - acc: 0.8857 - val_loss: -0.4960 - val_acc: 0.2197
Epoch 983/1000
525/525 [==============================] - 0s 814us/step - loss: -0.9953 - acc: 0.8781 - val_loss: -0.4950 - val_acc: 0.2121
Epoch 984/1000
525/525 [==============================] - 0s 825us/step - loss: -0.9953 - acc: 0.9219 - val_loss: -0.4943 - val_acc: 0.1667
Epoch 985/1000
525/525 [==============================] - 0s 843us/step - loss: -0.9954 - acc: 0.8590 - val_loss: -0.4965 - val_acc: 0.2045
Epoch 986/1000
525/525 [==============================] - 0s 826us/step - loss: -0.9955 - acc: 0.8971 - val_loss: -0.4930 - val_acc: 0.1742
Epoch 987/1000
525/525 [==============================] - 0s 813us/step - loss: -0.9956 - acc: 0.8819 - val_loss: -0.4941 - val_acc: 0.2348
Epoch 988/1000
525/525 [==============================] - 0s 822us/step - loss: -0.9956 - acc: 0.8838 - val_loss: -0.4929 - val_acc: 0.2045
Epoch 989/1000
525/525 [==============================] - 0s 810us/step - loss: -0.9957 - acc: 0.8705 - val_loss: -0.4955 - val_acc: 0.2273
Epoch 990/1000
525/525 [==============================] - 0s 833us/step - loss: -0.9959 - acc: 0.9105 - val_loss: -0.4947 - val_acc: 0.1970
Epoch 991/1000
525/525 [==============================] - 0s 843us/step - loss: -0.9959 - acc: 0.9029 - val_loss: -0.4964 - val_acc: 0.2121
Epoch 992/1000
525/525 [==============================] - 0s 827us/step - loss: -0.9959 - acc: 0.8952 - val_loss: -0.4954 - val_acc: 0.1970
Epoch 993/1000
525/525 [==============================] - 0s 836us/step - loss: -0.9959 - acc: 0.8819 - val_loss: -0.4941 - val_acc: 0.2121
Epoch 994/1000
525/525 [==============================] - 0s 822us/step - loss: -0.9960 - acc: 0.9162 - val_loss: -0.4961 - val_acc: 0.2348
Epoch 995/1000
525/525 [==============================] - 0s 842us/step - loss: -0.9963 - acc: 0.8933 - val_loss: -0.4968 - val_acc: 0.1970
Epoch 996/1000
525/525 [==============================] - 0s 865us/step - loss: -0.9965 - acc: 0.9257 - val_loss: -0.4966 - val_acc: 0.2121
Epoch 997/1000
525/525 [==============================] - 0s 853us/step - loss: -0.9965 - acc: 0.9124 - val_loss: -0.4942 - val_acc: 0.2348
Epoch 998/1000
525/525 [==============================] - 0s 815us/step - loss: -0.9965 - acc: 0.8990 - val_loss: -0.4960 - val_acc: 0.2273
Epoch 999/1000
525/525 [==============================] - 0s 826us/step - loss: -0.9961 - acc: 0.9143 - val_loss: -0.4948 - val_acc: 0.2348
Epoch 1000/1000
525/525 [==============================] - 0s 832us/step - loss: -0.9952 - acc: 0.8686 - val_loss: -0.4946 - val_acc: 0.2273