Aaaaa988

OutPutFavarish

Dec 8th, 2020
/usr/bin/python3.8 /home/vladimir/PycharmProjects/Lab4Favarish/Main.py
2020-12-09 08:27:55.203277: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcudart.so.10.1'; dlerror: libcudart.so.10.1: cannot open shared object file: No such file or directory
2020-12-09 08:27:55.203296: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
[ 1.23247 0. 8.14 0. 0.538 6.142 91.7
3.9769 4. 307. 21. 396.9 18.72 ] 15.2
[-0.27224633 -0.48361547 -0.43576161 -0.25683275 -0.1652266 -0.1764426
0.81306188 0.1166983 -0.62624905 -0.59517003 1.14850044 0.44807713
0.8252202 ]
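The second array printed above is the first sample rescaled per feature to zero mean and unit variance over the training set. A minimal numpy sketch of that scaling, using a made-up toy matrix rather than the actual housing data:

```python
import numpy as np

# Toy stand-in for the training matrix (rows = samples, columns = features).
X_train = np.array([[1.0, 200.0],
                    [2.0, 300.0],
                    [3.0, 400.0]])

# Per-feature statistics computed over the training set only.
mean = X_train.mean(axis=0)
std = X_train.std(axis=0)

# Standardize: each column ends up with mean ~0 and std ~1.
X_scaled = (X_train - mean) / std

print(X_scaled.mean(axis=0))
print(X_scaled.std(axis=0))
```

The same `mean` and `std` must also be applied to test samples and to any new query vector, which is why the single-house query near the end of the log is printed in the same standardized form.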
2020-12-09 08:27:56.271947: W tensorflow/stream_executor/platform/default/dso_loader.cc:59] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2020-12-09 08:27:56.271965: W tensorflow/stream_executor/cuda/cuda_driver.cc:312] failed call to cuInit: UNKNOWN ERROR (303)
2020-12-09 08:27:56.271979: I tensorflow/stream_executor/cuda/cuda_diagnostics.cc:156] kernel driver does not appear to be running on this host (vladimir-37): /proc/driver/nvidia/version does not exist
2020-12-09 08:27:56.272137: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2020-12-09 08:27:56.300480: I tensorflow/core/platform/profile_utils/cpu_utils.cc:104] CPU Frequency: 3393210000 Hz
2020-12-09 08:27:56.301233: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5aa37e0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-12-09 08:27:56.301279: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
Epoch 1/100
11/11 [==============================] - 0s 10ms/step - loss: 576.8145 - mae: 22.1994 - val_loss: 641.4199 - val_mae: 23.5561
Epoch 2/100
11/11 [==============================] - 0s 2ms/step - loss: 563.5194 - mae: 21.8722 - val_loss: 630.0616 - val_mae: 23.3055
Epoch 3/100
11/11 [==============================] - 0s 2ms/step - loss: 552.0070 - mae: 21.5808 - val_loss: 617.8162 - val_mae: 23.0309
Epoch 4/100
11/11 [==============================] - 0s 3ms/step - loss: 539.6599 - mae: 21.2702 - val_loss: 603.6461 - val_mae: 22.7092
Epoch 5/100
11/11 [==============================] - 0s 2ms/step - loss: 523.3814 - mae: 20.9230 - val_loss: 587.0852 - val_mae: 22.3302
Epoch 6/100
11/11 [==============================] - 0s 2ms/step - loss: 504.8803 - mae: 20.4231 - val_loss: 567.0438 - val_mae: 21.8640
Epoch 7/100
11/11 [==============================] - 0s 2ms/step - loss: 484.2834 - mae: 19.9024 - val_loss: 543.3279 - val_mae: 21.2940
Epoch 8/100
11/11 [==============================] - 0s 2ms/step - loss: 460.0919 - mae: 19.3415 - val_loss: 515.3427 - val_mae: 20.6340
Epoch 9/100
11/11 [==============================] - 0s 3ms/step - loss: 429.5299 - mae: 18.5452 - val_loss: 482.2537 - val_mae: 19.8361
Epoch 10/100
11/11 [==============================] - 0s 3ms/step - loss: 391.0705 - mae: 17.4774 - val_loss: 447.2615 - val_mae: 18.9394
Epoch 11/100
11/11 [==============================] - 0s 2ms/step - loss: 359.7870 - mae: 16.5852 - val_loss: 407.6415 - val_mae: 17.8810
Epoch 12/100
11/11 [==============================] - 0s 2ms/step - loss: 326.6483 - mae: 15.6726 - val_loss: 365.7087 - val_mae: 16.7069
Epoch 13/100
11/11 [==============================] - 0s 2ms/step - loss: 285.3968 - mae: 14.3208 - val_loss: 319.0248 - val_mae: 15.3907
Epoch 14/100
11/11 [==============================] - 0s 2ms/step - loss: 240.1423 - mae: 13.0570 - val_loss: 271.4334 - val_mae: 13.9324
Epoch 15/100
11/11 [==============================] - 0s 2ms/step - loss: 211.5306 - mae: 12.1574 - val_loss: 227.9761 - val_mae: 12.4525
Epoch 16/100
11/11 [==============================] - 0s 2ms/step - loss: 174.4718 - mae: 10.9586 - val_loss: 189.2717 - val_mae: 11.0758
Epoch 17/100
11/11 [==============================] - 0s 2ms/step - loss: 144.5379 - mae: 9.6482 - val_loss: 154.5965 - val_mae: 9.8076
Epoch 18/100
11/11 [==============================] - 0s 3ms/step - loss: 129.8571 - mae: 9.4435 - val_loss: 127.4837 - val_mae: 8.7955
Epoch 19/100
11/11 [==============================] - 0s 2ms/step - loss: 102.0622 - mae: 8.0646 - val_loss: 108.7950 - val_mae: 8.1062
Epoch 20/100
11/11 [==============================] - 0s 2ms/step - loss: 102.8973 - mae: 8.0790 - val_loss: 94.3945 - val_mae: 7.5643
Epoch 21/100
11/11 [==============================] - 0s 2ms/step - loss: 99.9749 - mae: 7.7646 - val_loss: 82.0209 - val_mae: 7.0236
Epoch 22/100
11/11 [==============================] - 0s 3ms/step - loss: 86.0444 - mae: 7.1423 - val_loss: 72.6665 - val_mae: 6.5453
Epoch 23/100
11/11 [==============================] - 0s 2ms/step - loss: 79.3709 - mae: 6.8148 - val_loss: 64.8036 - val_mae: 6.1072
Epoch 24/100
11/11 [==============================] - 0s 2ms/step - loss: 71.4559 - mae: 6.3705 - val_loss: 57.6596 - val_mae: 5.7163
Epoch 25/100
11/11 [==============================] - 0s 2ms/step - loss: 72.4916 - mae: 6.1992 - val_loss: 52.0030 - val_mae: 5.3804
Epoch 26/100
11/11 [==============================] - 0s 2ms/step - loss: 71.3017 - mae: 6.2462 - val_loss: 48.2129 - val_mae: 5.1623
Epoch 27/100
11/11 [==============================] - 0s 2ms/step - loss: 63.3928 - mae: 5.8350 - val_loss: 45.2233 - val_mae: 5.0121
Epoch 28/100
11/11 [==============================] - 0s 2ms/step - loss: 52.5788 - mae: 5.4142 - val_loss: 42.3008 - val_mae: 4.8306
Epoch 29/100
11/11 [==============================] - 0s 3ms/step - loss: 56.0609 - mae: 5.4573 - val_loss: 39.8106 - val_mae: 4.6675
Epoch 30/100
11/11 [==============================] - 0s 2ms/step - loss: 59.6992 - mae: 5.9062 - val_loss: 37.4998 - val_mae: 4.5090
Epoch 31/100
11/11 [==============================] - 0s 3ms/step - loss: 56.3936 - mae: 5.7147 - val_loss: 34.9231 - val_mae: 4.3458
Epoch 32/100
11/11 [==============================] - 0s 2ms/step - loss: 60.2553 - mae: 5.7379 - val_loss: 33.2389 - val_mae: 4.2192
Epoch 33/100
11/11 [==============================] - 0s 2ms/step - loss: 60.2691 - mae: 5.6960 - val_loss: 32.2640 - val_mae: 4.1523
Epoch 34/100
11/11 [==============================] - 0s 2ms/step - loss: 51.3757 - mae: 5.2678 - val_loss: 32.2051 - val_mae: 4.1365
Epoch 35/100
11/11 [==============================] - 0s 2ms/step - loss: 56.7672 - mae: 5.3226 - val_loss: 31.6404 - val_mae: 4.1021
Epoch 36/100
11/11 [==============================] - 0s 2ms/step - loss: 48.0868 - mae: 5.3319 - val_loss: 29.7254 - val_mae: 3.9968
Epoch 37/100
11/11 [==============================] - 0s 2ms/step - loss: 48.8782 - mae: 5.3105 - val_loss: 28.8839 - val_mae: 3.9622
Epoch 38/100
11/11 [==============================] - 0s 2ms/step - loss: 53.7650 - mae: 5.3322 - val_loss: 28.2599 - val_mae: 3.9354
Epoch 39/100
11/11 [==============================] - 0s 2ms/step - loss: 51.1862 - mae: 5.3991 - val_loss: 27.8721 - val_mae: 3.8960
Epoch 40/100
11/11 [==============================] - 0s 2ms/step - loss: 54.3165 - mae: 5.4648 - val_loss: 27.1163 - val_mae: 3.8550
Epoch 41/100
11/11 [==============================] - 0s 2ms/step - loss: 53.5490 - mae: 5.5076 - val_loss: 26.8804 - val_mae: 3.8364
Epoch 42/100
11/11 [==============================] - 0s 2ms/step - loss: 51.7944 - mae: 5.2776 - val_loss: 26.3077 - val_mae: 3.7998
Epoch 43/100
11/11 [==============================] - 0s 2ms/step - loss: 53.4757 - mae: 5.5201 - val_loss: 25.5795 - val_mae: 3.7579
Epoch 44/100
11/11 [==============================] - 0s 2ms/step - loss: 42.2587 - mae: 5.0097 - val_loss: 25.4256 - val_mae: 3.7299
Epoch 45/100
11/11 [==============================] - 0s 2ms/step - loss: 43.4811 - mae: 4.8368 - val_loss: 25.1915 - val_mae: 3.7172
Epoch 46/100
11/11 [==============================] - 0s 2ms/step - loss: 41.6909 - mae: 4.9176 - val_loss: 25.9457 - val_mae: 3.7313
Epoch 47/100
11/11 [==============================] - 0s 2ms/step - loss: 48.3462 - mae: 5.2766 - val_loss: 25.9713 - val_mae: 3.7190
Epoch 48/100
11/11 [==============================] - 0s 2ms/step - loss: 45.0240 - mae: 4.8330 - val_loss: 25.3599 - val_mae: 3.6899
Epoch 49/100
11/11 [==============================] - 0s 2ms/step - loss: 57.6347 - mae: 5.4571 - val_loss: 24.3261 - val_mae: 3.6471
Epoch 50/100
11/11 [==============================] - 0s 2ms/step - loss: 43.8849 - mae: 5.1204 - val_loss: 23.8390 - val_mae: 3.6152
Epoch 51/100
11/11 [==============================] - 0s 2ms/step - loss: 49.3554 - mae: 5.3719 - val_loss: 24.0915 - val_mae: 3.6126
Epoch 52/100
11/11 [==============================] - 0s 2ms/step - loss: 45.8415 - mae: 5.0247 - val_loss: 23.5042 - val_mae: 3.5782
Epoch 53/100
11/11 [==============================] - 0s 2ms/step - loss: 46.3078 - mae: 4.9748 - val_loss: 22.9240 - val_mae: 3.5548
Epoch 54/100
11/11 [==============================] - 0s 2ms/step - loss: 53.2305 - mae: 5.3425 - val_loss: 22.0383 - val_mae: 3.5051
Epoch 55/100
11/11 [==============================] - 0s 2ms/step - loss: 52.5778 - mae: 5.3962 - val_loss: 21.7672 - val_mae: 3.4816
Epoch 56/100
11/11 [==============================] - 0s 2ms/step - loss: 52.2437 - mae: 5.1052 - val_loss: 21.1468 - val_mae: 3.4423
Epoch 57/100
11/11 [==============================] - 0s 2ms/step - loss: 45.7289 - mae: 4.9726 - val_loss: 21.1030 - val_mae: 3.4359
Epoch 58/100
11/11 [==============================] - 0s 2ms/step - loss: 39.5403 - mae: 4.8885 - val_loss: 21.7102 - val_mae: 3.4766
Epoch 59/100
11/11 [==============================] - 0s 2ms/step - loss: 37.2999 - mae: 4.5723 - val_loss: 21.4789 - val_mae: 3.4582
Epoch 60/100
11/11 [==============================] - 0s 2ms/step - loss: 46.1910 - mae: 5.2122 - val_loss: 21.1240 - val_mae: 3.4219
Epoch 61/100
11/11 [==============================] - 0s 2ms/step - loss: 40.9039 - mae: 4.7818 - val_loss: 20.4178 - val_mae: 3.3793
Epoch 62/100
11/11 [==============================] - 0s 2ms/step - loss: 52.8687 - mae: 5.2780 - val_loss: 19.3926 - val_mae: 3.3548
Epoch 63/100
11/11 [==============================] - 0s 2ms/step - loss: 48.2449 - mae: 5.1098 - val_loss: 18.9752 - val_mae: 3.3829
Epoch 64/100
11/11 [==============================] - 0s 2ms/step - loss: 37.8453 - mae: 4.8022 - val_loss: 19.1032 - val_mae: 3.3818
Epoch 65/100
11/11 [==============================] - 0s 3ms/step - loss: 43.5561 - mae: 4.9497 - val_loss: 19.2405 - val_mae: 3.3739
Epoch 66/100
11/11 [==============================] - 0s 2ms/step - loss: 42.2983 - mae: 4.7631 - val_loss: 19.2956 - val_mae: 3.3571
Epoch 67/100
11/11 [==============================] - 0s 2ms/step - loss: 45.6628 - mae: 5.0653 - val_loss: 19.4573 - val_mae: 3.3349
Epoch 68/100
11/11 [==============================] - 0s 2ms/step - loss: 42.5830 - mae: 4.9285 - val_loss: 19.6972 - val_mae: 3.3392
Epoch 69/100
11/11 [==============================] - 0s 2ms/step - loss: 49.0086 - mae: 5.0308 - val_loss: 19.4687 - val_mae: 3.3548
Epoch 70/100
11/11 [==============================] - 0s 2ms/step - loss: 44.4467 - mae: 4.9450 - val_loss: 18.9145 - val_mae: 3.3456
Epoch 71/100
11/11 [==============================] - 0s 2ms/step - loss: 47.1978 - mae: 5.0942 - val_loss: 18.7678 - val_mae: 3.3226
Epoch 72/100
11/11 [==============================] - 0s 2ms/step - loss: 38.6933 - mae: 4.6912 - val_loss: 18.3785 - val_mae: 3.3442
Epoch 73/100
11/11 [==============================] - 0s 2ms/step - loss: 42.0594 - mae: 4.7792 - val_loss: 18.2005 - val_mae: 3.3474
Epoch 74/100
11/11 [==============================] - 0s 2ms/step - loss: 38.6165 - mae: 4.6123 - val_loss: 18.1768 - val_mae: 3.3238
Epoch 75/100
11/11 [==============================] - 0s 2ms/step - loss: 46.6481 - mae: 5.0438 - val_loss: 18.5303 - val_mae: 3.2968
Epoch 76/100
11/11 [==============================] - 0s 2ms/step - loss: 42.1162 - mae: 4.8070 - val_loss: 18.0918 - val_mae: 3.2971
Epoch 77/100
11/11 [==============================] - 0s 2ms/step - loss: 45.1461 - mae: 5.0273 - val_loss: 17.6394 - val_mae: 3.3092
Epoch 78/100
11/11 [==============================] - 0s 2ms/step - loss: 43.6702 - mae: 4.7935 - val_loss: 17.9443 - val_mae: 3.2686
Epoch 79/100
11/11 [==============================] - 0s 3ms/step - loss: 44.3730 - mae: 4.8525 - val_loss: 18.1074 - val_mae: 3.2552
Epoch 80/100
11/11 [==============================] - 0s 2ms/step - loss: 38.8404 - mae: 4.8214 - val_loss: 17.4582 - val_mae: 3.2072
Epoch 81/100
11/11 [==============================] - 0s 2ms/step - loss: 35.4778 - mae: 4.2712 - val_loss: 16.6305 - val_mae: 3.1722
Epoch 82/100
11/11 [==============================] - 0s 2ms/step - loss: 40.3779 - mae: 4.6441 - val_loss: 16.3505 - val_mae: 3.1571
Epoch 83/100
11/11 [==============================] - 0s 2ms/step - loss: 35.3169 - mae: 4.4666 - val_loss: 16.2556 - val_mae: 3.1738
Epoch 84/100
11/11 [==============================] - 0s 2ms/step - loss: 35.0788 - mae: 4.4772 - val_loss: 16.4577 - val_mae: 3.2075
Epoch 85/100
11/11 [==============================] - 0s 2ms/step - loss: 41.1912 - mae: 4.4931 - val_loss: 16.8850 - val_mae: 3.2568
Epoch 86/100
11/11 [==============================] - 0s 3ms/step - loss: 40.0670 - mae: 4.8360 - val_loss: 17.1954 - val_mae: 3.2492
Epoch 87/100
11/11 [==============================] - 0s 2ms/step - loss: 39.3397 - mae: 4.5578 - val_loss: 16.8293 - val_mae: 3.2490
Epoch 88/100
11/11 [==============================] - 0s 2ms/step - loss: 40.0756 - mae: 4.6173 - val_loss: 16.1711 - val_mae: 3.2171
Epoch 89/100
11/11 [==============================] - 0s 2ms/step - loss: 41.2723 - mae: 4.6326 - val_loss: 15.9994 - val_mae: 3.1969
Epoch 90/100
11/11 [==============================] - 0s 2ms/step - loss: 41.7719 - mae: 4.6763 - val_loss: 15.8476 - val_mae: 3.1615
Epoch 91/100
11/11 [==============================] - 0s 2ms/step - loss: 38.9935 - mae: 4.6145 - val_loss: 16.0321 - val_mae: 3.1583
Epoch 92/100
11/11 [==============================] - 0s 3ms/step - loss: 41.0565 - mae: 4.6948 - val_loss: 16.0813 - val_mae: 3.1563
Epoch 93/100
11/11 [==============================] - 0s 2ms/step - loss: 40.9692 - mae: 4.8904 - val_loss: 16.1480 - val_mae: 3.1871
Epoch 94/100
11/11 [==============================] - 0s 2ms/step - loss: 37.8654 - mae: 4.6688 - val_loss: 16.3716 - val_mae: 3.2067
Epoch 95/100
11/11 [==============================] - 0s 2ms/step - loss: 36.5145 - mae: 4.5671 - val_loss: 16.0650 - val_mae: 3.1625
Epoch 96/100
11/11 [==============================] - 0s 2ms/step - loss: 37.9110 - mae: 4.6390 - val_loss: 15.5759 - val_mae: 3.1075
Epoch 97/100
11/11 [==============================] - 0s 3ms/step - loss: 37.3476 - mae: 4.4792 - val_loss: 15.3985 - val_mae: 3.0717
Epoch 98/100
11/11 [==============================] - 0s 2ms/step - loss: 39.3187 - mae: 4.6493 - val_loss: 15.7755 - val_mae: 3.0886
Epoch 99/100
11/11 [==============================] - 0s 2ms/step - loss: 31.7008 - mae: 4.2392 - val_loss: 15.7887 - val_mae: 3.0876
Epoch 100/100
11/11 [==============================] - 0s 3ms/step - loss: 43.6499 - mae: 4.8637 - val_loss: 15.8189 - val_mae: 3.0975
4/4 [==============================] - 0s 359us/step - loss: 27.4786 - mae: 3.7884
Predicted price vs actual
8394.10$ vs 7200.00$
16383.21$ vs 18800.00$
[[ 1.23247 0. 8.14 0. 0.538 6.142 91.7
3.9769 4. 50. 21. 396.9 18.72 ]]
[[-0.27224633 -0.48361547 -0.43576161 -0.25683275 -0.1652266 -0.1764426
0.81306188 0.1166983 -0.62624905 -2.14179283 1.14850044 0.44807713
0.8252202 ]]
A house in such a neighborhood could cost = 16468.57$

Process finished with exit code 0
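The log above is consistent with a small Keras regression: 13 standardized input features, a scalar price target, MSE loss with an MAE metric, 100 epochs with a held-out validation set, then a final evaluation and single-sample prediction. A minimal sketch that produces output in this format; the two-hidden-layer architecture, Adam optimizer, batch size, and the synthetic data are assumptions for illustration, not taken from the paste:

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
# Synthetic stand-in for the 13-feature housing data; targets in $1000s.
X = rng.normal(size=(404, 13)).astype("float32")
y = (X @ rng.normal(size=13).astype("float32")) + 22.0

# Standardize features with training-set statistics.
X = (X - X.mean(axis=0)) / X.std(axis=0)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(13,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),  # scalar price output
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# The paste used 100 epochs; 5 here to keep the sketch fast.
history = model.fit(X, y, epochs=5, batch_size=32,
                    validation_split=0.1, verbose=2)

# Single-house query, already standardized like the training data.
pred = model.predict(X[:1], verbose=0)
print(f"A house in such a neighborhood could cost = {pred[0, 0] * 1000:.2f}$")
```

With `verbose=2`, `fit` prints one `Epoch N/M` pair per epoch with `loss`, `mae`, `val_loss`, and `val_mae`, matching the shape of the log; the prediction, like the targets, is in $1000s, hence the final multiplication.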