
TF Dump Training with one device

a guest
Nov 1st, 2019
Output with '/gpu:0'
ds_valid.shuffle(len(list_paths_train), seed=42, reshuffle_each_iteration=True)

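For context, a minimal sketch of how this kind of single-device run and validation-set shuffle can be set up. The file lists and paths below are hypothetical stand-ins (the real `list_paths_train` is not shown in the dump), and pinning `CUDA_VISIBLE_DEVICES` before importing TensorFlow is one common way to expose only one of the two detected GPUs:

```python
import os
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # expose only the first GPU (assumption: single-device training)

import tensorflow as tf

# Hypothetical file lists standing in for the real training/validation paths.
list_paths_train = [f"train_{i:04d}.png" for i in range(100)]
list_paths_valid = [f"valid_{i:04d}.png" for i in range(20)]

ds_valid = tf.data.Dataset.from_tensor_slices(list_paths_valid)
# Same shuffle call as in the dump: buffer sized to the training list,
# deterministic seed, reshuffled on every iteration over the dataset.
ds_valid = ds_valid.shuffle(len(list_paths_train), seed=42,
                            reshuffle_each_iteration=True)
```

Note that the dump shuffles `ds_valid` with a buffer sized to the *training* path list; a buffer at least as large as the dataset gives a full (uniform) shuffle.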
2019-11-01 09:53:21.406893: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2019-11-01 09:53:21.450823: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2019-11-01 09:53:21.451467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2019-11-01 09:53:21.451755: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2019-11-01 09:53:21.453373: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2019-11-01 09:53:21.454649: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2019-11-01 09:53:21.455396: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2019-11-01 09:53:21.456894: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2019-11-01 09:53:21.458026: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2019-11-01 09:53:21.461598: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2019-11-01 09:53:21.463936: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
2019-11-01 09:53:21.464311: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
2019-11-01 09:53:21.498095: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2100000000 Hz
2019-11-01 09:53:21.508899: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4995c00 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2019-11-01 09:53:21.508954: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2019-11-01 09:53:21.772746: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4986d40 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2019-11-01 09:53:21.772787: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce RTX 2080 Ti, Compute Capability 7.5
2019-11-01 09:53:21.772797: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (1): GeForce RTX 2080 Ti, Compute Capability 7.5
2019-11-01 09:53:21.774090: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2019-11-01 09:53:21.774964: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
2019-11-01 09:53:21.775025: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2019-11-01 09:53:21.775048: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2019-11-01 09:53:21.775067: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
2019-11-01 09:53:21.775087: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
2019-11-01 09:53:21.775107: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
2019-11-01 09:53:21.775126: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
2019-11-01 09:53:21.775147: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2019-11-01 09:53:21.778445: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
2019-11-01 09:53:21.778500: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
2019-11-01 09:53:21.781150: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1087] Device interconnect StreamExecutor with strength 1 edge matrix:
2019-11-01 09:53:21.781172: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1093] 0 1
2019-11-01 09:53:21.781184: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 0: N N
2019-11-01 09:53:21.781195: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 1: N N
2019-11-01 09:53:21.783844: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10312 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2080 Ti, pci bus id: 0000:86:00.0, compute capability: 7.5)
2019-11-01 09:53:21.785587: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10312 MB memory) -> physical GPU (device: 1, name: GeForce RTX 2080 Ti, pci bus id: 0000:af:00.0, compute capability: 7.5)
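The summary that follows matches the stock Keras ResNet50 with a single-channel 270x270 input. A hedged sketch of how such a model can be instantiated (assuming `tf.keras.applications.ResNet50` with random weights, since the pretrained ImageNet weights require a 3-channel input):

```python
import tensorflow as tf

# ResNet50 backbone with a grayscale 270x270 input. weights=None permits a
# non-3-channel input_shape; include_top=False drops the ImageNet classifier.
model = tf.keras.applications.ResNet50(
    weights=None,
    include_top=False,
    input_shape=(270, 270, 1),
)
```

The first layers line up with the dump: `conv1_conv` has 7x7x1x64 weights plus 64 biases = 3200 parameters, and `pool1_pool` emits (None, 68, 68, 64).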
Model: "resnet50"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 270, 270, 1) 0
__________________________________________________________________________________________________
conv1_pad (ZeroPadding2D) (None, 276, 276, 1) 0 input_1[0][0]
__________________________________________________________________________________________________
conv1_conv (Conv2D) (None, 135, 135, 64) 3200 conv1_pad[0][0]
__________________________________________________________________________________________________
conv1_bn (BatchNormalization) (None, 135, 135, 64) 256 conv1_conv[0][0]
__________________________________________________________________________________________________
conv1_relu (Activation) (None, 135, 135, 64) 0 conv1_bn[0][0]
__________________________________________________________________________________________________
pool1_pad (ZeroPadding2D) (None, 137, 137, 64) 0 conv1_relu[0][0]
__________________________________________________________________________________________________
pool1_pool (MaxPooling2D) (None, 68, 68, 64) 0 pool1_pad[0][0]
__________________________________________________________________________________________________
conv2_block1_1_conv (Conv2D) (None, 68, 68, 64) 4160 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_1_relu (Activation (None, 68, 68, 64) 0 conv2_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_2_relu (Activation (None, 68, 68, 64) 0 conv2_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_0_conv (Conv2D) (None, 68, 68, 256) 16640 pool1_pool[0][0]
__________________________________________________________________________________________________
conv2_block1_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block1_0_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block1_add (Add) (None, 68, 68, 256) 0 conv2_block1_0_bn[0][0]
conv2_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block1_out (Activation) (None, 68, 68, 256) 0 conv2_block1_add[0][0]
__________________________________________________________________________________________________
conv2_block2_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block1_out[0][0]
__________________________________________________________________________________________________
conv2_block2_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_1_relu (Activation (None, 68, 68, 64) 0 conv2_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_2_relu (Activation (None, 68, 68, 64) 0 conv2_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block2_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block2_add (Add) (None, 68, 68, 256) 0 conv2_block1_out[0][0]
conv2_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block2_out (Activation) (None, 68, 68, 256) 0 conv2_block2_add[0][0]
__________________________________________________________________________________________________
conv2_block3_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block2_out[0][0]
__________________________________________________________________________________________________
conv2_block3_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_1_relu (Activation (None, 68, 68, 64) 0 conv2_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_2_relu (Activation (None, 68, 68, 64) 0 conv2_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv2_block3_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv2_block3_add (Add) (None, 68, 68, 256) 0 conv2_block2_out[0][0]
conv2_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv2_block3_out (Activation) (None, 68, 68, 256) 0 conv2_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block1_1_conv (Conv2D) (None, 34, 34, 128) 32896 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_1_relu (Activation (None, 34, 34, 128) 0 conv3_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_2_relu (Activation (None, 34, 34, 128) 0 conv3_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_0_conv (Conv2D) (None, 34, 34, 512) 131584 conv2_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block1_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block1_0_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block1_add (Add) (None, 34, 34, 512) 0 conv3_block1_0_bn[0][0]
conv3_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block1_out (Activation) (None, 34, 34, 512) 0 conv3_block1_add[0][0]
__________________________________________________________________________________________________
conv3_block2_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block1_out[0][0]
__________________________________________________________________________________________________
conv3_block2_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_1_relu (Activation (None, 34, 34, 128) 0 conv3_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_2_relu (Activation (None, 34, 34, 128) 0 conv3_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block2_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block2_add (Add) (None, 34, 34, 512) 0 conv3_block1_out[0][0]
conv3_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block2_out (Activation) (None, 34, 34, 512) 0 conv3_block2_add[0][0]
__________________________________________________________________________________________________
conv3_block3_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block2_out[0][0]
__________________________________________________________________________________________________
conv3_block3_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_1_relu (Activation (None, 34, 34, 128) 0 conv3_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_2_relu (Activation (None, 34, 34, 128) 0 conv3_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block3_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block3_add (Add) (None, 34, 34, 512) 0 conv3_block2_out[0][0]
conv3_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block3_out (Activation) (None, 34, 34, 512) 0 conv3_block3_add[0][0]
__________________________________________________________________________________________________
conv3_block4_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block3_out[0][0]
__________________________________________________________________________________________________
conv3_block4_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_1_relu (Activation (None, 34, 34, 128) 0 conv3_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_2_relu (Activation (None, 34, 34, 128) 0 conv3_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv3_block4_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv3_block4_add (Add) (None, 34, 34, 512) 0 conv3_block3_out[0][0]
conv3_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv3_block4_out (Activation) (None, 34, 34, 512) 0 conv3_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block1_1_conv (Conv2D) (None, 17, 17, 256) 131328 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_1_relu (Activation (None, 17, 17, 256) 0 conv4_block1_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block1_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_2_relu (Activation (None, 17, 17, 256) 0 conv4_block1_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_0_conv (Conv2D) (None, 17, 17, 1024) 525312 conv3_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block1_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block1_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block1_0_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_0_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block1_add (Add) (None, 17, 17, 1024) 0 conv4_block1_0_bn[0][0]
conv4_block1_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block1_out (Activation) (None, 17, 17, 1024) 0 conv4_block1_add[0][0]
__________________________________________________________________________________________________
conv4_block2_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block1_out[0][0]
__________________________________________________________________________________________________
conv4_block2_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_1_relu (Activation (None, 17, 17, 256) 0 conv4_block2_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block2_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_2_relu (Activation (None, 17, 17, 256) 0 conv4_block2_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block2_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block2_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block2_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block2_add (Add) (None, 17, 17, 1024) 0 conv4_block1_out[0][0]
conv4_block2_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block2_out (Activation) (None, 17, 17, 1024) 0 conv4_block2_add[0][0]
__________________________________________________________________________________________________
conv4_block3_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block2_out[0][0]
__________________________________________________________________________________________________
conv4_block3_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_1_relu (Activation (None, 17, 17, 256) 0 conv4_block3_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block3_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_2_relu (Activation (None, 17, 17, 256) 0 conv4_block3_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block3_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block3_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block3_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block3_add (Add) (None, 17, 17, 1024) 0 conv4_block2_out[0][0]
conv4_block3_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block3_out (Activation) (None, 17, 17, 1024) 0 conv4_block3_add[0][0]
__________________________________________________________________________________________________
conv4_block4_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block3_out[0][0]
__________________________________________________________________________________________________
conv4_block4_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_1_relu (Activation (None, 17, 17, 256) 0 conv4_block4_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block4_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_2_relu (Activation (None, 17, 17, 256) 0 conv4_block4_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block4_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block4_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block4_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block4_add (Add) (None, 17, 17, 1024) 0 conv4_block3_out[0][0]
conv4_block4_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block4_out (Activation) (None, 17, 17, 1024) 0 conv4_block4_add[0][0]
__________________________________________________________________________________________________
conv4_block5_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block4_out[0][0]
__________________________________________________________________________________________________
conv4_block5_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_1_relu (Activation (None, 17, 17, 256) 0 conv4_block5_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block5_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_2_relu (Activation (None, 17, 17, 256) 0 conv4_block5_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block5_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block5_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block5_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block5_add (Add) (None, 17, 17, 1024) 0 conv4_block4_out[0][0]
conv4_block5_3_bn[0][0]
__________________________________________________________________________________________________
conv4_block5_out (Activation) (None, 17, 17, 1024) 0 conv4_block5_add[0][0]
__________________________________________________________________________________________________
conv4_block6_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block5_out[0][0]
__________________________________________________________________________________________________
conv4_block6_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_1_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_1_relu (Activation (None, 17, 17, 256) 0 conv4_block6_1_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block6_1_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_2_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_2_relu (Activation (None, 17, 17, 256) 0 conv4_block6_2_bn[0][0]
__________________________________________________________________________________________________
conv4_block6_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block6_2_relu[0][0]
__________________________________________________________________________________________________
conv4_block6_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block6_3_conv[0][0]
__________________________________________________________________________________________________
conv4_block6_add (Add) (None, 17, 17, 1024) 0 conv4_block5_out[0][0]
conv4_block6_3_bn[0][0]
__________________________________________________________________________________________________
  349. conv4_block6_out (Activation) (None, 17, 17, 1024) 0 conv4_block6_add[0][0]
  350. __________________________________________________________________________________________________
  351. conv5_block1_1_conv (Conv2D) (None, 9, 9, 512) 524800 conv4_block6_out[0][0]
  352. __________________________________________________________________________________________________
  353. conv5_block1_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_1_conv[0][0]
  354. __________________________________________________________________________________________________
  355. conv5_block1_1_relu (Activation (None, 9, 9, 512) 0 conv5_block1_1_bn[0][0]
  356. __________________________________________________________________________________________________
  357. conv5_block1_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block1_1_relu[0][0]
  358. __________________________________________________________________________________________________
  359. conv5_block1_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_2_conv[0][0]
  360. __________________________________________________________________________________________________
  361. conv5_block1_2_relu (Activation (None, 9, 9, 512) 0 conv5_block1_2_bn[0][0]
  362. __________________________________________________________________________________________________
  363. conv5_block1_0_conv (Conv2D) (None, 9, 9, 2048) 2099200 conv4_block6_out[0][0]
  364. __________________________________________________________________________________________________
  365. conv5_block1_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block1_2_relu[0][0]
  366. __________________________________________________________________________________________________
  367. conv5_block1_0_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_0_conv[0][0]
  368. __________________________________________________________________________________________________
  369. conv5_block1_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_3_conv[0][0]
  370. __________________________________________________________________________________________________
  371. conv5_block1_add (Add) (None, 9, 9, 2048) 0 conv5_block1_0_bn[0][0]
  372. conv5_block1_3_bn[0][0]
  373. __________________________________________________________________________________________________
  374. conv5_block1_out (Activation) (None, 9, 9, 2048) 0 conv5_block1_add[0][0]
  375. __________________________________________________________________________________________________
  376. conv5_block2_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block1_out[0][0]
  377. __________________________________________________________________________________________________
  378. conv5_block2_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_1_conv[0][0]
  379. __________________________________________________________________________________________________
  380. conv5_block2_1_relu (Activation (None, 9, 9, 512) 0 conv5_block2_1_bn[0][0]
  381. __________________________________________________________________________________________________
  382. conv5_block2_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block2_1_relu[0][0]
  383. __________________________________________________________________________________________________
  384. conv5_block2_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_2_conv[0][0]
  385. __________________________________________________________________________________________________
  386. conv5_block2_2_relu (Activation (None, 9, 9, 512) 0 conv5_block2_2_bn[0][0]
  387. __________________________________________________________________________________________________
  388. conv5_block2_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block2_2_relu[0][0]
  389. __________________________________________________________________________________________________
  390. conv5_block2_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block2_3_conv[0][0]
  391. __________________________________________________________________________________________________
  392. conv5_block2_add (Add) (None, 9, 9, 2048) 0 conv5_block1_out[0][0]
  393. conv5_block2_3_bn[0][0]
  394. __________________________________________________________________________________________________
  395. conv5_block2_out (Activation) (None, 9, 9, 2048) 0 conv5_block2_add[0][0]
  396. __________________________________________________________________________________________________
  397. conv5_block3_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block2_out[0][0]
  398. __________________________________________________________________________________________________
  399. conv5_block3_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_1_conv[0][0]
  400. __________________________________________________________________________________________________
  401. conv5_block3_1_relu (Activation (None, 9, 9, 512) 0 conv5_block3_1_bn[0][0]
  402. __________________________________________________________________________________________________
  403. conv5_block3_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block3_1_relu[0][0]
  404. __________________________________________________________________________________________________
  405. conv5_block3_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_2_conv[0][0]
  406. __________________________________________________________________________________________________
  407. conv5_block3_2_relu (Activation (None, 9, 9, 512) 0 conv5_block3_2_bn[0][0]
  408. __________________________________________________________________________________________________
  409. conv5_block3_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block3_2_relu[0][0]
  410. __________________________________________________________________________________________________
  411. conv5_block3_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block3_3_conv[0][0]
  412. __________________________________________________________________________________________________
  413. conv5_block3_add (Add) (None, 9, 9, 2048) 0 conv5_block2_out[0][0]
  414. conv5_block3_3_bn[0][0]
  415. __________________________________________________________________________________________________
  416. conv5_block3_out (Activation) (None, 9, 9, 2048) 0 conv5_block3_add[0][0]
  417. __________________________________________________________________________________________________
  418. avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0]
  419. __________________________________________________________________________________________________
  420. probs (Dense) (None, 3) 6147 avg_pool[0][0]
  421. ==================================================================================================
  422. Total params: 23,587,587
  423. Trainable params: 23,534,467
  424. Non-trainable params: 53,120
  425. __________________________________________________________________________________________________
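The head of this model (the `avg_pool` and `probs` layers) and the totals above can be sanity-checked with quick arithmetic. This sketch assumes the standard Keras ResNet50 layout, where the non-trainable parameters are the BatchNormalization moving means and variances:

```python
# Sanity checks on the parameter counts reported in the summary above.

# The "probs" Dense layer maps the 2048-dim pooled features to 3 classes:
dense_params = 2048 * 3 + 3          # kernel weights plus one bias per class
assert dense_params == 6147          # matches the summary line for "probs"

# Non-trainable params are the gap between the two totals; in ResNet50
# these are the BN moving statistics (2 per channel, per BN layer).
non_trainable = 23_587_587 - 23_534_467
assert non_trainable == 53_120       # matches "Non-trainable params"

print(dense_params, non_trainable)
```

The 3-unit Dense head indicates this is a 3-class classifier stacked on the pooled ResNet50 features.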
Train for 563.0 steps, validate for 179.0 steps
Epoch 1/10
2019-11-01 09:53:28.832570: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
2019-11-01 09:53:30.675227: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
563/563 [==============================] - 141s 250ms/step - loss: 0.1185 - accuracy: 0.9616 - val_loss: 0.5751 - val_accuracy: 0.8078
Epoch 2/10
563/563 [==============================] - 130s 231ms/step - loss: 0.0400 - accuracy: 0.9865 - val_loss: 0.8953 - val_accuracy: 0.7119
Epoch 3/10
563/563 [==============================] - 130s 231ms/step - loss: 0.0478 - accuracy: 0.9870 - val_loss: 25.3537 - val_accuracy: 0.3367
Epoch 4/10
563/563 [==============================] - 130s 230ms/step - loss: 0.0309 - accuracy: 0.9906 - val_loss: 0.0576 - val_accuracy: 0.9946
Epoch 5/10
563/563 [==============================] - 129s 230ms/step - loss: 0.0210 - accuracy: 0.9940 - val_loss: 0.0780 - val_accuracy: 0.9916
Epoch 6/10
563/563 [==============================] - 130s 230ms/step - loss: 0.0227 - accuracy: 0.9937 - val_loss: 0.0595 - val_accuracy: 0.9887
Epoch 7/10
563/563 [==============================] - 129s 230ms/step - loss: 0.0160 - accuracy: 0.9949 - val_loss: 0.0536 - val_accuracy: 0.9946
Epoch 8/10
 81/563 [===>..........................] - ETA: 1:39 - loss: 0.0222 - accuracy: 0.9945
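The epoch wall times in the log are consistent with the reported per-step times; a rough arithmetic cross-check (training steps only, ignoring validation and overhead):

```python
# Cross-check epoch wall time against the per-step time Keras reports.
steps = 563
print(round(steps * 0.250))  # epoch 1: ~141 s, matching "141s 250ms/step"
print(round(steps * 0.230))  # later epochs: ~129 s, matching "129-130s 230ms/step"
```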