TF Dump Training with distribution strategy
Nov 1st, 2019

Output with MirroredStrategy

ds_valid.shuffle(len(list_paths_valid), seed=42, reshuffle_each_iteration=True)

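For context, a minimal sketch of how such a validation pipeline is typically wired up together with MirroredStrategy (the contents of list_paths_valid, the load_example function, the PNG format, and the batch size are assumptions, not taken from this log):

import tensorflow as tf

# Hypothetical list of validation file paths; the real list comes from the
# training framework that produced the log below.
list_paths_valid = ["valid/img_0001.png", "valid/img_0002.png"]

def load_example(path):
    # Placeholder parse function; the actual decoding depends on the data format.
    image = tf.io.decode_png(tf.io.read_file(path), channels=1)
    image = tf.image.resize(image, (270, 270))
    label = tf.constant(0, dtype=tf.int32)  # replace with the real label lookup
    return image, label

strategy = tf.distribute.MirroredStrategy()  # picks up both visible GPUs

ds_valid = tf.data.Dataset.from_tensor_slices(list_paths_valid)
ds_valid = ds_valid.shuffle(len(list_paths_valid), seed=42,
                            reshuffle_each_iteration=True)
ds_valid = ds_valid.map(load_example,
                        num_parallel_calls=tf.data.experimental.AUTOTUNE)
ds_valid = ds_valid.batch(16).prefetch(tf.data.experimental.AUTOTUNE)
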
  6. 2019-11-01 10:25:31.215552: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
  7. 2019-11-01 10:25:31.295080: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
  8. pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
  9. coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
  10. 2019-11-01 10:25:31.295801: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
  11. pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
  12. coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
  13. 2019-11-01 10:25:31.296119: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
  14. 2019-11-01 10:25:31.298252: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
  15. 2019-11-01 10:25:31.299728: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
  16. 2019-11-01 10:25:31.300093: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
  17. 2019-11-01 10:25:31.302095: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
  18. 2019-11-01 10:25:31.303592: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
  19. 2019-11-01 10:25:31.308127: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
  20. 2019-11-01 10:25:31.310488: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
  21. 2019-11-01 10:25:31.310882: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
  22. 2019-11-01 10:25:31.354089: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2100000000 Hz
  23. 2019-11-01 10:25:31.365967: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x57ab6f0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
  24. 2019-11-01 10:25:31.366024: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
  25. 2019-11-01 10:25:31.584146: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x53772a0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
  26. 2019-11-01 10:25:31.584201: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce RTX 2080 Ti, Compute Capability 7.5
  27. 2019-11-01 10:25:31.584216: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (1): GeForce RTX 2080 Ti, Compute Capability 7.5
  28. 2019-11-01 10:25:31.585984: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
  29. pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
  30. coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
  31. 2019-11-01 10:25:31.587358: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
  32. pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
  33. coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
  34. 2019-11-01 10:25:31.587435: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
  35. 2019-11-01 10:25:31.587471: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
  36. 2019-11-01 10:25:31.587503: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
  37. 2019-11-01 10:25:31.587535: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
  38. 2019-11-01 10:25:31.587567: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
  39. 2019-11-01 10:25:31.587598: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
  40. 2019-11-01 10:25:31.587631: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
  41. 2019-11-01 10:25:31.592892: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
  42. 2019-11-01 10:25:31.592970: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
  43. 2019-11-01 10:25:31.597198: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1087] Device interconnect StreamExecutor with strength 1 edge matrix:
  44. 2019-11-01 10:25:31.597230: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1093] 0 1
  45. 2019-11-01 10:25:31.597248: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 0: N N
  46. 2019-11-01 10:25:31.597263: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 1: N N
  47. 2019-11-01 10:25:31.599301: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10312 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2080 Ti, pci bus id: 0000:86:00.0, compute capability: 7.5)
  48. 2019-11-01 10:25:31.600473: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10312 MB memory) -> physical GPU (device: 1, name: GeForce RTX 2080 Ti, pci bus id: 0000:af:00.0, compute capability: 7.5)
  49. 2019-11-01 09:25:31,621 | file_handler | INFO : found 24 datasets
  50. 2019-11-01 09:25:31,624 | file_handler | INFO : loaded evaluation_overview.csv
  51. 2019-11-01 09:25:31,627 | file_handler | INFO : found 272 trainings
  52. 2019-11-01 09:25:31,629 | deeplearning_tools | INFO : Train on trainUid : 8
  53. Model: "resnet50"
  54. __________________________________________________________________________________________________
  55. Layer (type) Output Shape Param # Connected to
  56. ==================================================================================================
  57. input_1 (InputLayer) [(None, 270, 270, 1) 0
  58. __________________________________________________________________________________________________
  59. conv1_pad (ZeroPadding2D) (None, 276, 276, 1) 0 input_1[0][0]
  60. __________________________________________________________________________________________________
  61. conv1_conv (Conv2D) (None, 135, 135, 64) 3200 conv1_pad[0][0]
  62. __________________________________________________________________________________________________
  63. conv1_bn (BatchNormalization) (None, 135, 135, 64) 256 conv1_conv[0][0]
  64. __________________________________________________________________________________________________
  65. conv1_relu (Activation) (None, 135, 135, 64) 0 conv1_bn[0][0]
  66. __________________________________________________________________________________________________
  67. pool1_pad (ZeroPadding2D) (None, 137, 137, 64) 0 conv1_relu[0][0]
  68. __________________________________________________________________________________________________
  69. pool1_pool (MaxPooling2D) (None, 68, 68, 64) 0 pool1_pad[0][0]
  70. __________________________________________________________________________________________________
  71. conv2_block1_1_conv (Conv2D) (None, 68, 68, 64) 4160 pool1_pool[0][0]
  72. __________________________________________________________________________________________________
  73. conv2_block1_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_1_conv[0][0]
  74. __________________________________________________________________________________________________
  75. conv2_block1_1_relu (Activation (None, 68, 68, 64) 0 conv2_block1_1_bn[0][0]
  76. __________________________________________________________________________________________________
  77. conv2_block1_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block1_1_relu[0][0]
  78. __________________________________________________________________________________________________
  79. conv2_block1_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_2_conv[0][0]
  80. __________________________________________________________________________________________________
  81. conv2_block1_2_relu (Activation (None, 68, 68, 64) 0 conv2_block1_2_bn[0][0]
  82. __________________________________________________________________________________________________
  83. conv2_block1_0_conv (Conv2D) (None, 68, 68, 256) 16640 pool1_pool[0][0]
  84. __________________________________________________________________________________________________
  85. conv2_block1_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block1_2_relu[0][0]
  86. __________________________________________________________________________________________________
  87. conv2_block1_0_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_0_conv[0][0]
  88. __________________________________________________________________________________________________
  89. conv2_block1_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_3_conv[0][0]
  90. __________________________________________________________________________________________________
  91. conv2_block1_add (Add) (None, 68, 68, 256) 0 conv2_block1_0_bn[0][0]
  92. conv2_block1_3_bn[0][0]
  93. __________________________________________________________________________________________________
  94. conv2_block1_out (Activation) (None, 68, 68, 256) 0 conv2_block1_add[0][0]
  95. __________________________________________________________________________________________________
  96. conv2_block2_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block1_out[0][0]
  97. __________________________________________________________________________________________________
  98. conv2_block2_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_1_conv[0][0]
  99. __________________________________________________________________________________________________
  100. conv2_block2_1_relu (Activation (None, 68, 68, 64) 0 conv2_block2_1_bn[0][0]
  101. __________________________________________________________________________________________________
  102. conv2_block2_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block2_1_relu[0][0]
  103. __________________________________________________________________________________________________
  104. conv2_block2_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_2_conv[0][0]
  105. __________________________________________________________________________________________________
  106. conv2_block2_2_relu (Activation (None, 68, 68, 64) 0 conv2_block2_2_bn[0][0]
  107. __________________________________________________________________________________________________
  108. conv2_block2_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block2_2_relu[0][0]
  109. __________________________________________________________________________________________________
  110. conv2_block2_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block2_3_conv[0][0]
  111. __________________________________________________________________________________________________
  112. conv2_block2_add (Add) (None, 68, 68, 256) 0 conv2_block1_out[0][0]
  113. conv2_block2_3_bn[0][0]
  114. __________________________________________________________________________________________________
  115. conv2_block2_out (Activation) (None, 68, 68, 256) 0 conv2_block2_add[0][0]
  116. __________________________________________________________________________________________________
  117. conv2_block3_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block2_out[0][0]
  118. __________________________________________________________________________________________________
  119. conv2_block3_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_1_conv[0][0]
  120. __________________________________________________________________________________________________
  121. conv2_block3_1_relu (Activation (None, 68, 68, 64) 0 conv2_block3_1_bn[0][0]
  122. __________________________________________________________________________________________________
  123. conv2_block3_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block3_1_relu[0][0]
  124. __________________________________________________________________________________________________
  125. conv2_block3_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_2_conv[0][0]
  126. __________________________________________________________________________________________________
  127. conv2_block3_2_relu (Activation (None, 68, 68, 64) 0 conv2_block3_2_bn[0][0]
  128. __________________________________________________________________________________________________
  129. conv2_block3_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block3_2_relu[0][0]
  130. __________________________________________________________________________________________________
  131. conv2_block3_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block3_3_conv[0][0]
  132. __________________________________________________________________________________________________
  133. conv2_block3_add (Add) (None, 68, 68, 256) 0 conv2_block2_out[0][0]
  134. conv2_block3_3_bn[0][0]
  135. __________________________________________________________________________________________________
  136. conv2_block3_out (Activation) (None, 68, 68, 256) 0 conv2_block3_add[0][0]
  137. __________________________________________________________________________________________________
  138. conv3_block1_1_conv (Conv2D) (None, 34, 34, 128) 32896 conv2_block3_out[0][0]
  139. __________________________________________________________________________________________________
  140. conv3_block1_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_1_conv[0][0]
  141. __________________________________________________________________________________________________
  142. conv3_block1_1_relu (Activation (None, 34, 34, 128) 0 conv3_block1_1_bn[0][0]
  143. __________________________________________________________________________________________________
  144. conv3_block1_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block1_1_relu[0][0]
  145. __________________________________________________________________________________________________
  146. conv3_block1_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_2_conv[0][0]
  147. __________________________________________________________________________________________________
  148. conv3_block1_2_relu (Activation (None, 34, 34, 128) 0 conv3_block1_2_bn[0][0]
  149. __________________________________________________________________________________________________
  150. conv3_block1_0_conv (Conv2D) (None, 34, 34, 512) 131584 conv2_block3_out[0][0]
  151. __________________________________________________________________________________________________
  152. conv3_block1_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block1_2_relu[0][0]
  153. __________________________________________________________________________________________________
  154. conv3_block1_0_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_0_conv[0][0]
  155. __________________________________________________________________________________________________
  156. conv3_block1_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_3_conv[0][0]
  157. __________________________________________________________________________________________________
  158. conv3_block1_add (Add) (None, 34, 34, 512) 0 conv3_block1_0_bn[0][0]
  159. conv3_block1_3_bn[0][0]
  160. __________________________________________________________________________________________________
  161. conv3_block1_out (Activation) (None, 34, 34, 512) 0 conv3_block1_add[0][0]
  162. __________________________________________________________________________________________________
  163. conv3_block2_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block1_out[0][0]
  164. __________________________________________________________________________________________________
  165. conv3_block2_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_1_conv[0][0]
  166. __________________________________________________________________________________________________
  167. conv3_block2_1_relu (Activation (None, 34, 34, 128) 0 conv3_block2_1_bn[0][0]
  168. __________________________________________________________________________________________________
  169. conv3_block2_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block2_1_relu[0][0]
  170. __________________________________________________________________________________________________
  171. conv3_block2_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_2_conv[0][0]
  172. __________________________________________________________________________________________________
  173. conv3_block2_2_relu (Activation (None, 34, 34, 128) 0 conv3_block2_2_bn[0][0]
  174. __________________________________________________________________________________________________
  175. conv3_block2_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block2_2_relu[0][0]
  176. __________________________________________________________________________________________________
  177. conv3_block2_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block2_3_conv[0][0]
  178. __________________________________________________________________________________________________
  179. conv3_block2_add (Add) (None, 34, 34, 512) 0 conv3_block1_out[0][0]
  180. conv3_block2_3_bn[0][0]
  181. __________________________________________________________________________________________________
  182. conv3_block2_out (Activation) (None, 34, 34, 512) 0 conv3_block2_add[0][0]
  183. __________________________________________________________________________________________________
  184. conv3_block3_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block2_out[0][0]
  185. __________________________________________________________________________________________________
  186. conv3_block3_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_1_conv[0][0]
  187. __________________________________________________________________________________________________
  188. conv3_block3_1_relu (Activation (None, 34, 34, 128) 0 conv3_block3_1_bn[0][0]
  189. __________________________________________________________________________________________________
  190. conv3_block3_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block3_1_relu[0][0]
  191. __________________________________________________________________________________________________
  192. conv3_block3_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_2_conv[0][0]
  193. __________________________________________________________________________________________________
  194. conv3_block3_2_relu (Activation (None, 34, 34, 128) 0 conv3_block3_2_bn[0][0]
  195. __________________________________________________________________________________________________
  196. conv3_block3_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block3_2_relu[0][0]
  197. __________________________________________________________________________________________________
  198. conv3_block3_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block3_3_conv[0][0]
  199. __________________________________________________________________________________________________
  200. conv3_block3_add (Add) (None, 34, 34, 512) 0 conv3_block2_out[0][0]
  201. conv3_block3_3_bn[0][0]
  202. __________________________________________________________________________________________________
  203. conv3_block3_out (Activation) (None, 34, 34, 512) 0 conv3_block3_add[0][0]
  204. __________________________________________________________________________________________________
  205. conv3_block4_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block3_out[0][0]
  206. __________________________________________________________________________________________________
  207. conv3_block4_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_1_conv[0][0]
  208. __________________________________________________________________________________________________
  209. conv3_block4_1_relu (Activation (None, 34, 34, 128) 0 conv3_block4_1_bn[0][0]
  210. __________________________________________________________________________________________________
  211. conv3_block4_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block4_1_relu[0][0]
  212. __________________________________________________________________________________________________
  213. conv3_block4_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_2_conv[0][0]
  214. __________________________________________________________________________________________________
  215. conv3_block4_2_relu (Activation (None, 34, 34, 128) 0 conv3_block4_2_bn[0][0]
  216. __________________________________________________________________________________________________
  217. conv3_block4_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block4_2_relu[0][0]
  218. __________________________________________________________________________________________________
  219. conv3_block4_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block4_3_conv[0][0]
  220. __________________________________________________________________________________________________
  221. conv3_block4_add (Add) (None, 34, 34, 512) 0 conv3_block3_out[0][0]
  222. conv3_block4_3_bn[0][0]
  223. __________________________________________________________________________________________________
  224. conv3_block4_out (Activation) (None, 34, 34, 512) 0 conv3_block4_add[0][0]
  225. __________________________________________________________________________________________________
  226. conv4_block1_1_conv (Conv2D) (None, 17, 17, 256) 131328 conv3_block4_out[0][0]
  227. __________________________________________________________________________________________________
  228. conv4_block1_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_1_conv[0][0]
  229. __________________________________________________________________________________________________
  230. conv4_block1_1_relu (Activation (None, 17, 17, 256) 0 conv4_block1_1_bn[0][0]
  231. __________________________________________________________________________________________________
  232. conv4_block1_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block1_1_relu[0][0]
  233. __________________________________________________________________________________________________
  234. conv4_block1_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_2_conv[0][0]
  235. __________________________________________________________________________________________________
  236. conv4_block1_2_relu (Activation (None, 17, 17, 256) 0 conv4_block1_2_bn[0][0]
  237. __________________________________________________________________________________________________
  238. conv4_block1_0_conv (Conv2D) (None, 17, 17, 1024) 525312 conv3_block4_out[0][0]
  239. __________________________________________________________________________________________________
  240. conv4_block1_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block1_2_relu[0][0]
  241. __________________________________________________________________________________________________
  242. conv4_block1_0_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_0_conv[0][0]
  243. __________________________________________________________________________________________________
  244. conv4_block1_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_3_conv[0][0]
  245. __________________________________________________________________________________________________
  246. conv4_block1_add (Add) (None, 17, 17, 1024) 0 conv4_block1_0_bn[0][0]
  247. conv4_block1_3_bn[0][0]
  248. __________________________________________________________________________________________________
  249. conv4_block1_out (Activation) (None, 17, 17, 1024) 0 conv4_block1_add[0][0]
  250. __________________________________________________________________________________________________
  251. conv4_block2_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block1_out[0][0]
  252. __________________________________________________________________________________________________
  253. conv4_block2_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_1_conv[0][0]
  254. __________________________________________________________________________________________________
  255. conv4_block2_1_relu (Activation (None, 17, 17, 256) 0 conv4_block2_1_bn[0][0]
  256. __________________________________________________________________________________________________
  257. conv4_block2_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block2_1_relu[0][0]
  258. __________________________________________________________________________________________________
  259. conv4_block2_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_2_conv[0][0]
  260. __________________________________________________________________________________________________
  261. conv4_block2_2_relu (Activation (None, 17, 17, 256) 0 conv4_block2_2_bn[0][0]
  262. __________________________________________________________________________________________________
  263. conv4_block2_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block2_2_relu[0][0]
  264. __________________________________________________________________________________________________
  265. conv4_block2_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block2_3_conv[0][0]
  266. __________________________________________________________________________________________________
  267. conv4_block2_add (Add) (None, 17, 17, 1024) 0 conv4_block1_out[0][0]
  268. conv4_block2_3_bn[0][0]
  269. __________________________________________________________________________________________________
  270. conv4_block2_out (Activation) (None, 17, 17, 1024) 0 conv4_block2_add[0][0]
  271. __________________________________________________________________________________________________
  272. conv4_block3_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block2_out[0][0]
  273. __________________________________________________________________________________________________
  274. conv4_block3_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_1_conv[0][0]
  275. __________________________________________________________________________________________________
  276. conv4_block3_1_relu (Activation (None, 17, 17, 256) 0 conv4_block3_1_bn[0][0]
  277. __________________________________________________________________________________________________
  278. conv4_block3_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block3_1_relu[0][0]
  279. __________________________________________________________________________________________________
  280. conv4_block3_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_2_conv[0][0]
  281. __________________________________________________________________________________________________
  282. conv4_block3_2_relu (Activation (None, 17, 17, 256) 0 conv4_block3_2_bn[0][0]
  283. __________________________________________________________________________________________________
  284. conv4_block3_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block3_2_relu[0][0]
  285. __________________________________________________________________________________________________
  286. conv4_block3_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block3_3_conv[0][0]
  287. __________________________________________________________________________________________________
  288. conv4_block3_add (Add) (None, 17, 17, 1024) 0 conv4_block2_out[0][0]
  289. conv4_block3_3_bn[0][0]
  290. __________________________________________________________________________________________________
  291. conv4_block3_out (Activation) (None, 17, 17, 1024) 0 conv4_block3_add[0][0]
  292. __________________________________________________________________________________________________
  293. conv4_block4_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block3_out[0][0]
  294. __________________________________________________________________________________________________
  295. conv4_block4_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_1_conv[0][0]
  296. __________________________________________________________________________________________________
  297. conv4_block4_1_relu (Activation (None, 17, 17, 256) 0 conv4_block4_1_bn[0][0]
  298. __________________________________________________________________________________________________
  299. conv4_block4_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block4_1_relu[0][0]
  300. __________________________________________________________________________________________________
  301. conv4_block4_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_2_conv[0][0]
  302. __________________________________________________________________________________________________
  303. conv4_block4_2_relu (Activation (None, 17, 17, 256) 0 conv4_block4_2_bn[0][0]
  304. __________________________________________________________________________________________________
  305. conv4_block4_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block4_2_relu[0][0]
  306. __________________________________________________________________________________________________
  307. conv4_block4_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block4_3_conv[0][0]
  308. __________________________________________________________________________________________________
  309. conv4_block4_add (Add) (None, 17, 17, 1024) 0 conv4_block3_out[0][0]
  310. conv4_block4_3_bn[0][0]
  311. __________________________________________________________________________________________________
  312. conv4_block4_out (Activation) (None, 17, 17, 1024) 0 conv4_block4_add[0][0]
  313. __________________________________________________________________________________________________
  314. conv4_block5_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block4_out[0][0]
  315. __________________________________________________________________________________________________
  316. conv4_block5_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_1_conv[0][0]
  317. __________________________________________________________________________________________________
  318. conv4_block5_1_relu (Activation (None, 17, 17, 256) 0 conv4_block5_1_bn[0][0]
  319. __________________________________________________________________________________________________
  320. conv4_block5_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block5_1_relu[0][0]
  321. __________________________________________________________________________________________________
  322. conv4_block5_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_2_conv[0][0]
  323. __________________________________________________________________________________________________
  324. conv4_block5_2_relu (Activation (None, 17, 17, 256) 0 conv4_block5_2_bn[0][0]
  325. __________________________________________________________________________________________________
  326. conv4_block5_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block5_2_relu[0][0]
  327. __________________________________________________________________________________________________
  328. conv4_block5_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block5_3_conv[0][0]
  329. __________________________________________________________________________________________________
  330. conv4_block5_add (Add) (None, 17, 17, 1024) 0 conv4_block4_out[0][0]
  331. conv4_block5_3_bn[0][0]
  332. __________________________________________________________________________________________________
  333. conv4_block5_out (Activation) (None, 17, 17, 1024) 0 conv4_block5_add[0][0]
  334. __________________________________________________________________________________________________
  335. conv4_block6_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block5_out[0][0]
  336. __________________________________________________________________________________________________
  337. conv4_block6_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_1_conv[0][0]
  338. __________________________________________________________________________________________________
  339. conv4_block6_1_relu (Activation (None, 17, 17, 256) 0 conv4_block6_1_bn[0][0]
  340. __________________________________________________________________________________________________
  341. conv4_block6_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block6_1_relu[0][0]
  342. __________________________________________________________________________________________________
  343. conv4_block6_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_2_conv[0][0]
  344. __________________________________________________________________________________________________
  345. conv4_block6_2_relu (Activation (None, 17, 17, 256) 0 conv4_block6_2_bn[0][0]
  346. __________________________________________________________________________________________________
  347. conv4_block6_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block6_2_relu[0][0]
  348. __________________________________________________________________________________________________
  349. conv4_block6_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block6_3_conv[0][0]
  350. __________________________________________________________________________________________________
  351. conv4_block6_add (Add) (None, 17, 17, 1024) 0 conv4_block5_out[0][0]
  352. conv4_block6_3_bn[0][0]
  353. __________________________________________________________________________________________________
  354. conv4_block6_out (Activation) (None, 17, 17, 1024) 0 conv4_block6_add[0][0]
  355. __________________________________________________________________________________________________
  356. conv5_block1_1_conv (Conv2D) (None, 9, 9, 512) 524800 conv4_block6_out[0][0]
  357. __________________________________________________________________________________________________
  358. conv5_block1_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_1_conv[0][0]
  359. __________________________________________________________________________________________________
  360. conv5_block1_1_relu (Activation (None, 9, 9, 512) 0 conv5_block1_1_bn[0][0]
  361. __________________________________________________________________________________________________
  362. conv5_block1_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block1_1_relu[0][0]
  363. __________________________________________________________________________________________________
  364. conv5_block1_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_2_conv[0][0]
  365. __________________________________________________________________________________________________
  366. conv5_block1_2_relu (Activation (None, 9, 9, 512) 0 conv5_block1_2_bn[0][0]
  367. __________________________________________________________________________________________________
  368. conv5_block1_0_conv (Conv2D) (None, 9, 9, 2048) 2099200 conv4_block6_out[0][0]
  369. __________________________________________________________________________________________________
  370. conv5_block1_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block1_2_relu[0][0]
  371. __________________________________________________________________________________________________
  372. conv5_block1_0_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_0_conv[0][0]
  373. __________________________________________________________________________________________________
  374. conv5_block1_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_3_conv[0][0]
  375. __________________________________________________________________________________________________
  376. conv5_block1_add (Add) (None, 9, 9, 2048) 0 conv5_block1_0_bn[0][0]
  377. conv5_block1_3_bn[0][0]
  378. __________________________________________________________________________________________________
  379. conv5_block1_out (Activation) (None, 9, 9, 2048) 0 conv5_block1_add[0][0]
  380. __________________________________________________________________________________________________
  381. conv5_block2_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block1_out[0][0]
  382. __________________________________________________________________________________________________
  383. conv5_block2_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_1_conv[0][0]
  384. __________________________________________________________________________________________________
  385. conv5_block2_1_relu (Activation (None, 9, 9, 512) 0 conv5_block2_1_bn[0][0]
  386. __________________________________________________________________________________________________
  387. conv5_block2_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block2_1_relu[0][0]
  388. __________________________________________________________________________________________________
  389. conv5_block2_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_2_conv[0][0]
  390. __________________________________________________________________________________________________
  391. conv5_block2_2_relu (Activation (None, 9, 9, 512) 0 conv5_block2_2_bn[0][0]
  392. __________________________________________________________________________________________________
  393. conv5_block2_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block2_2_relu[0][0]
  394. __________________________________________________________________________________________________
  395. conv5_block2_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block2_3_conv[0][0]
  396. __________________________________________________________________________________________________
  397. conv5_block2_add (Add) (None, 9, 9, 2048) 0 conv5_block1_out[0][0]
  398. conv5_block2_3_bn[0][0]
  399. __________________________________________________________________________________________________
  400. conv5_block2_out (Activation) (None, 9, 9, 2048) 0 conv5_block2_add[0][0]
  401. __________________________________________________________________________________________________
  402. conv5_block3_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block2_out[0][0]
  403. __________________________________________________________________________________________________
  404. conv5_block3_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_1_conv[0][0]
  405. __________________________________________________________________________________________________
  406. conv5_block3_1_relu (Activation (None, 9, 9, 512) 0 conv5_block3_1_bn[0][0]
  407. __________________________________________________________________________________________________
  408. conv5_block3_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block3_1_relu[0][0]
  409. __________________________________________________________________________________________________
  410. conv5_block3_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_2_conv[0][0]
  411. __________________________________________________________________________________________________
  412. conv5_block3_2_relu (Activation (None, 9, 9, 512) 0 conv5_block3_2_bn[0][0]
  413. __________________________________________________________________________________________________
  414. conv5_block3_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block3_2_relu[0][0]
  415. __________________________________________________________________________________________________
  416. conv5_block3_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block3_3_conv[0][0]
  417. __________________________________________________________________________________________________
  418. conv5_block3_add (Add) (None, 9, 9, 2048) 0 conv5_block2_out[0][0]
  419. conv5_block3_3_bn[0][0]
  420. __________________________________________________________________________________________________
  421. conv5_block3_out (Activation) (None, 9, 9, 2048) 0 conv5_block3_add[0][0]
  422. __________________________________________________________________________________________________
  423. avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0]
  424. __________________________________________________________________________________________________
  425. probs (Dense) (None, 3) 6147 avg_pool[0][0]
  426. ==================================================================================================
  427. Total params: 23,587,587
  428. Trainable params: 23,534,467
  429. Non-trainable params: 53,120
  430. __________________________________________________________________________________________________
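The summary above corresponds to a ResNet50 from tf.keras.applications built with a single-channel 270x270 input and a 3-class softmax head ("probs"), which is what gives the 3,200-parameter conv1_conv and the 23,587,587 total. A minimal sketch of how such a model is presumably constructed and trained under the strategy (the optimizer, the loss, and the way ds_train is built are assumptions, not taken from this log):

import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Randomly initialised weights: ImageNet weights cannot be loaded onto a
    # 1-channel input, hence weights=None.
    model = tf.keras.applications.ResNet50(weights=None,
                                           input_shape=(270, 270, 1),
                                           classes=3)
    model.compile(optimizer=tf.keras.optimizers.Adam(),    # assumed optimizer
                  loss="sparse_categorical_crossentropy",  # assumed loss
                  metrics=["accuracy"])

model.summary()  # prints the table shown above

# ds_train / ds_valid are tf.data pipelines like the one sketched at the top of
# this paste; the 563 training and 179 validation steps per epoch match the
# "Train for 563.0 steps, validate for 179.0 steps" line below.
model.fit(ds_train,
          epochs=10,
          steps_per_epoch=563,
          validation_data=ds_valid,
          validation_steps=179)
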
  431. Train for 563.0 steps, validate for 179.0 steps
  432. Epoch 1/10
  433. 2019-11-01 10:25:52.942924: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
  434. 2019-11-01 10:25:54.882248: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
  435. 563/563 [==============================] - 119s 211ms/step - loss: 1.0535 - accuracy: 0.5099 - val_loss: 1.0735 - val_accuracy: 0.6682
  436. Epoch 2/10
  437. 563/563 [==============================] - 95s 169ms/step - loss: 1.0123 - accuracy: 0.5277 - val_loss: 1.0721 - val_accuracy: 0.6682
  438. Epoch 3/10
  439. 563/563 [==============================] - 95s 169ms/step - loss: 1.0121 - accuracy: 0.5277 - val_loss: 1.0709 - val_accuracy: 0.6682
  440. Epoch 4/10
  441. 563/563 [==============================] - 95s 169ms/step - loss: 1.0124 - accuracy: 0.5277 - val_loss: 1.0667 - val_accuracy: 0.6682
  442. Epoch 5/10
  443. 563/563 [==============================] - 95s 169ms/step - loss: 1.0121 - accuracy: 0.5277 - val_loss: 1.0687 - val_accuracy: 0.6682
  444. Epoch 6/10
  445. 563/563 [==============================] - 95s 168ms/step - loss: 1.0125 - accuracy: 0.5277 - val_loss: 1.0638 - val_accuracy: 0.6682
  446. Epoch 7/10
  447. 563/563 [==============================] - 94s 167ms/step - loss: 1.0125 - accuracy: 0.5277 - val_loss: 1.0639 - val_accuracy: 0.6682
  448. Epoch 8/10
  449. 400/563 [====================>.........] - ETA: 24s - loss: 1.0135 - accuracy: 0.5268