- Output with MirroredStrategy
- ds_valid.shuffle(len(list_paths_valid), seed=42, reshuffle_each_iteration=True)
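The `shuffle` call above uses a fixed seed together with `reshuffle_each_iteration=True`: the validation set gets a different order every epoch, but the whole sequence of orders is reproducible across runs. A minimal pure-Python sketch of that behavior (names like `seeded_epoch_shuffles` are illustrative, not part of tf.data):

```python
import random

def seeded_epoch_shuffles(items, num_epochs, seed=42):
    # One RNG seeded once and reused across epochs: each epoch gets
    # a different order, but the full run is reproducible.
    rng = random.Random(seed)
    epochs = []
    for _ in range(num_epochs):
        order = list(items)
        rng.shuffle(order)
        epochs.append(order)
    return epochs

run1 = seeded_epoch_shuffles(range(10), num_epochs=2)
run2 = seeded_epoch_shuffles(range(10), num_epochs=2)
```

Re-running with the same seed reproduces the exact sequence of epoch orders, while consecutive epochs within a run are shuffled differently.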
- 2019-11-01 10:25:31.215552: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
- 2019-11-01 10:25:31.295080: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
- pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 10:25:31.295801: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
- pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 10:25:31.296119: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 10:25:31.298252: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 10:25:31.299728: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
- 2019-11-01 10:25:31.300093: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
- 2019-11-01 10:25:31.302095: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
- 2019-11-01 10:25:31.303592: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
- 2019-11-01 10:25:31.308127: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 2019-11-01 10:25:31.310488: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
- 2019-11-01 10:25:31.310882: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
- 2019-11-01 10:25:31.354089: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2100000000 Hz
- 2019-11-01 10:25:31.365967: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x57ab6f0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
- 2019-11-01 10:25:31.366024: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
- 2019-11-01 10:25:31.584146: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x53772a0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
- 2019-11-01 10:25:31.584201: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce RTX 2080 Ti, Compute Capability 7.5
- 2019-11-01 10:25:31.584216: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (1): GeForce RTX 2080 Ti, Compute Capability 7.5
- 2019-11-01 10:25:31.585984: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
- pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 10:25:31.587358: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
- pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 10:25:31.587435: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 10:25:31.587471: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 10:25:31.587503: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
- 2019-11-01 10:25:31.587535: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
- 2019-11-01 10:25:31.587567: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
- 2019-11-01 10:25:31.587598: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
- 2019-11-01 10:25:31.587631: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 2019-11-01 10:25:31.592892: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
- 2019-11-01 10:25:31.592970: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 10:25:31.597198: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1087] Device interconnect StreamExecutor with strength 1 edge matrix:
- 2019-11-01 10:25:31.597230: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1093] 0 1
- 2019-11-01 10:25:31.597248: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 0: N N
- 2019-11-01 10:25:31.597263: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 1: N N
- 2019-11-01 10:25:31.599301: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10312 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2080 Ti, pci bus id: 0000:86:00.0, compute capability: 7.5)
- 2019-11-01 10:25:31.600473: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10312 MB memory) -> physical GPU (device: 1, name: GeForce RTX 2080 Ti, pci bus id: 0000:af:00.0, compute capability: 7.5)
- 2019-11-01 09:25:31,621 | file_handler | INFO : found 24 datasets
- 2019-11-01 09:25:31,624 | file_handler | INFO : loaded evaluation_overview.csv
- 2019-11-01 09:25:31,627 | file_handler | INFO : found 272 trainings
- 2019-11-01 09:25:31,629 | deeplearning_tools | INFO : Train on trainUid : 8
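The shapes and parameter counts in the summary below can be sanity-checked by hand. A small pure-Python check of a few rows, using the standard Conv2D/BatchNormalization parameter formulas and the 'valid'-padding output-size rule (not tied to this codebase):

```python
def conv_out(size, kernel, stride):
    # 'valid' padding: floor((size - kernel) / stride) + 1
    return (size - kernel) // stride + 1

def conv_params(kernel, c_in, c_out):
    # kernel*kernel weights per in/out channel pair, plus one bias per output channel
    return kernel * kernel * c_in * c_out + c_out

def bn_params(channels):
    # gamma, beta, moving mean, moving variance
    return 4 * channels

# conv1: 270 -> ZeroPadding2D(3) -> 276 -> 7x7 stride-2 conv -> 135
assert conv_out(270 + 2 * 3, 7, 2) == 135
# pool1: 135 -> ZeroPadding2D(1) -> 137 -> 3x3 stride-2 max-pool -> 68
assert conv_out(135 + 2 * 1, 3, 2) == 68
# stage transitions: 1x1 stride-2 convs halve the spatial size
assert conv_out(68, 1, 2) == 34 and conv_out(34, 1, 2) == 17 and conv_out(17, 1, 2) == 9

print(conv_params(7, 1, 64))    # conv1_conv:          3200
print(conv_params(1, 64, 64))   # conv2_block1_1_conv: 4160
print(conv_params(3, 64, 64))   # conv2_block1_2_conv: 36928
print(conv_params(1, 64, 256))  # conv2_block1_3_conv: 16640
print(bn_params(64))            # conv1_bn:            256
```

These match the `Param #` column printed by Keras for the corresponding layers, which confirms the single-channel (None, 270, 270, 1) input is what produces the 3200-parameter first convolution.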
- Model: "resnet50"
- __________________________________________________________________________________________________
- Layer (type) Output Shape Param # Connected to
- ==================================================================================================
- input_1 (InputLayer) [(None, 270, 270, 1) 0
- __________________________________________________________________________________________________
- conv1_pad (ZeroPadding2D) (None, 276, 276, 1) 0 input_1[0][0]
- __________________________________________________________________________________________________
- conv1_conv (Conv2D) (None, 135, 135, 64) 3200 conv1_pad[0][0]
- __________________________________________________________________________________________________
- conv1_bn (BatchNormalization) (None, 135, 135, 64) 256 conv1_conv[0][0]
- __________________________________________________________________________________________________
- conv1_relu (Activation) (None, 135, 135, 64) 0 conv1_bn[0][0]
- __________________________________________________________________________________________________
- pool1_pad (ZeroPadding2D) (None, 137, 137, 64) 0 conv1_relu[0][0]
- __________________________________________________________________________________________________
- pool1_pool (MaxPooling2D) (None, 68, 68, 64) 0 pool1_pad[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_conv (Conv2D) (None, 68, 68, 64) 4160 pool1_pool[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_relu (Activation (None, 68, 68, 64) 0 conv2_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_relu (Activation (None, 68, 68, 64) 0 conv2_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_0_conv (Conv2D) (None, 68, 68, 256) 16640 pool1_pool[0][0]
- __________________________________________________________________________________________________
- conv2_block1_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block1_0_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_add (Add) (None, 68, 68, 256) 0 conv2_block1_0_bn[0][0]
- conv2_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_out (Activation) (None, 68, 68, 256) 0 conv2_block1_add[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block1_out[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_relu (Activation (None, 68, 68, 64) 0 conv2_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_relu (Activation (None, 68, 68, 64) 0 conv2_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block2_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_add (Add) (None, 68, 68, 256) 0 conv2_block1_out[0][0]
- conv2_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_out (Activation) (None, 68, 68, 256) 0 conv2_block2_add[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block2_out[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_relu (Activation (None, 68, 68, 64) 0 conv2_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_relu (Activation (None, 68, 68, 64) 0 conv2_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block3_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_add (Add) (None, 68, 68, 256) 0 conv2_block2_out[0][0]
- conv2_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_out (Activation) (None, 68, 68, 256) 0 conv2_block3_add[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_conv (Conv2D) (None, 34, 34, 128) 32896 conv2_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_relu (Activation (None, 34, 34, 128) 0 conv3_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_relu (Activation (None, 34, 34, 128) 0 conv3_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_0_conv (Conv2D) (None, 34, 34, 512) 131584 conv2_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block1_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block1_0_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_add (Add) (None, 34, 34, 512) 0 conv3_block1_0_bn[0][0]
- conv3_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_out (Activation) (None, 34, 34, 512) 0 conv3_block1_add[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block1_out[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_relu (Activation (None, 34, 34, 128) 0 conv3_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_relu (Activation (None, 34, 34, 128) 0 conv3_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block2_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_add (Add) (None, 34, 34, 512) 0 conv3_block1_out[0][0]
- conv3_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_out (Activation) (None, 34, 34, 512) 0 conv3_block2_add[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block2_out[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_relu (Activation (None, 34, 34, 128) 0 conv3_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_relu (Activation (None, 34, 34, 128) 0 conv3_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block3_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_add (Add) (None, 34, 34, 512) 0 conv3_block2_out[0][0]
- conv3_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_out (Activation) (None, 34, 34, 512) 0 conv3_block3_add[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_relu (Activation (None, 34, 34, 128) 0 conv3_block4_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block4_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_relu (Activation (None, 34, 34, 128) 0 conv3_block4_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block4_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block4_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block4_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_add (Add) (None, 34, 34, 512) 0 conv3_block3_out[0][0]
- conv3_block4_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_out (Activation) (None, 34, 34, 512) 0 conv3_block4_add[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_conv (Conv2D) (None, 17, 17, 256) 131328 conv3_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_relu (Activation (None, 17, 17, 256) 0 conv4_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_relu (Activation (None, 17, 17, 256) 0 conv4_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_0_conv (Conv2D) (None, 17, 17, 1024) 525312 conv3_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block1_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block1_0_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_add (Add) (None, 17, 17, 1024) 0 conv4_block1_0_bn[0][0]
- conv4_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_out (Activation) (None, 17, 17, 1024) 0 conv4_block1_add[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block1_out[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_relu (Activation (None, 17, 17, 256) 0 conv4_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_relu (Activation (None, 17, 17, 256) 0 conv4_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block2_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_add (Add) (None, 17, 17, 1024) 0 conv4_block1_out[0][0]
- conv4_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_out (Activation) (None, 17, 17, 1024) 0 conv4_block2_add[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block2_out[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_relu (Activation (None, 17, 17, 256) 0 conv4_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_relu (Activation (None, 17, 17, 256) 0 conv4_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block3_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_add (Add) (None, 17, 17, 1024) 0 conv4_block2_out[0][0]
- conv4_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_out (Activation) (None, 17, 17, 1024) 0 conv4_block3_add[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block3_out[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_relu (Activation (None, 17, 17, 256) 0 conv4_block4_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block4_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_relu (Activation (None, 17, 17, 256) 0 conv4_block4_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block4_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block4_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block4_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_add (Add) (None, 17, 17, 1024) 0 conv4_block3_out[0][0]
- conv4_block4_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_out (Activation) (None, 17, 17, 1024) 0 conv4_block4_add[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_relu (Activation (None, 17, 17, 256) 0 conv4_block5_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block5_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_relu (Activation (None, 17, 17, 256) 0 conv4_block5_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block5_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block5_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block5_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_add (Add) (None, 17, 17, 1024) 0 conv4_block4_out[0][0]
- conv4_block5_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_out (Activation) (None, 17, 17, 1024) 0 conv4_block5_add[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block5_out[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_relu (Activation (None, 17, 17, 256) 0 conv4_block6_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block6_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_relu (Activation (None, 17, 17, 256) 0 conv4_block6_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block6_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block6_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block6_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_add (Add) (None, 17, 17, 1024) 0 conv4_block5_out[0][0]
- conv4_block6_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_out (Activation) (None, 17, 17, 1024) 0 conv4_block6_add[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_conv (Conv2D) (None, 9, 9, 512) 524800 conv4_block6_out[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_relu (Activation (None, 9, 9, 512) 0 conv5_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_relu (Activation (None, 9, 9, 512) 0 conv5_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_0_conv (Conv2D) (None, 9, 9, 2048) 2099200 conv4_block6_out[0][0]
- __________________________________________________________________________________________________
- conv5_block1_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block1_0_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_add (Add) (None, 9, 9, 2048) 0 conv5_block1_0_bn[0][0]
- conv5_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_out (Activation) (None, 9, 9, 2048) 0 conv5_block1_add[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block1_out[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_relu (Activation (None, 9, 9, 512) 0 conv5_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_relu (Activation (None, 9, 9, 512) 0 conv5_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block2_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_add (Add) (None, 9, 9, 2048) 0 conv5_block1_out[0][0]
- conv5_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_out (Activation) (None, 9, 9, 2048) 0 conv5_block2_add[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block2_out[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_relu (Activation (None, 9, 9, 512) 0 conv5_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_relu (Activation (None, 9, 9, 512) 0 conv5_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block3_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_add (Add) (None, 9, 9, 2048) 0 conv5_block2_out[0][0]
- conv5_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_out (Activation) (None, 9, 9, 2048) 0 conv5_block3_add[0][0]
- __________________________________________________________________________________________________
- avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0]
- __________________________________________________________________________________________________
- probs (Dense) (None, 3) 6147 avg_pool[0][0]
- ==================================================================================================
- Total params: 23,587,587
- Trainable params: 23,534,467
- Non-trainable params: 53,120
- __________________________________________________________________________________________________
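The per-layer parameter counts in the summary above can be sanity-checked by hand: a Keras `Conv2D` layer has `kh*kw*in_channels*out_channels + out_channels` parameters (weights plus bias), a `BatchNormalization` layer has `4*channels` (gamma, beta, and the two non-trainable moving statistics), and the final `Dense` head has `in_features*units + units`. A minimal pure-Python check, with layer shapes read off the summary:

```python
# Sanity-check parameter counts from the model summary above.

def conv2d_params(kh, kw, cin, cout):
    # kernel weights + bias (Keras Conv2D uses a bias by default)
    return kh * kw * cin * cout + cout

def batchnorm_params(channels):
    # gamma, beta, moving_mean, moving_variance (the last two are non-trainable)
    return 4 * channels

def dense_params(fin, units):
    return fin * units + units

# conv5_block1_3_conv: 1x1 conv, 512 -> 2048 channels
assert conv2d_params(1, 1, 512, 2048) == 1050624
# conv5_block3_2_conv: 3x3 conv, 512 -> 512 channels
assert conv2d_params(3, 3, 512, 512) == 2359808
# conv5_block1_3_bn: BatchNormalization over 2048 channels
assert batchnorm_params(2048) == 8192
# probs: Dense head, 2048 features -> 3 classes
assert dense_params(2048, 3) == 6147
print("all layer counts match the summary")
```

The non-trainable total of 53,120 is likewise the sum of the moving-mean and moving-variance slots (two per channel) across all BatchNormalization layers in the ResNet50 backbone.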
- Train for 563.0 steps, validate for 179.0 steps
- Epoch 1/10
- 2019-11-01 10:25:52.942924: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 10:25:54.882248: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 563/563 [==============================] - 119s 211ms/step - loss: 1.0535 - accuracy: 0.5099 - val_loss: 1.0735 - val_accuracy: 0.6682
- Epoch 2/10
- 563/563 [==============================] - 95s 169ms/step - loss: 1.0123 - accuracy: 0.5277 - val_loss: 1.0721 - val_accuracy: 0.6682
- Epoch 3/10
- 563/563 [==============================] - 95s 169ms/step - loss: 1.0121 - accuracy: 0.5277 - val_loss: 1.0709 - val_accuracy: 0.6682
- Epoch 4/10
- 563/563 [==============================] - 95s 169ms/step - loss: 1.0124 - accuracy: 0.5277 - val_loss: 1.0667 - val_accuracy: 0.6682
- Epoch 5/10
- 563/563 [==============================] - 95s 169ms/step - loss: 1.0121 - accuracy: 0.5277 - val_loss: 1.0687 - val_accuracy: 0.6682
- Epoch 6/10
- 563/563 [==============================] - 95s 168ms/step - loss: 1.0125 - accuracy: 0.5277 - val_loss: 1.0638 - val_accuracy: 0.6682
- Epoch 7/10
- 563/563 [==============================] - 94s 167ms/step - loss: 1.0125 - accuracy: 0.5277 - val_loss: 1.0639 - val_accuracy: 0.6682
- Epoch 8/10
- 400/563 [====================>.........] - ETA: 24s - loss: 1.0135 - accuracy: 0.5268
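One pattern worth noting in the log above: from epoch 2 onward the training accuracy is pinned at 0.5277 and the validation accuracy at 0.6682, epoch after epoch, while the loss barely moves. Accuracy frozen at a fixed fraction like this is the classic signature of a network that has collapsed to predicting a single class, in which case accuracy simply equals that class's frequency in each split. The sketch below illustrates the arithmetic; the label counts in it are assumptions chosen for illustration, not taken from the actual dataset:

```python
# If a model degenerates to always predicting one class, its accuracy
# equals that class's share of the dataset. The label counts below are
# hypothetical, chosen only to illustrate the effect.

from collections import Counter

def constant_predictor_accuracy(labels, predicted_class):
    """Accuracy of a model that outputs `predicted_class` for every sample."""
    counts = Counter(labels)
    return counts[predicted_class] / len(labels)

# A hypothetical 3-class validation split where class 0 is ~66.8% of samples:
val_labels = [0] * 668 + [1] * 200 + [2] * 132
acc = constant_predictor_accuracy(val_labels, predicted_class=0)
print(round(acc, 4))  # 0.668
```

If the real class frequencies match 0.5277 (train) and 0.6682 (validation), that would confirm the collapse; common causes in a MirroredStrategy setup include a learning rate that was not adjusted for the larger global batch, or a model/optimizer built outside `strategy.scope()`.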