- Output with '/gpu:0'
- ds_valid.shuffle(len(list_paths_train), seed=42, reshuffle_each_iteration=True)
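A note on the shuffle call above: with a fixed seed the ordering is reproducible across runs, while reshuffle_each_iteration=True still gives a different permutation each epoch; also, the buffer size is taken from len(list_paths_train) even though the dataset being shuffled is ds_valid, which may be a copy-paste slip worth double-checking. The tf.data call itself needs TensorFlow, but the seeded-shuffle semantics can be sketched in plain Python (list_paths here is a hypothetical stand-in for the path list):

```python
import random

# Hypothetical stand-in for the file list fed to the dataset.
list_paths = [f"sample_{i:02d}.png" for i in range(6)]

def seeded_shuffle(items, seed):
    """Return a seeded, reproducible permutation (mirrors shuffle(..., seed=42))."""
    rng = random.Random(seed)
    out = list(items)
    rng.shuffle(out)
    return out

epoch_a = seeded_shuffle(list_paths, seed=42)
epoch_b = seeded_shuffle(list_paths, seed=42)
assert epoch_a == epoch_b                     # same seed -> same order across runs
assert sorted(epoch_a) == sorted(list_paths)  # a permutation, nothing lost
```

If full-dataset shuffling is the goal, the buffer size should be at least the number of elements in the dataset actually being shuffled.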
- 2019-11-01 09:53:21.406893: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
- 2019-11-01 09:53:21.450823: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
- pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 09:53:21.451467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
- pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 09:53:21.451755: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 09:53:21.453373: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 09:53:21.454649: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
- 2019-11-01 09:53:21.455396: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
- 2019-11-01 09:53:21.456894: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
- 2019-11-01 09:53:21.458026: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
- 2019-11-01 09:53:21.461598: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 2019-11-01 09:53:21.463936: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
- 2019-11-01 09:53:21.464311: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 AVX512F FMA
- 2019-11-01 09:53:21.498095: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2100000000 Hz
- 2019-11-01 09:53:21.508899: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4995c00 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
- 2019-11-01 09:53:21.508954: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
- 2019-11-01 09:53:21.772746: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4986d40 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
- 2019-11-01 09:53:21.772787: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce RTX 2080 Ti, Compute Capability 7.5
- 2019-11-01 09:53:21.772797: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (1): GeForce RTX 2080 Ti, Compute Capability 7.5
- 2019-11-01 09:53:21.774090: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 0 with properties:
- pciBusID: 0000:86:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 09:53:21.774964: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1546] Found device 1 with properties:
- pciBusID: 0000:af:00.0 name: GeForce RTX 2080 Ti computeCapability: 7.5
- coreClock: 1.545GHz coreCount: 68 deviceMemorySize: 10.76GiB deviceMemoryBandwidth: 573.69GiB/s
- 2019-11-01 09:53:21.775025: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 09:53:21.775048: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 09:53:21.775067: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10.0
- 2019-11-01 09:53:21.775087: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10.0
- 2019-11-01 09:53:21.775107: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10.0
- 2019-11-01 09:53:21.775126: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10.0
- 2019-11-01 09:53:21.775147: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 2019-11-01 09:53:21.778445: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1688] Adding visible gpu devices: 0, 1
- 2019-11-01 09:53:21.778500: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.0
- 2019-11-01 09:53:21.781150: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1087] Device interconnect StreamExecutor with strength 1 edge matrix:
- 2019-11-01 09:53:21.781172: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1093] 0 1
- 2019-11-01 09:53:21.781184: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 0: N N
- 2019-11-01 09:53:21.781195: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1106] 1: N N
- 2019-11-01 09:53:21.783844: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10312 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2080 Ti, pci bus id: 0000:86:00.0, compute capability: 7.5)
- 2019-11-01 09:53:21.785587: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1232] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:1 with 10312 MB memory) -> physical GPU (device: 1, name: GeForce RTX 2080 Ti, pci bus id: 0000:af:00.0, compute capability: 7.5)
- Model: "resnet50"
- __________________________________________________________________________________________________
- Layer (type) Output Shape Param # Connected to
- ==================================================================================================
- input_1 (InputLayer) [(None, 270, 270, 1) 0
- __________________________________________________________________________________________________
- conv1_pad (ZeroPadding2D) (None, 276, 276, 1) 0 input_1[0][0]
- __________________________________________________________________________________________________
- conv1_conv (Conv2D) (None, 135, 135, 64) 3200 conv1_pad[0][0]
- __________________________________________________________________________________________________
- conv1_bn (BatchNormalization) (None, 135, 135, 64) 256 conv1_conv[0][0]
- __________________________________________________________________________________________________
- conv1_relu (Activation) (None, 135, 135, 64) 0 conv1_bn[0][0]
- __________________________________________________________________________________________________
- pool1_pad (ZeroPadding2D) (None, 137, 137, 64) 0 conv1_relu[0][0]
- __________________________________________________________________________________________________
- pool1_pool (MaxPooling2D) (None, 68, 68, 64) 0 pool1_pad[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_conv (Conv2D) (None, 68, 68, 64) 4160 pool1_pool[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_1_relu (Activation (None, 68, 68, 64) 0 conv2_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_2_relu (Activation (None, 68, 68, 64) 0 conv2_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_0_conv (Conv2D) (None, 68, 68, 256) 16640 pool1_pool[0][0]
- __________________________________________________________________________________________________
- conv2_block1_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block1_0_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block1_add (Add) (None, 68, 68, 256) 0 conv2_block1_0_bn[0][0]
- conv2_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block1_out (Activation) (None, 68, 68, 256) 0 conv2_block1_add[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block1_out[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_1_relu (Activation (None, 68, 68, 64) 0 conv2_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_2_relu (Activation (None, 68, 68, 64) 0 conv2_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block2_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block2_add (Add) (None, 68, 68, 256) 0 conv2_block1_out[0][0]
- conv2_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block2_out (Activation) (None, 68, 68, 256) 0 conv2_block2_add[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_conv (Conv2D) (None, 68, 68, 64) 16448 conv2_block2_out[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_1_relu (Activation (None, 68, 68, 64) 0 conv2_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_conv (Conv2D) (None, 68, 68, 64) 36928 conv2_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_bn (BatchNormali (None, 68, 68, 64) 256 conv2_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_2_relu (Activation (None, 68, 68, 64) 0 conv2_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_3_conv (Conv2D) (None, 68, 68, 256) 16640 conv2_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv2_block3_3_bn (BatchNormali (None, 68, 68, 256) 1024 conv2_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv2_block3_add (Add) (None, 68, 68, 256) 0 conv2_block2_out[0][0]
- conv2_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv2_block3_out (Activation) (None, 68, 68, 256) 0 conv2_block3_add[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_conv (Conv2D) (None, 34, 34, 128) 32896 conv2_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_1_relu (Activation (None, 34, 34, 128) 0 conv3_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_2_relu (Activation (None, 34, 34, 128) 0 conv3_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_0_conv (Conv2D) (None, 34, 34, 512) 131584 conv2_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block1_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block1_0_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block1_add (Add) (None, 34, 34, 512) 0 conv3_block1_0_bn[0][0]
- conv3_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block1_out (Activation) (None, 34, 34, 512) 0 conv3_block1_add[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block1_out[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_1_relu (Activation (None, 34, 34, 128) 0 conv3_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_2_relu (Activation (None, 34, 34, 128) 0 conv3_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block2_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block2_add (Add) (None, 34, 34, 512) 0 conv3_block1_out[0][0]
- conv3_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block2_out (Activation) (None, 34, 34, 512) 0 conv3_block2_add[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block2_out[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_1_relu (Activation (None, 34, 34, 128) 0 conv3_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_2_relu (Activation (None, 34, 34, 128) 0 conv3_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block3_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block3_add (Add) (None, 34, 34, 512) 0 conv3_block2_out[0][0]
- conv3_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block3_out (Activation) (None, 34, 34, 512) 0 conv3_block3_add[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_conv (Conv2D) (None, 34, 34, 128) 65664 conv3_block3_out[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_1_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_1_relu (Activation (None, 34, 34, 128) 0 conv3_block4_1_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_conv (Conv2D) (None, 34, 34, 128) 147584 conv3_block4_1_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_bn (BatchNormali (None, 34, 34, 128) 512 conv3_block4_2_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_2_relu (Activation (None, 34, 34, 128) 0 conv3_block4_2_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_3_conv (Conv2D) (None, 34, 34, 512) 66048 conv3_block4_2_relu[0][0]
- __________________________________________________________________________________________________
- conv3_block4_3_bn (BatchNormali (None, 34, 34, 512) 2048 conv3_block4_3_conv[0][0]
- __________________________________________________________________________________________________
- conv3_block4_add (Add) (None, 34, 34, 512) 0 conv3_block3_out[0][0]
- conv3_block4_3_bn[0][0]
- __________________________________________________________________________________________________
- conv3_block4_out (Activation) (None, 34, 34, 512) 0 conv3_block4_add[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_conv (Conv2D) (None, 17, 17, 256) 131328 conv3_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_1_relu (Activation (None, 17, 17, 256) 0 conv4_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_2_relu (Activation (None, 17, 17, 256) 0 conv4_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_0_conv (Conv2D) (None, 17, 17, 1024) 525312 conv3_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block1_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block1_0_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block1_add (Add) (None, 17, 17, 1024) 0 conv4_block1_0_bn[0][0]
- conv4_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block1_out (Activation) (None, 17, 17, 1024) 0 conv4_block1_add[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block1_out[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_1_relu (Activation (None, 17, 17, 256) 0 conv4_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_2_relu (Activation (None, 17, 17, 256) 0 conv4_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block2_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block2_add (Add) (None, 17, 17, 1024) 0 conv4_block1_out[0][0]
- conv4_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block2_out (Activation) (None, 17, 17, 1024) 0 conv4_block2_add[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block2_out[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_1_relu (Activation (None, 17, 17, 256) 0 conv4_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_2_relu (Activation (None, 17, 17, 256) 0 conv4_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block3_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block3_add (Add) (None, 17, 17, 1024) 0 conv4_block2_out[0][0]
- conv4_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block3_out (Activation) (None, 17, 17, 1024) 0 conv4_block3_add[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block3_out[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_1_relu (Activation (None, 17, 17, 256) 0 conv4_block4_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block4_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block4_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_2_relu (Activation (None, 17, 17, 256) 0 conv4_block4_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block4_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block4_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block4_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block4_add (Add) (None, 17, 17, 1024) 0 conv4_block3_out[0][0]
- conv4_block4_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block4_out (Activation) (None, 17, 17, 1024) 0 conv4_block4_add[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block4_out[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_1_relu (Activation (None, 17, 17, 256) 0 conv4_block5_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block5_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block5_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_2_relu (Activation (None, 17, 17, 256) 0 conv4_block5_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block5_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block5_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block5_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block5_add (Add) (None, 17, 17, 1024) 0 conv4_block4_out[0][0]
- conv4_block5_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block5_out (Activation) (None, 17, 17, 1024) 0 conv4_block5_add[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_conv (Conv2D) (None, 17, 17, 256) 262400 conv4_block5_out[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_1_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_1_relu (Activation (None, 17, 17, 256) 0 conv4_block6_1_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_conv (Conv2D) (None, 17, 17, 256) 590080 conv4_block6_1_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_bn (BatchNormali (None, 17, 17, 256) 1024 conv4_block6_2_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_2_relu (Activation (None, 17, 17, 256) 0 conv4_block6_2_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_3_conv (Conv2D) (None, 17, 17, 1024) 263168 conv4_block6_2_relu[0][0]
- __________________________________________________________________________________________________
- conv4_block6_3_bn (BatchNormali (None, 17, 17, 1024) 4096 conv4_block6_3_conv[0][0]
- __________________________________________________________________________________________________
- conv4_block6_add (Add) (None, 17, 17, 1024) 0 conv4_block5_out[0][0]
- conv4_block6_3_bn[0][0]
- __________________________________________________________________________________________________
- conv4_block6_out (Activation) (None, 17, 17, 1024) 0 conv4_block6_add[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_conv (Conv2D) (None, 9, 9, 512) 524800 conv4_block6_out[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_1_relu (Activation (None, 9, 9, 512) 0 conv5_block1_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block1_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block1_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_2_relu (Activation (None, 9, 9, 512) 0 conv5_block1_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_0_conv (Conv2D) (None, 9, 9, 2048) 2099200 conv4_block6_out[0][0]
- __________________________________________________________________________________________________
- conv5_block1_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block1_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block1_0_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_0_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block1_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block1_add (Add) (None, 9, 9, 2048) 0 conv5_block1_0_bn[0][0]
- conv5_block1_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block1_out (Activation) (None, 9, 9, 2048) 0 conv5_block1_add[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block1_out[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_1_relu (Activation (None, 9, 9, 512) 0 conv5_block2_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block2_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block2_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_2_relu (Activation (None, 9, 9, 512) 0 conv5_block2_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block2_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block2_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block2_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block2_add (Add) (None, 9, 9, 2048) 0 conv5_block1_out[0][0]
- conv5_block2_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block2_out (Activation) (None, 9, 9, 2048) 0 conv5_block2_add[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_conv (Conv2D) (None, 9, 9, 512) 1049088 conv5_block2_out[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_1_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_1_relu (Activation (None, 9, 9, 512) 0 conv5_block3_1_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_conv (Conv2D) (None, 9, 9, 512) 2359808 conv5_block3_1_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_bn (BatchNormali (None, 9, 9, 512) 2048 conv5_block3_2_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_2_relu (Activation (None, 9, 9, 512) 0 conv5_block3_2_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_3_conv (Conv2D) (None, 9, 9, 2048) 1050624 conv5_block3_2_relu[0][0]
- __________________________________________________________________________________________________
- conv5_block3_3_bn (BatchNormali (None, 9, 9, 2048) 8192 conv5_block3_3_conv[0][0]
- __________________________________________________________________________________________________
- conv5_block3_add (Add) (None, 9, 9, 2048) 0 conv5_block2_out[0][0]
- conv5_block3_3_bn[0][0]
- __________________________________________________________________________________________________
- conv5_block3_out (Activation) (None, 9, 9, 2048) 0 conv5_block3_add[0][0]
- __________________________________________________________________________________________________
- avg_pool (GlobalAveragePooling2 (None, 2048) 0 conv5_block3_out[0][0]
- __________________________________________________________________________________________________
- probs (Dense) (None, 3) 6147 avg_pool[0][0]
- ==================================================================================================
- Total params: 23,587,587
- Trainable params: 23,534,467
- Non-trainable params: 53,120
- __________________________________________________________________________________________________
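The per-layer parameter counts in the summary follow directly from the layer shapes: a Conv2D layer has `kh*kw*in_ch*out_ch + out_ch` parameters (weights plus bias) and a BatchNormalization layer has `4*ch` (gamma, beta, moving mean, moving variance — the latter two being the non-trainable part). A quick pure-Python check against the numbers printed above:

```python
# Reproduce a few parameter counts from the ResNet50 summary above.
def conv_params(kh, kw, cin, cout):
    # Conv2D (or Dense, as a 1x1 case): weights + bias
    return kh * kw * cin * cout + cout

def bn_params(ch):
    # BatchNormalization: gamma, beta, moving mean, moving variance
    return 4 * ch

assert conv_params(1, 1, 1024, 256) == 262400    # conv4_block6_1_conv
assert conv_params(3, 3, 256, 256) == 590080     # conv4_block6_2_conv
assert conv_params(1, 1, 256, 1024) == 263168    # conv4_block6_3_conv
assert conv_params(1, 1, 1024, 2048) == 2099200  # conv5_block1_0_conv
assert conv_params(3, 3, 512, 512) == 2359808    # conv5_block1_2_conv
assert bn_params(256) == 1024                    # conv4_block6_1_bn
assert bn_params(2048) == 8192                   # conv5_block1_3_bn
assert conv_params(1, 1, 2048, 3) == 6147        # probs (Dense): 2048*3 + 3
print("all layer parameter counts match the summary")
```

The `(None, 3)` output of `probs` and its 6,147 parameters indicate a standard ResNet50 backbone with the 1000-class top replaced by a 3-class softmax head.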
- Train for 563.0 steps, validate for 179.0 steps
- Epoch 1/10
- 2019-11-01 09:53:28.832570: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10.0
- 2019-11-01 09:53:30.675227: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
- 563/563 [==============================] - 141s 250ms/step - loss: 0.1185 - accuracy: 0.9616 - val_loss: 0.5751 - val_accuracy: 0.8078
- Epoch 2/10
- 563/563 [==============================] - 130s 231ms/step - loss: 0.0400 - accuracy: 0.9865 - val_loss: 0.8953 - val_accuracy: 0.7119
- Epoch 3/10
- 563/563 [==============================] - 130s 231ms/step - loss: 0.0478 - accuracy: 0.9870 - val_loss: 25.3537 - val_accuracy: 0.3367
- Epoch 4/10
- 563/563 [==============================] - 130s 230ms/step - loss: 0.0309 - accuracy: 0.9906 - val_loss: 0.0576 - val_accuracy: 0.9946
- Epoch 5/10
- 563/563 [==============================] - 129s 230ms/step - loss: 0.0210 - accuracy: 0.9940 - val_loss: 0.0780 - val_accuracy: 0.9916
- Epoch 6/10
- 563/563 [==============================] - 130s 230ms/step - loss: 0.0227 - accuracy: 0.9937 - val_loss: 0.0595 - val_accuracy: 0.9887
- Epoch 7/10
- 563/563 [==============================] - 129s 230ms/step - loss: 0.0160 - accuracy: 0.9949 - val_loss: 0.0536 - val_accuracy: 0.9946
- Epoch 8/10
- 81/563 [===>..........................] - ETA: 1:39 - loss: 0.0222 - accuracy: 0.9945
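The validation loss above is unstable in the early epochs (0.58 → 0.90 → 25.35 at epoch 3) before settling near 0.05 from epoch 4 onward. A common guard against such spikes is to checkpoint on `val_loss` and restore the best weights at the end of training. A minimal sketch, assuming `tf.keras` and the `model`/dataset names implied by the log (the checkpoint filename is illustrative):

```python
import tensorflow as tf

callbacks = [
    # Keep only the weights from the epoch with the lowest validation loss
    tf.keras.callbacks.ModelCheckpoint(
        "best_resnet50.h5", monitor="val_loss", save_best_only=True),
    # Stop early if val_loss stops improving, and roll back to the best epoch
    tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=3, restore_best_weights=True),
]
# model.fit(ds_train, validation_data=ds_valid, epochs=10, callbacks=callbacks)
```

With these callbacks, the epoch-3 divergence would not cost the run anything: the saved model would come from epoch 7 (val_loss 0.0536), the best epoch completed in this log.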