noahchalifour/rnnt-speech-recognition on GitHub: training log

Jun 23rd, 2020
2020-06-02 12:39:28.156861: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2020-06-02 12:39:28.176469: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.177370: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:00:04.0 name: Tesla K80 computeCapability: 3.7
coreClock: 0.8235GHz coreCount: 13 deviceMemorySize: 11.17GiB deviceMemoryBandwidth: 223.96GiB/s
2020-06-02 12:39:28.177800: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-06-02 12:39:28.180419: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-06-02 12:39:28.183504: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-06-02 12:39:28.184148: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-06-02 12:39:28.188244: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-06-02 12:39:28.189294: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-06-02 12:39:28.193879: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-06-02 12:39:28.194088: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.194941: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.195634: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2020-06-02 12:39:28.196105: I tensorflow/core/platform/cpu_feature_guard.cc:143] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-06-02 12:39:28.202030: I tensorflow/core/platform/profile_utils/cpu_utils.cc:102] CPU Frequency: 2300000000 Hz
2020-06-02 12:39:28.202264: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4e9b180 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-06-02 12:39:28.202301: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2020-06-02 12:39:28.256653: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.257522: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x4e9aa80 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2020-06-02 12:39:28.257556: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Tesla K80, Compute Capability 3.7
2020-06-02 12:39:28.257877: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.258577: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1561] Found device 0 with properties:
pciBusID: 0000:00:04.0 name: Tesla K80 computeCapability: 3.7
coreClock: 0.8235GHz coreCount: 13 deviceMemorySize: 11.17GiB deviceMemoryBandwidth: 223.96GiB/s
2020-06-02 12:39:28.258663: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-06-02 12:39:28.258730: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-06-02 12:39:28.258789: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-06-02 12:39:28.258832: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-06-02 12:39:28.258880: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-06-02 12:39:28.258927: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-06-02 12:39:28.258995: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-06-02 12:39:28.259152: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.259954: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.260733: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1703] Adding visible gpu devices: 0
2020-06-02 12:39:28.260802: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-06-02 12:39:28.262587: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-06-02 12:39:28.262621: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1108] 0
2020-06-02 12:39:28.262640: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1121] 0: N
2020-06-02 12:39:28.262856: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.263638: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-06-02 12:39:28.264457: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10691 MB memory) -> physical GPU (device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7)
1 Physical GPU, 1 Logical GPUs
Running single gpu: GPU:0
WARNING:tensorflow:From /content/rnnt-speech-recognition/model.py:60: LSTMCell.__init__ (from tensorflow.python.ops.rnn_cell_impl) is deprecated and will be removed in a future version.
Instructions for updating:
This class is equivalent as tf.keras.layers.LSTMCell, and will be replaced by that in Tensorflow 2.0.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fac32f4b5f8>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/rnn_cell_impl.py:966: Layer.add_variable (from tensorflow.python.keras.engine.base_layer) is deprecated and will be removed in a future version.
Instructions for updating:
Please use `layer.add_weight` method instead.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc33bda0>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc29f710>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc25c080>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc216c18>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc182828>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc0e3ac8>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdc0c2e10>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdbd40550>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
WARNING:tensorflow:<tensorflow.python.ops.rnn_cell_impl.LSTMCell object at 0x7fabdbd15e80>: Note that this cell is not optimized for performance. Please use tf.contrib.cudnn_rnn.CudnnLSTM for better performance on GPU.
I0602 12:39:32.106459 140379808331648 run_rnnt.py:491] Using word-piece encoder with vocab size: 4361
Model: "encoder"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, None, 240)]       0
_________________________________________________________________
batch_normalization (BatchNo (None, None, 240)         960
_________________________________________________________________
rnn (RNN)                    (None, None, 640)         8527872
_________________________________________________________________
dropout (Dropout)            (None, None, 640)         0
_________________________________________________________________
layer_normalization (LayerNo (None, None, 640)         1280
_________________________________________________________________
rnn_1 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_1 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_1 (Layer (None, None, 640)         1280
_________________________________________________________________
time_reduction (TimeReductio (None, None, 1280)        0
_________________________________________________________________
rnn_2 (RNN)                  (None, None, 640)         17047552
_________________________________________________________________
dropout_2 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_2 (Layer (None, None, 640)         1280
_________________________________________________________________
rnn_3 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_3 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_3 (Layer (None, None, 640)         1280
_________________________________________________________________
rnn_4 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_4 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_4 (Layer (None, None, 640)         1280
_________________________________________________________________
rnn_5 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_5 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_5 (Layer (None, None, 640)         1280
_________________________________________________________________
rnn_6 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_6 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_6 (Layer (None, None, 640)         1280
_________________________________________________________________
rnn_7 (RNN)                  (None, None, 640)         11804672
_________________________________________________________________
dropout_7 (Dropout)          (None, None, 640)         0
_________________________________________________________________
layer_normalization_7 (Layer (None, None, 640)         1280
=================================================================
Total params: 96,414,656
Trainable params: 96,414,176
Non-trainable params: 480
_________________________________________________________________
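The per-layer parameter counts above are consistent with LSTM cells of 2048 units with a 640-unit recurrent projection (the configuration used in the streaming RNN-T paper this repo follows). The log itself never states the cell size, so treat 2048/640 as an inference from the numbers; the arithmetic can be checked directly:

```python
def lstmp_params(input_dim, num_units=2048, num_proj=640):
    """Parameter count for an LSTM cell with a recurrent projection layer.

    Four gates, each with a kernel over [input; projected state] and a bias,
    plus a num_units -> num_proj projection matrix.
    """
    gates = 4 * num_units * (input_dim + num_proj)  # gate kernels
    biases = 4 * num_units                          # gate biases
    projection = num_units * num_proj               # recurrent projection
    return gates + biases + projection

print(lstmp_params(240))   # rnn:   8527872  (input = 240 mel features)
print(lstmp_params(640))   # rnn_1: 11804672 (input = 640-dim projected state)
print(lstmp_params(1280))  # rnn_2: 17047552 (input = 1280 after time_reduction)
```

The other counts follow the same way: BatchNormalization has 4 x 240 = 960 parameters (gamma, beta, moving mean, moving variance; the last two account for the 480 non-trainable parameters), and each LayerNormalization has 2 x 640 = 1280.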
Model: "prediction_network"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_2 (InputLayer)         [(None, None)]             0
_________________________________________________________________
embedding (Embedding)        (None, None, 500)          2180500
_________________________________________________________________
rnn_8 (RNN)                  (None, None, 640)          10657792
_________________________________________________________________
dropout_8 (Dropout)          (None, None, 640)          0
_________________________________________________________________
layer_normalization_8 (Layer (None, None, 640)          1280
_________________________________________________________________
rnn_9 (RNN)                  (None, None, 640)          11804672
_________________________________________________________________
dropout_9 (Dropout)          (None, None, 640)          0
_________________________________________________________________
layer_normalization_9 (Layer (None, None, 640)          1280
=================================================================
Total params: 24,645,524
Trainable params: 24,645,524
Non-trainable params: 0
_________________________________________________________________
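The prediction-network total also checks out against the 4361-token word-piece vocabulary reported earlier in the log, again assuming the 2048-unit / 640-projection LSTM cells inferred above:

```python
VOCAB_SIZE = 4361  # word-piece vocab size reported in the log
EMBED_DIM = 500

def lstmp_params(input_dim, num_units=2048, num_proj=640):
    # LSTM with projection: 4 gate kernels + 4 gate biases + projection matrix
    return 4 * num_units * (input_dim + num_proj) + 4 * num_units + num_units * num_proj

embedding = VOCAB_SIZE * EMBED_DIM    # embedding: 2180500
rnn_8 = lstmp_params(EMBED_DIM)       # rnn_8:     10657792
rnn_9 = lstmp_params(640)             # rnn_9:     11804672
layer_norms = 2 * (2 * 640)           # two LayerNorm layers (gamma + beta each)

total = embedding + rnn_8 + rnn_9 + layer_norms
print(total)  # 24645524, matching "Total params: 24,645,524"
```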
Model: "transducer"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
mel_specs (InputLayer)          [(None, None, 240)]  0
__________________________________________________________________________________________________
pred_inp (InputLayer)           [(None, None)]       0
__________________________________________________________________________________________________
encoder (Model)                 (None, None, 640)    96414656    mel_specs[0][0]
__________________________________________________________________________________________________
prediction_network (Model)      (None, None, 640)    24645524    pred_inp[0][0]
__________________________________________________________________________________________________
tf_op_layer_ExpandDims (TensorF [(None, None, 1, 640 0           encoder[1][0]
__________________________________________________________________________________________________
tf_op_layer_ExpandDims_1 (Tenso [(None, 1, None, 640 0           prediction_network[1][0]
__________________________________________________________________________________________________
tf_op_layer_AddV2 (TensorFlowOp [(None, None, None,  0           tf_op_layer_ExpandDims[0][0]
                                                                 tf_op_layer_ExpandDims_1[0][0]
__________________________________________________________________________________________________
dense (Dense)                   (None, None, None, 6 410240      tf_op_layer_AddV2[0][0]
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, None, None, 4 2795401     dense[0][0]
==================================================================================================
Total params: 124,265,821
Trainable params: 124,265,341
Non-trainable params: 480
__________________________________________________________________________________________________
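The joint network is just two Dense layers applied to the broadcast sum of the expanded encoder and prediction-network outputs; their counts, plus the two sub-models, reproduce the grand total in the summary:

```python
def dense_params(n_in, n_out):
    # fully connected layer: kernel (n_in x n_out) + bias (n_out)
    return n_in * n_out + n_out

joint_hidden = dense_params(640, 640)   # dense:   410240
joint_logits = dense_params(640, 4361)  # dense_1: 2795401, one logit per word piece

total = 96414656 + 24645524 + joint_hidden + joint_logits
print(total)  # 124265821, matching "Total params: 124,265,821"
```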
Starting training.
Performing evaluation.
2020-06-02 12:39:39.827175: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-06-02 12:39:40.057496: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
VALIDATION RESULTS: Time: 103.5925, Loss: 619.9272, Accuracy: 0.0026, WER: 1.0000
Saving checkpoint ./model/checkpoint_0_619.9272.hdf5
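A WER of 1.0000 before any training steps is expected: the untrained model gets essentially every word wrong. WER is the word-level Levenshtein distance divided by the reference length; the repo's exact implementation may differ, but a minimal reference sketch is:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, 1):
            cur[j] = min(prev[j] + 1,             # deletion
                         cur[j - 1] + 1,          # insertion
                         prev[j - 1] + (r != h))  # substitution (0 if match)
        prev = cur
    return prev[-1] / len(ref)

print(wer("the cat sat", "the cat sat"))  # 0.0
print(wer("the cat sat", "the bat"))      # 2/3: one substitution, one deletion
```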
Epoch: 0, Batch: 0, Global Step: 0, Step Time: 17.4600, Loss: 615.8542
Epoch: 0, Batch: 1, Global Step: 1, Step Time: 3.3450, Loss: 418.8870
Epoch: 0, Batch: 2, Global Step: 2, Step Time: 3.5043, Loss: 300.2598
Epoch: 0, Batch: 3, Global Step: 3, Step Time: 3.5141, Loss: 237.2694
Epoch: 0, Batch: 4, Global Step: 4, Step Time: 3.4132, Loss: 201.3490
Epoch: 0, Batch: 5, Global Step: 5, Step Time: 4.8524, Loss: 179.2227
Epoch: 0, Batch: 6, Global Step: 6, Step Time: 3.3619, Loss: 161.7426
Epoch: 0, Batch: 7, Global Step: 7, Step Time: 3.1144, Loss: 149.4769
Epoch: 0, Batch: 8, Global Step: 8, Step Time: 4.0565, Loss: 139.3051
Epoch: 0, Batch: 9, Global Step: 9, Step Time: 4.1282, Loss: 130.8080
Epoch: 0, Batch: 10, Global Step: 10, Step Time: 4.7917, Loss: 124.7013
Epoch: 0, Batch: 11, Global Step: 11, Step Time: 4.9635, Loss: 119.0299
Epoch: 0, Batch: 12, Global Step: 12, Step Time: 5.7238, Loss: 113.2522
Epoch: 0, Batch: 13, Global Step: 13, Step Time: 3.7156, Loss: 108.6988
Epoch: 0, Batch: 14, Global Step: 14, Step Time: 4.4637, Loss: 104.8368
Epoch: 0, Batch: 15, Global Step: 15, Step Time: 5.0303, Loss: 101.6355
Epoch: 0, Batch: 16, Global Step: 16, Step Time: 5.5570, Loss: 98.6159
Epoch: 0, Batch: 17, Global Step: 17, Step Time: 5.4559, Loss: 96.1818
Epoch: 0, Batch: 18, Global Step: 18, Step Time: 4.5417, Loss: 93.4029
Epoch: 0, Batch: 19, Global Step: 19, Step Time: 4.4488, Loss: 91.4843
Epoch: 0, Batch: 20, Global Step: 20, Step Time: 6.9100, Loss: 89.2137
Epoch: 0, Batch: 21, Global Step: 21, Step Time: 4.7161, Loss: 87.1073
Epoch: 0, Batch: 22, Global Step: 22, Step Time: 5.8719, Loss: 85.3582
Epoch: 0, Batch: 23, Global Step: 23, Step Time: 3.9750, Loss: 83.3624
Epoch: 0, Batch: 24, Global Step: 24, Step Time: 4.5176, Loss: 82.9920
Epoch: 0, Batch: 25, Global Step: 25, Step Time: 4.0576, Loss: 81.7212
Epoch: 0, Batch: 26, Global Step: 26, Step Time: 4.2880, Loss: 80.4431
Epoch: 0, Batch: 27, Global Step: 27, Step Time: 3.3830, Loss: 79.1331
Epoch: 0, Batch: 28, Global Step: 28, Step Time: 3.4919, Loss: 77.8691
Epoch: 0, Batch: 29, Global Step: 29, Step Time: 3.8900, Loss: 76.7811
Epoch: 0, Batch: 30, Global Step: 30, Step Time: 4.2739, Loss: 75.7115
Epoch: 0, Batch: 31, Global Step: 31, Step Time: 3.2011, Loss: 74.6786
Epoch: 0, Batch: 32, Global Step: 32, Step Time: 3.9784, Loss: 73.7630
Epoch: 0, Batch: 33, Global Step: 33, Step Time: 3.9261, Loss: 72.8322
Epoch: 0, Batch: 34, Global Step: 34, Step Time: 4.0538, Loss: 72.1079
Epoch: 0, Batch: 35, Global Step: 35, Step Time: 3.4979, Loss: 71.5096
Epoch: 0, Batch: 36, Global Step: 36, Step Time: 3.8859, Loss: 70.7836
Epoch: 0, Batch: 37, Global Step: 37, Step Time: 4.9590, Loss: 70.1555
Epoch: 0, Batch: 38, Global Step: 38, Step Time: 5.4815, Loss: 69.7584
Epoch: 0, Batch: 39, Global Step: 39, Step Time: 4.4400, Loss: 69.2089
Epoch: 0, Batch: 40, Global Step: 40, Step Time: 2.8504, Loss: 68.5661
Epoch: 0, Batch: 41, Global Step: 41, Step Time: 4.3475, Loss: 68.3671
Epoch: 0, Batch: 42, Global Step: 42, Step Time: 4.3060, Loss: 67.8474
Epoch: 0, Batch: 43, Global Step: 43, Step Time: 5.2489, Loss: 67.4526
Epoch: 0, Batch: 44, Global Step: 44, Step Time: 5.7671, Loss: 66.9959
Epoch: 0, Batch: 45, Global Step: 45, Step Time: 7.0478, Loss: 66.6853
Epoch: 0, Batch: 46, Global Step: 46, Step Time: 4.9578, Loss: 66.1128
Epoch: 0, Batch: 47, Global Step: 47, Step Time: 5.4014, Loss: 65.6809
Epoch: 0, Batch: 48, Global Step: 48, Step Time: 4.1432, Loss: 65.3234
Epoch: 0, Batch: 49, Global Step: 49, Step Time: 3.4952, Loss: 64.8235
Epoch: 0, Batch: 50, Global Step: 50, Step Time: 5.5742, Loss: 64.4003
Epoch: 0, Batch: 51, Global Step: 51, Step Time: 7.2349, Loss: 64.2064
Epoch: 0, Batch: 52, Global Step: 52, Step Time: 6.0429, Loss: 63.9245
Epoch: 0, Batch: 53, Global Step: 53, Step Time: 6.7405, Loss: 63.5588
Epoch: 0, Batch: 54, Global Step: 54, Step Time: 5.7055, Loss: 63.1586
Epoch: 0, Batch: 55, Global Step: 55, Step Time: 4.0438, Loss: 62.7867
Epoch: 0, Batch: 56, Global Step: 56, Step Time: 3.8366, Loss: 62.3413
Epoch: 0, Batch: 57, Global Step: 57, Step Time: 3.9731, Loss: 62.0851
Epoch: 0, Batch: 58, Global Step: 58, Step Time: 3.5970, Loss: 61.7060
Epoch: 0, Batch: 59, Global Step: 59, Step Time: 3.0265, Loss: 61.3425
Epoch: 0, Batch: 60, Global Step: 60, Step Time: 3.6770, Loss: 61.0204
Epoch: 0, Batch: 61, Global Step: 61, Step Time: 4.0022, Loss: 60.6875
Epoch: 0, Batch: 62, Global Step: 62, Step Time: 4.2404, Loss: 60.3401
Epoch: 0, Batch: 63, Global Step: 63, Step Time: 4.6340, Loss: 60.2264
Epoch: 0, Batch: 64, Global Step: 64, Step Time: 7.1053, Loss: 60.0268
Epoch: 0, Batch: 65, Global Step: 65, Step Time: 4.2710, Loss: 59.6804
Epoch: 0, Batch: 66, Global Step: 66, Step Time: 3.0085, Loss: 59.3344
Epoch: 0, Batch: 67, Global Step: 67, Step Time: 4.3059, Loss: 59.1096
Epoch: 0, Batch: 68, Global Step: 68, Step Time: 4.7841, Loss: 58.8809
Epoch: 0, Batch: 69, Global Step: 69, Step Time: 5.4732, Loss: 58.6091
Epoch: 0, Batch: 70, Global Step: 70, Step Time: 8.0313, Loss: 58.4871
Epoch: 0, Batch: 71, Global Step: 71, Step Time: 3.6355, Loss: 58.2482
Epoch: 0, Batch: 72, Global Step: 72, Step Time: 3.7163, Loss: 57.9719
Epoch: 0, Batch: 73, Global Step: 73, Step Time: 3.6176, Loss: 57.7342
Epoch: 0, Batch: 74, Global Step: 74, Step Time: 4.3009, Loss: 57.5076
Epoch: 0, Batch: 75, Global Step: 75, Step Time: 5.1885, Loss: 57.3367
Epoch: 0, Batch: 76, Global Step: 76, Step Time: 5.9316, Loss: 57.1777
Epoch: 0, Batch: 77, Global Step: 77, Step Time: 5.3696, Loss: 57.0283
Epoch: 0, Batch: 78, Global Step: 78, Step Time: 4.7314, Loss: 56.7757
Epoch: 0, Batch: 79, Global Step: 79, Step Time: 7.9406, Loss: 56.7247
Epoch: 0, Batch: 80, Global Step: 80, Step Time: 5.0424, Loss: 56.5345
Epoch: 0, Batch: 81, Global Step: 81, Step Time: 4.4104, Loss: 56.3766
Epoch: 0, Batch: 82, Global Step: 82, Step Time: 5.8465, Loss: 56.2667
Epoch: 0, Batch: 83, Global Step: 83, Step Time: 4.6535, Loss: 56.1198
Epoch: 0, Batch: 84, Global Step: 84, Step Time: 7.6613, Loss: 56.2069
Epoch: 0, Batch: 85, Global Step: 85, Step Time: 4.2678, Loss: 56.0140
Epoch: 0, Batch: 86, Global Step: 86, Step Time: 5.9622, Loss: 56.0243
Epoch: 0, Batch: 87, Global Step: 87, Step Time: 5.4177, Loss: 55.8146
Epoch: 0, Batch: 88, Global Step: 88, Step Time: 5.5886, Loss: 55.6539
Epoch: 0, Batch: 89, Global Step: 89, Step Time: 5.7393, Loss: 55.5688
Epoch: 0, Batch: 90, Global Step: 90, Step Time: 4.0525, Loss: 55.4420
Epoch: 0, Batch: 91, Global Step: 91, Step Time: 3.8019, Loss: 55.2737
Epoch: 0, Batch: 92, Global Step: 92, Step Time: 5.4556, Loss: 55.1883
Epoch: 0, Batch: 93, Global Step: 93, Step Time: 4.7190, Loss: 55.1015
Epoch: 0, Batch: 94, Global Step: 94, Step Time: 3.2857, Loss: 54.9427
Epoch: 0, Batch: 95, Global Step: 95, Step Time: 4.0954, Loss: 54.7910
Epoch: 0, Batch: 96, Global Step: 96, Step Time: 3.7984, Loss: 54.6642
Epoch: 0, Batch: 97, Global Step: 97, Step Time: 5.6758, Loss: 54.6078
Epoch: 0, Batch: 98, Global Step: 98, Step Time: 3.4072, Loss: 54.4126
Epoch: 0, Batch: 99, Global Step: 99, Step Time: 6.0933, Loss: 54.3168
Epoch: 0, Batch: 100, Global Step: 100, Step Time: 6.3159, Loss: 54.1921
Epoch: 0, Batch: 101, Global Step: 101, Step Time: 3.9769, Loss: 54.0803
Epoch: 0, Batch: 102, Global Step: 102, Step Time: 6.5820, Loss: 53.9689
Epoch: 0, Batch: 103, Global Step: 103, Step Time: 3.8121, Loss: 53.7948
Epoch: 0, Batch: 104, Global Step: 104, Step Time: 3.3348, Loss: 53.7050
Epoch: 0, Batch: 105, Global Step: 105, Step Time: 5.9990, Loss: 53.5807
Epoch: 0, Batch: 106, Global Step: 106, Step Time: 6.2281, Loss: 53.4820
Epoch: 0, Batch: 107, Global Step: 107, Step Time: 4.4506, Loss: 53.4036
Epoch: 0, Batch: 108, Global Step: 108, Step Time: 4.6799, Loss: 53.2839
Epoch: 0, Batch: 109, Global Step: 109, Step Time: 4.7772, Loss: 53.2067
Epoch: 0, Batch: 110, Global Step: 110, Step Time: 3.0594, Loss: 53.0966
Epoch: 0, Batch: 111, Global Step: 111, Step Time: 4.4572, Loss: 53.0021
Epoch: 0, Batch: 112, Global Step: 112, Step Time: 3.5754, Loss: 52.8859
Epoch: 0, Batch: 113, Global Step: 113, Step Time: 6.2513, Loss: 52.8310
Epoch: 0, Batch: 114, Global Step: 114, Step Time: 6.3081, Loss: 52.8039
Epoch: 0, Batch: 115, Global Step: 115, Step Time: 3.7045, Loss: 52.6757
Epoch: 0, Batch: 116, Global Step: 116, Step Time: 3.5276, Loss: 52.5768
Epoch: 0, Batch: 117, Global Step: 117, Step Time: 3.7667, Loss: 52.4693
Epoch: 0, Batch: 118, Global Step: 118, Step Time: 5.0514, Loss: 52.3990
Epoch: 0, Batch: 119, Global Step: 119, Step Time: 6.0728, Loss: 52.3360
Epoch: 0, Batch: 120, Global Step: 120, Step Time: 4.1809, Loss: 52.2536
Epoch: 0, Batch: 121, Global Step: 121, Step Time: 4.4953, Loss: 52.1061
Epoch: 0, Batch: 122, Global Step: 122, Step Time: 5.2279, Loss: 52.0350
Epoch: 0, Batch: 123, Global Step: 123, Step Time: 6.0672, Loss: 52.1202
Epoch: 0, Batch: 124, Global Step: 124, Step Time: 4.5031, Loss: 52.0105
Epoch: 0, Batch: 125, Global Step: 125, Step Time: 3.9633, Loss: 51.9429
Epoch: 0, Batch: 126, Global Step: 126, Step Time: 3.9144, Loss: 51.8033
EPOCH RESULTS: Loss: 51.8033
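The Loss column appears to be a running average over the epoch rather than a per-batch value: it decreases smoothly, and the last step's value (51.8033 at batch 126) matches the EPOCH RESULTS line exactly, then resets near 41 when epoch 1 starts. To track convergence across a long log like this, the step lines can be parsed; a sketch based on the exact line format above:

```python
import re

STEP_RE = re.compile(
    r"Epoch: (\d+), Batch: (\d+), Global Step: (\d+), "
    r"Step Time: ([\d.]+), Loss: ([\d.]+)"
)

def parse_step(line):
    """Return the fields of a training-step log line, or None if it isn't one."""
    m = STEP_RE.search(line)
    if not m:
        return None
    epoch, batch, step = (int(m.group(i)) for i in (1, 2, 3))
    return {"epoch": epoch, "batch": batch, "step": step,
            "step_time": float(m.group(4)), "loss": float(m.group(5))}

line = "Epoch: 0, Batch: 126, Global Step: 126, Step Time: 3.9144, Loss: 51.8033"
print(parse_step(line))
```

Feeding every line of the log through `parse_step` and plotting the `loss` values per epoch gives the convergence curve directly.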
Epoch: 1, Batch: 0, Global Step: 127, Step Time: 4.5280, Loss: 41.0299
Epoch: 1, Batch: 1, Global Step: 128, Step Time: 3.3459, Loss: 40.4037
Epoch: 1, Batch: 2, Global Step: 129, Step Time: 3.4649, Loss: 38.4516
Epoch: 1, Batch: 3, Global Step: 130, Step Time: 3.4913, Loss: 38.3625
Epoch: 1, Batch: 4, Global Step: 131, Step Time: 3.4138, Loss: 38.3201
Epoch: 1, Batch: 5, Global Step: 132, Step Time: 4.8084, Loss: 38.7250
Epoch: 1, Batch: 6, Global Step: 133, Step Time: 3.3385, Loss: 38.0784
Epoch: 1, Batch: 7, Global Step: 134, Step Time: 3.1439, Loss: 38.1917
Epoch: 1, Batch: 8, Global Step: 135, Step Time: 4.0436, Loss: 38.3183
Epoch: 1, Batch: 9, Global Step: 136, Step Time: 4.0496, Loss: 38.9261
Epoch: 1, Batch: 10, Global Step: 137, Step Time: 4.6444, Loss: 39.7749
Epoch: 1, Batch: 11, Global Step: 138, Step Time: 4.8208, Loss: 40.3196
Epoch: 1, Batch: 12, Global Step: 139, Step Time: 5.7634, Loss: 40.1927
Epoch: 1, Batch: 13, Global Step: 140, Step Time: 3.7849, Loss: 40.1886
Epoch: 1, Batch: 14, Global Step: 141, Step Time: 4.4266, Loss: 40.1873
Epoch: 1, Batch: 15, Global Step: 142, Step Time: 4.9419, Loss: 40.4661
Epoch: 1, Batch: 16, Global Step: 143, Step Time: 5.4437, Loss: 40.5319
Epoch: 1, Batch: 17, Global Step: 144, Step Time: 5.2795, Loss: 40.9769
Epoch: 1, Batch: 18, Global Step: 145, Step Time: 4.4209, Loss: 40.7711
Epoch: 1, Batch: 19, Global Step: 146, Step Time: 4.4655, Loss: 41.0544
Epoch: 1, Batch: 20, Global Step: 147, Step Time: 7.1387, Loss: 40.8978
Epoch: 1, Batch: 21, Global Step: 148, Step Time: 4.6678, Loss: 40.7591
Epoch: 1, Batch: 22, Global Step: 149, Step Time: 5.7215, Loss: 40.7798
Epoch: 1, Batch: 23, Global Step: 150, Step Time: 3.9766, Loss: 40.4560
Epoch: 1, Batch: 24, Global Step: 151, Step Time: 4.5016, Loss: 41.4924
Epoch: 1, Batch: 25, Global Step: 152, Step Time: 4.0980, Loss: 41.6026
Epoch: 1, Batch: 26, Global Step: 153, Step Time: 4.1603, Loss: 41.6856
Epoch: 1, Batch: 27, Global Step: 154, Step Time: 3.3935, Loss: 41.5738
Epoch: 1, Batch: 28, Global Step: 155, Step Time: 3.4759, Loss: 41.4391
Epoch: 1, Batch: 29, Global Step: 156, Step Time: 3.9917, Loss: 41.4245
Epoch: 1, Batch: 30, Global Step: 157, Step Time: 4.1791, Loss: 41.4163
Epoch: 1, Batch: 31, Global Step: 158, Step Time: 3.2299, Loss: 41.3184
Epoch: 1, Batch: 32, Global Step: 159, Step Time: 3.9096, Loss: 41.2796
Epoch: 1, Batch: 33, Global Step: 160, Step Time: 4.0226, Loss: 41.1804
Epoch: 1, Batch: 34, Global Step: 161, Step Time: 4.0280, Loss: 41.1915
Epoch: 1, Batch: 35, Global Step: 162, Step Time: 3.4639, Loss: 41.3346
  362. Epoch: 1, Batch: 36, Global Step: 163, Step Time: 3.7873, Loss: 41.3234
  363. Epoch: 1, Batch: 37, Global Step: 164, Step Time: 5.0237, Loss: 41.3922
  364. Epoch: 1, Batch: 38, Global Step: 165, Step Time: 5.4475, Loss: 41.6311
  365. Epoch: 1, Batch: 39, Global Step: 166, Step Time: 4.5639, Loss: 41.6785
  366. Epoch: 1, Batch: 40, Global Step: 167, Step Time: 2.8650, Loss: 41.6390
  367. Epoch: 1, Batch: 41, Global Step: 168, Step Time: 4.3623, Loss: 42.0130
  368. Epoch: 1, Batch: 42, Global Step: 169, Step Time: 4.2220, Loss: 42.0417
  369. Epoch: 1, Batch: 43, Global Step: 170, Step Time: 5.0899, Loss: 42.1780
  370. Epoch: 1, Batch: 44, Global Step: 171, Step Time: 5.5955, Loss: 42.2461
  371. Epoch: 1, Batch: 45, Global Step: 172, Step Time: 7.1386, Loss: 42.3923
  372. Epoch: 1, Batch: 46, Global Step: 173, Step Time: 4.8495, Loss: 42.2824
  373. Epoch: 1, Batch: 47, Global Step: 174, Step Time: 5.2586, Loss: 42.2820
  374. Epoch: 1, Batch: 48, Global Step: 175, Step Time: 4.2670, Loss: 42.3529
  375. Epoch: 1, Batch: 49, Global Step: 176, Step Time: 3.4313, Loss: 42.2585
  376. Epoch: 1, Batch: 50, Global Step: 177, Step Time: 5.4462, Loss: 42.2303
  377. Epoch: 1, Batch: 51, Global Step: 178, Step Time: 7.1237, Loss: 42.4156
  378. Epoch: 1, Batch: 52, Global Step: 179, Step Time: 5.9035, Loss: 42.4959
  379. Epoch: 1, Batch: 53, Global Step: 180, Step Time: 6.8359, Loss: 42.4882
  380. Epoch: 1, Batch: 54, Global Step: 181, Step Time: 5.9514, Loss: 42.4188
  381. Epoch: 1, Batch: 55, Global Step: 182, Step Time: 3.9963, Loss: 42.3700
  382. Epoch: 1, Batch: 56, Global Step: 183, Step Time: 3.7983, Loss: 42.2403
  383. Epoch: 1, Batch: 57, Global Step: 184, Step Time: 4.0604, Loss: 42.2880
  384. Epoch: 1, Batch: 58, Global Step: 185, Step Time: 3.6909, Loss: 42.2075
  385. Epoch: 1, Batch: 59, Global Step: 186, Step Time: 3.1205, Loss: 42.1377
  386. Epoch: 1, Batch: 60, Global Step: 187, Step Time: 3.8507, Loss: 42.0965
  387. Epoch: 1, Batch: 61, Global Step: 188, Step Time: 4.1303, Loss: 42.0353
  388. Epoch: 1, Batch: 62, Global Step: 189, Step Time: 4.1350, Loss: 41.9496
  389. Epoch: 1, Batch: 63, Global Step: 190, Step Time: 4.5164, Loss: 42.0787
  390. Epoch: 1, Batch: 64, Global Step: 191, Step Time: 7.2637, Loss: 42.1246
  391. Epoch: 1, Batch: 65, Global Step: 192, Step Time: 4.4172, Loss: 42.0364
  392. Epoch: 1, Batch: 66, Global Step: 193, Step Time: 3.1211, Loss: 41.9303
  393. Epoch: 1, Batch: 67, Global Step: 194, Step Time: 4.2895, Loss: 41.9362
  394. Epoch: 1, Batch: 68, Global Step: 195, Step Time: 4.8808, Loss: 41.9162
  395. Epoch: 1, Batch: 69, Global Step: 196, Step Time: 5.5674, Loss: 41.8625
  396. Epoch: 1, Batch: 70, Global Step: 197, Step Time: 8.1922, Loss: 41.9482
  397. Epoch: 1, Batch: 71, Global Step: 198, Step Time: 3.7597, Loss: 41.9121
  398. Epoch: 1, Batch: 72, Global Step: 199, Step Time: 3.8707, Loss: 41.8416
  399. Epoch: 1, Batch: 73, Global Step: 200, Step Time: 3.6827, Loss: 41.8033
  400. Epoch: 1, Batch: 74, Global Step: 201, Step Time: 4.4511, Loss: 41.7682
  401. Epoch: 1, Batch: 75, Global Step: 202, Step Time: 5.1966, Loss: 41.7817
  402. Epoch: 1, Batch: 76, Global Step: 203, Step Time: 6.0521, Loss: 41.8081
  403. Epoch: 1, Batch: 77, Global Step: 204, Step Time: 5.6275, Loss: 41.8363
  404. Epoch: 1, Batch: 78, Global Step: 205, Step Time: 4.9260, Loss: 41.7624
  405. Epoch: 1, Batch: 79, Global Step: 206, Step Time: 8.2677, Loss: 41.8786
  406. Epoch: 1, Batch: 80, Global Step: 207, Step Time: 5.2911, Loss: 41.8579
  407. Epoch: 1, Batch: 81, Global Step: 208, Step Time: 4.3448, Loss: 41.8592
  408. Epoch: 1, Batch: 82, Global Step: 209, Step Time: 5.6327, Loss: 41.9027
  409. Epoch: 1, Batch: 83, Global Step: 210, Step Time: 4.5322, Loss: 41.9048
  410. Epoch: 1, Batch: 84, Global Step: 211, Step Time: 7.7124, Loss: 42.1412
  411. Epoch: 1, Batch: 85, Global Step: 212, Step Time: 4.2112, Loss: 42.1011
  412. Epoch: 1, Batch: 86, Global Step: 213, Step Time: 5.9185, Loss: 42.2463
  413. Epoch: 1, Batch: 87, Global Step: 214, Step Time: 5.4584, Loss: 42.1816
  414. Epoch: 1, Batch: 88, Global Step: 215, Step Time: 5.6058, Loss: 42.1567
  415. Epoch: 1, Batch: 89, Global Step: 216, Step Time: 5.7220, Loss: 42.2057
  416. Epoch: 1, Batch: 90, Global Step: 217, Step Time: 4.0316, Loss: 42.2120
  417. Epoch: 1, Batch: 91, Global Step: 218, Step Time: 3.8353, Loss: 42.1677
  418. Epoch: 1, Batch: 92, Global Step: 219, Step Time: 5.5320, Loss: 42.2038
  419. Epoch: 1, Batch: 93, Global Step: 220, Step Time: 4.7280, Loss: 42.2370
  420. Epoch: 1, Batch: 94, Global Step: 221, Step Time: 3.3270, Loss: 42.2005
  421. Epoch: 1, Batch: 95, Global Step: 222, Step Time: 4.0952, Loss: 42.1717
  422. Epoch: 1, Batch: 96, Global Step: 223, Step Time: 3.8084, Loss: 42.1608
  423. Epoch: 1, Batch: 97, Global Step: 224, Step Time: 5.6323, Loss: 42.2175
  424. Epoch: 1, Batch: 98, Global Step: 225, Step Time: 3.3604, Loss: 42.1369
  425. Epoch: 1, Batch: 99, Global Step: 226, Step Time: 6.1472, Loss: 42.1462
  426. Epoch: 1, Batch: 100, Global Step: 227, Step Time: 6.3347, Loss: 42.1290
  427. Epoch: 1, Batch: 101, Global Step: 228, Step Time: 4.0098, Loss: 42.1179
  428. Epoch: 1, Batch: 102, Global Step: 229, Step Time: 6.5823, Loss: 42.1129
  429. Epoch: 1, Batch: 103, Global Step: 230, Step Time: 3.8138, Loss: 42.0424
  430. Epoch: 1, Batch: 104, Global Step: 231, Step Time: 3.3242, Loss: 42.0515
  431. Epoch: 1, Batch: 105, Global Step: 232, Step Time: 6.0314, Loss: 42.0295
  432. Epoch: 1, Batch: 106, Global Step: 233, Step Time: 6.2573, Loss: 42.0304
  433. Epoch: 1, Batch: 107, Global Step: 234, Step Time: 4.4159, Loss: 42.0442
  434. Epoch: 1, Batch: 108, Global Step: 235, Step Time: 4.6733, Loss: 42.0135
  435. Epoch: 1, Batch: 109, Global Step: 236, Step Time: 4.7508, Loss: 42.0269
  436. Epoch: 1, Batch: 110, Global Step: 237, Step Time: 3.0713, Loss: 41.9998
  437. Epoch: 1, Batch: 111, Global Step: 238, Step Time: 4.4708, Loss: 41.9917
  438. Epoch: 1, Batch: 112, Global Step: 239, Step Time: 3.6076, Loss: 41.9595
  439. Epoch: 1, Batch: 113, Global Step: 240, Step Time: 6.3120, Loss: 41.9878
  440. Epoch: 1, Batch: 114, Global Step: 241, Step Time: 6.3035, Loss: 42.0435
  441. Epoch: 1, Batch: 115, Global Step: 242, Step Time: 3.7218, Loss: 42.0028
  442. Epoch: 1, Batch: 116, Global Step: 243, Step Time: 3.4961, Loss: 41.9865
  443. Epoch: 1, Batch: 117, Global Step: 244, Step Time: 3.7387, Loss: 41.9554
  444. Epoch: 1, Batch: 118, Global Step: 245, Step Time: 5.0289, Loss: 41.9605
  445. Epoch: 1, Batch: 119, Global Step: 246, Step Time: 6.0470, Loss: 41.9727
  446. Epoch: 1, Batch: 120, Global Step: 247, Step Time: 4.1332, Loss: 41.9691
  447. Epoch: 1, Batch: 121, Global Step: 248, Step Time: 4.4576, Loss: 41.8992
  448. Epoch: 1, Batch: 122, Global Step: 249, Step Time: 5.1228, Loss: 41.9017
  449. Epoch: 1, Batch: 123, Global Step: 250, Step Time: 6.0759, Loss: 42.0528
  450. Epoch: 1, Batch: 124, Global Step: 251, Step Time: 4.5302, Loss: 42.0186
  451. Epoch: 1, Batch: 125, Global Step: 252, Step Time: 3.9834, Loss: 42.0229
  452. Epoch: 1, Batch: 126, Global Step: 253, Step Time: 3.7640, Loss: 41.9508
  453. EPOCH RESULTS: Loss: 41.9508
  454. Epoch: 2, Batch: 0, Global Step: 254, Step Time: 4.6736, Loss: 39.8656
  455. Epoch: 2, Batch: 1, Global Step: 255, Step Time: 3.2615, Loss: 39.1762
  456. Epoch: 2, Batch: 2, Global Step: 256, Step Time: 3.3628, Loss: 37.3431
  457. Epoch: 2, Batch: 3, Global Step: 257, Step Time: 3.4188, Loss: 37.3042
  458. Epoch: 2, Batch: 4, Global Step: 258, Step Time: 3.5300, Loss: 37.2389
  459. Epoch: 2, Batch: 5, Global Step: 259, Step Time: 4.7616, Loss: 37.6194
  460. Epoch: 2, Batch: 6, Global Step: 260, Step Time: 3.4297, Loss: 37.0383
  461. Epoch: 2, Batch: 7, Global Step: 261, Step Time: 3.1704, Loss: 37.2073
  462. Epoch: 2, Batch: 8, Global Step: 262, Step Time: 4.1870, Loss: 37.3948
  463. Epoch: 2, Batch: 9, Global Step: 263, Step Time: 4.0772, Loss: 38.0003
  464. Epoch: 2, Batch: 10, Global Step: 264, Step Time: 4.6883, Loss: 38.8279
  465. Epoch: 2, Batch: 11, Global Step: 265, Step Time: 4.9602, Loss: 39.3292
  466. Epoch: 2, Batch: 12, Global Step: 266, Step Time: 5.6594, Loss: 39.2252
  467. Epoch: 2, Batch: 13, Global Step: 267, Step Time: 3.6512, Loss: 39.2225
  468. Epoch: 2, Batch: 14, Global Step: 268, Step Time: 4.4392, Loss: 39.2183
  469. Epoch: 2, Batch: 15, Global Step: 269, Step Time: 4.9514, Loss: 39.4945
  470. Epoch: 2, Batch: 16, Global Step: 270, Step Time: 5.4636, Loss: 39.5571
  471. Epoch: 2, Batch: 17, Global Step: 271, Step Time: 5.2954, Loss: 39.9544
  472. Epoch: 2, Batch: 18, Global Step: 272, Step Time: 4.4941, Loss: 39.7638
  473. Epoch: 2, Batch: 19, Global Step: 273, Step Time: 4.3949, Loss: 40.0443
  474. Epoch: 2, Batch: 20, Global Step: 274, Step Time: 6.8493, Loss: 39.8841
  475. Epoch: 2, Batch: 21, Global Step: 275, Step Time: 4.6276, Loss: 39.7400
  476. Epoch: 2, Batch: 22, Global Step: 276, Step Time: 5.7722, Loss: 39.7626
  477. Epoch: 2, Batch: 23, Global Step: 277, Step Time: 3.9320, Loss: 39.4605
  478. Epoch: 2, Batch: 24, Global Step: 278, Step Time: 4.4658, Loss: 40.4870
  479. Epoch: 2, Batch: 25, Global Step: 279, Step Time: 3.9503, Loss: 40.5820
  480. Epoch: 2, Batch: 26, Global Step: 280, Step Time: 4.1607, Loss: 40.6804
  481. Epoch: 2, Batch: 27, Global Step: 281, Step Time: 3.3614, Loss: 40.5798
  482. Epoch: 2, Batch: 28, Global Step: 282, Step Time: 3.4463, Loss: 40.4635
  483. Epoch: 2, Batch: 29, Global Step: 283, Step Time: 3.8246, Loss: 40.4433
  484. Epoch: 2, Batch: 30, Global Step: 284, Step Time: 4.2286, Loss: 40.4343
  485. Epoch: 2, Batch: 31, Global Step: 285, Step Time: 3.1467, Loss: 40.3360
  486. Epoch: 2, Batch: 32, Global Step: 286, Step Time: 3.9112, Loss: 40.3097
  487. Epoch: 2, Batch: 33, Global Step: 287, Step Time: 3.8944, Loss: 40.2224
  488. Epoch: 2, Batch: 34, Global Step: 288, Step Time: 4.0116, Loss: 40.2348
  489. Epoch: 2, Batch: 35, Global Step: 289, Step Time: 3.3753, Loss: 40.3689
  490. Epoch: 2, Batch: 36, Global Step: 290, Step Time: 3.8617, Loss: 40.3568
  491. Epoch: 2, Batch: 37, Global Step: 291, Step Time: 4.8852, Loss: 40.4396
  492. Epoch: 2, Batch: 38, Global Step: 292, Step Time: 5.3736, Loss: 40.6780
  493. Epoch: 2, Batch: 39, Global Step: 293, Step Time: 4.4003, Loss: 40.7289
  494. Epoch: 2, Batch: 40, Global Step: 294, Step Time: 2.7896, Loss: 40.6939
  495. Epoch: 2, Batch: 41, Global Step: 295, Step Time: 4.2666, Loss: 41.0762
  496. Epoch: 2, Batch: 42, Global Step: 296, Step Time: 4.2467, Loss: 41.1013
  497. Epoch: 2, Batch: 43, Global Step: 297, Step Time: 5.1698, Loss: 41.2424
  498. Epoch: 2, Batch: 44, Global Step: 298, Step Time: 5.6098, Loss: 41.3169
  499. Epoch: 2, Batch: 45, Global Step: 299, Step Time: 6.8951, Loss: 41.4604
  500. Epoch: 2, Batch: 46, Global Step: 300, Step Time: 4.8454, Loss: 41.3571
  501. Epoch: 2, Batch: 47, Global Step: 301, Step Time: 5.2363, Loss: 41.3566
  502. Epoch: 2, Batch: 48, Global Step: 302, Step Time: 4.0571, Loss: 41.4309
  503. Epoch: 2, Batch: 49, Global Step: 303, Step Time: 3.4727, Loss: 41.3353
  504. Epoch: 2, Batch: 50, Global Step: 304, Step Time: 5.4702, Loss: 41.3103
  505. Epoch: 2, Batch: 51, Global Step: 305, Step Time: 7.1278, Loss: 41.5024
  506. Epoch: 2, Batch: 52, Global Step: 306, Step Time: 6.0423, Loss: 41.5947
  507. Epoch: 2, Batch: 53, Global Step: 307, Step Time: 6.6917, Loss: 41.5907
  508. Epoch: 2, Batch: 54, Global Step: 308, Step Time: 5.6722, Loss: 41.5197
  509. Epoch: 2, Batch: 55, Global Step: 309, Step Time: 4.0301, Loss: 41.4711
  510. Epoch: 2, Batch: 56, Global Step: 310, Step Time: 3.7618, Loss: 41.3449
  511. Epoch: 2, Batch: 57, Global Step: 311, Step Time: 3.9413, Loss: 41.3919
  512. Epoch: 2, Batch: 58, Global Step: 312, Step Time: 3.5316, Loss: 41.3178
  513. Epoch: 2, Batch: 59, Global Step: 313, Step Time: 2.9582, Loss: 41.2493
  514. Epoch: 2, Batch: 60, Global Step: 314, Step Time: 3.6414, Loss: 41.2125
  515. Epoch: 2, Batch: 61, Global Step: 315, Step Time: 3.9427, Loss: 41.1587
  516. Epoch: 2, Batch: 62, Global Step: 316, Step Time: 4.1657, Loss: 41.0765
  517. Epoch: 2, Batch: 63, Global Step: 317, Step Time: 4.5555, Loss: 41.2035
  518. Epoch: 2, Batch: 64, Global Step: 318, Step Time: 7.0100, Loss: 41.2531
  519. Epoch: 2, Batch: 65, Global Step: 319, Step Time: 4.2438, Loss: 41.1717
  520. Epoch: 2, Batch: 66, Global Step: 320, Step Time: 2.9933, Loss: 41.0699
  521. Epoch: 2, Batch: 67, Global Step: 321, Step Time: 4.3095, Loss: 41.0808
  522. Epoch: 2, Batch: 68, Global Step: 322, Step Time: 4.6810, Loss: 41.0610
  523. Epoch: 2, Batch: 69, Global Step: 323, Step Time: 5.3478, Loss: 41.0103
  524. Epoch: 2, Batch: 70, Global Step: 324, Step Time: 7.6822, Loss: 41.1002
  525. Epoch: 2, Batch: 71, Global Step: 325, Step Time: 3.5995, Loss: 41.0642
  526. Epoch: 2, Batch: 72, Global Step: 326, Step Time: 3.6809, Loss: 40.9970
  527. Epoch: 2, Batch: 73, Global Step: 327, Step Time: 3.5218, Loss: 40.9618
  528. Epoch: 2, Batch: 74, Global Step: 328, Step Time: 4.1893, Loss: 40.9281
  529. Epoch: 2, Batch: 75, Global Step: 329, Step Time: 5.1102, Loss: 40.9420
  530. Epoch: 2, Batch: 76, Global Step: 330, Step Time: 5.8237, Loss: 40.9727
  531. Epoch: 2, Batch: 77, Global Step: 331, Step Time: 5.2777, Loss: 41.0009
  532. Epoch: 2, Batch: 78, Global Step: 332, Step Time: 4.6391, Loss: 40.9314
  533. Epoch: 2, Batch: 79, Global Step: 333, Step Time: 7.8239, Loss: 41.0494
  534. Epoch: 2, Batch: 80, Global Step: 334, Step Time: 5.0702, Loss: 41.0304
  535. Epoch: 2, Batch: 81, Global Step: 335, Step Time: 4.3572, Loss: 41.0333
  536. Epoch: 2, Batch: 82, Global Step: 336, Step Time: 5.7401, Loss: 41.0782
  537. Epoch: 2, Batch: 83, Global Step: 337, Step Time: 4.6148, Loss: 41.0795
  538. Epoch: 2, Batch: 84, Global Step: 338, Step Time: 7.3801, Loss: 41.3165
  539. Epoch: 2, Batch: 85, Global Step: 339, Step Time: 4.0057, Loss: 41.2828
  540. Epoch: 2, Batch: 86, Global Step: 340, Step Time: 5.6832, Loss: 41.4272
  541. Epoch: 2, Batch: 87, Global Step: 341, Step Time: 5.1649, Loss: 41.3652
  542. Epoch: 2, Batch: 88, Global Step: 342, Step Time: 5.3300, Loss: 41.3417
  543. Epoch: 2, Batch: 89, Global Step: 343, Step Time: 5.4672, Loss: 41.3932
  544. Epoch: 2, Batch: 90, Global Step: 344, Step Time: 3.8167, Loss: 41.4028
  545. Epoch: 2, Batch: 91, Global Step: 345, Step Time: 3.7662, Loss: 41.3587
  546. Epoch: 2, Batch: 92, Global Step: 346, Step Time: 5.3053, Loss: 41.3975
  547. Epoch: 2, Batch: 93, Global Step: 347, Step Time: 4.5094, Loss: 41.4333
  548. Epoch: 2, Batch: 94, Global Step: 348, Step Time: 3.1447, Loss: 41.3982
  549. Epoch: 2, Batch: 95, Global Step: 349, Step Time: 4.0669, Loss: 41.3732
  550. Epoch: 2, Batch: 96, Global Step: 350, Step Time: 3.8007, Loss: 41.3646
  551. Epoch: 2, Batch: 97, Global Step: 351, Step Time: 5.4291, Loss: 41.4242
  552. Epoch: 2, Batch: 98, Global Step: 352, Step Time: 3.3652, Loss: 41.3475
  553. Epoch: 2, Batch: 99, Global Step: 353, Step Time: 5.9693, Loss: 41.3577
  554. Epoch: 2, Batch: 100, Global Step: 354, Step Time: 6.0675, Loss: 41.3427
  555. Epoch: 2, Batch: 101, Global Step: 355, Step Time: 3.9528, Loss: 41.3308
  556. Epoch: 2, Batch: 102, Global Step: 356, Step Time: 6.6456, Loss: 41.3281
  557. Epoch: 2, Batch: 103, Global Step: 357, Step Time: 3.6920, Loss: 41.2613
  558. Epoch: 2, Batch: 104, Global Step: 358, Step Time: 3.3239, Loss: 41.2708
  559. Epoch: 2, Batch: 105, Global Step: 359, Step Time: 6.0643, Loss: 41.2522
  560. Epoch: 2, Batch: 106, Global Step: 360, Step Time: 6.2988, Loss: 41.2561
  561. Epoch: 2, Batch: 107, Global Step: 361, Step Time: 4.5236, Loss: 41.2722
  562. Epoch: 2, Batch: 108, Global Step: 362, Step Time: 4.7087, Loss: 41.2411
  563. Epoch: 2, Batch: 109, Global Step: 363, Step Time: 4.8021, Loss: 41.2543
  564. Epoch: 2, Batch: 110, Global Step: 364, Step Time: 3.0913, Loss: 41.2297
  565. Epoch: 2, Batch: 111, Global Step: 365, Step Time: 4.3130, Loss: 41.2224
  566. Epoch: 2, Batch: 112, Global Step: 366, Step Time: 3.4551, Loss: 41.1921
  567. Epoch: 2, Batch: 113, Global Step: 367, Step Time: 5.9321, Loss: 41.2217
  568. Epoch: 2, Batch: 114, Global Step: 368, Step Time: 6.0243, Loss: 41.2793
  569. Epoch: 2, Batch: 115, Global Step: 369, Step Time: 3.5188, Loss: 41.2405
  570. Epoch: 2, Batch: 116, Global Step: 370, Step Time: 3.3373, Loss: 41.2251
  571. Epoch: 2, Batch: 117, Global Step: 371, Step Time: 3.7438, Loss: 41.1938
  572. Epoch: 2, Batch: 118, Global Step: 372, Step Time: 5.0989, Loss: 41.1992
  573. Epoch: 2, Batch: 119, Global Step: 373, Step Time: 6.1196, Loss: 41.2131
  574. Epoch: 2, Batch: 120, Global Step: 374, Step Time: 3.9878, Loss: 41.2119
  575. Epoch: 2, Batch: 121, Global Step: 375, Step Time: 4.4308, Loss: 41.1445
  576. Epoch: 2, Batch: 122, Global Step: 376, Step Time: 5.2752, Loss: 41.1480
  577. Epoch: 2, Batch: 123, Global Step: 377, Step Time: 6.0991, Loss: 41.2984
  578. Epoch: 2, Batch: 124, Global Step: 378, Step Time: 4.5530, Loss: 41.2673
  579. Epoch: 2, Batch: 125, Global Step: 379, Step Time: 4.0597, Loss: 41.2732
  580. Epoch: 2, Batch: 126, Global Step: 380, Step Time: 3.7878, Loss: 41.2008
  581. EPOCH RESULTS: Loss: 41.2008
  582. Epoch: 3, Batch: 0, Global Step: 381, Step Time: 4.4653, Loss: 39.3902
  583. Epoch: 3, Batch: 1, Global Step: 382, Step Time: 3.3415, Loss: 38.6367
  584. Epoch: 3, Batch: 2, Global Step: 383, Step Time: 3.4099, Loss: 36.8730
  585. Epoch: 3, Batch: 3, Global Step: 384, Step Time: 3.4408, Loss: 36.8442
  586. Epoch: 3, Batch: 4, Global Step: 385, Step Time: 3.3244, Loss: 36.7540
  587. Epoch: 3, Batch: 5, Global Step: 386, Step Time: 4.7720, Loss: 37.1077
  588. Epoch: 3, Batch: 6, Global Step: 387, Step Time: 3.2880, Loss: 36.5537
  589. Epoch: 3, Batch: 7, Global Step: 388, Step Time: 3.0916, Loss: 36.7459
  590. Epoch: 3, Batch: 8, Global Step: 389, Step Time: 4.0424, Loss: 36.9612
  591. Epoch: 3, Batch: 9, Global Step: 390, Step Time: 3.9976, Loss: 37.5569
  592. Epoch: 3, Batch: 10, Global Step: 391, Step Time: 4.6024, Loss: 38.3726
  593. Epoch: 3, Batch: 11, Global Step: 392, Step Time: 4.8290, Loss: 38.8670
  594. Epoch: 3, Batch: 12, Global Step: 393, Step Time: 5.7244, Loss: 38.7801
  595. Epoch: 3, Batch: 13, Global Step: 394, Step Time: 3.7788, Loss: 38.7707
  596. Epoch: 3, Batch: 14, Global Step: 395, Step Time: 4.3552, Loss: 38.7597
  597. Epoch: 3, Batch: 15, Global Step: 396, Step Time: 4.8720, Loss: 39.0298
  598. Epoch: 3, Batch: 16, Global Step: 397, Step Time: 5.3412, Loss: 39.0862
  599. Epoch: 3, Batch: 17, Global Step: 398, Step Time: 5.1959, Loss: 39.4664
  600. Epoch: 3, Batch: 18, Global Step: 399, Step Time: 4.3947, Loss: 39.2818
  601. Epoch: 3, Batch: 19, Global Step: 400, Step Time: 4.4843, Loss: 39.5555
  602. Epoch: 3, Batch: 20, Global Step: 401, Step Time: 7.1046, Loss: 39.3993
  603. Epoch: 3, Batch: 21, Global Step: 402, Step Time: 4.5561, Loss: 39.2551
  604. Epoch: 3, Batch: 22, Global Step: 403, Step Time: 5.6046, Loss: 39.2731
  605. Epoch: 3, Batch: 23, Global Step: 404, Step Time: 3.9587, Loss: 38.9786
  606. Epoch: 3, Batch: 24, Global Step: 405, Step Time: 4.4934, Loss: 39.9934
  607. Epoch: 3, Batch: 25, Global Step: 406, Step Time: 4.0080, Loss: 40.0834
  608. Epoch: 3, Batch: 26, Global Step: 407, Step Time: 4.0543, Loss: 40.1868
  609. Epoch: 3, Batch: 27, Global Step: 408, Step Time: 3.3677, Loss: 40.0916
  610. Epoch: 3, Batch: 28, Global Step: 409, Step Time: 3.4211, Loss: 39.9842
  611. Epoch: 3, Batch: 29, Global Step: 410, Step Time: 3.9487, Loss: 39.9625
  612. Epoch: 3, Batch: 30, Global Step: 411, Step Time: 4.2128, Loss: 39.9529
  613. Epoch: 3, Batch: 31, Global Step: 412, Step Time: 3.2328, Loss: 39.8539
  614. Epoch: 3, Batch: 32, Global Step: 413, Step Time: 3.8767, Loss: 39.8306
  615. Epoch: 3, Batch: 33, Global Step: 414, Step Time: 4.0363, Loss: 39.7481
  616. Epoch: 3, Batch: 34, Global Step: 415, Step Time: 4.0685, Loss: 39.7596
  617. Epoch: 3, Batch: 35, Global Step: 416, Step Time: 3.5523, Loss: 39.8918
  618. Epoch: 3, Batch: 36, Global Step: 417, Step Time: 3.8046, Loss: 39.8811
  619. Epoch: 3, Batch: 37, Global Step: 418, Step Time: 5.0025, Loss: 39.9673
  620. Epoch: 3, Batch: 38, Global Step: 419, Step Time: 5.3612, Loss: 40.2012
  621. Epoch: 3, Batch: 39, Global Step: 420, Step Time: 4.5088, Loss: 40.2552
  622. Epoch: 3, Batch: 40, Global Step: 421, Step Time: 2.8000, Loss: 40.2233
  623. Epoch: 3, Batch: 41, Global Step: 422, Step Time: 4.2694, Loss: 40.6024
  624. Epoch: 3, Batch: 42, Global Step: 423, Step Time: 4.2090, Loss: 40.6272
  625. Epoch: 3, Batch: 43, Global Step: 424, Step Time: 5.0784, Loss: 40.7707
  626. Epoch: 3, Batch: 44, Global Step: 425, Step Time: 5.5291, Loss: 40.8466
  627. Epoch: 3, Batch: 45, Global Step: 426, Step Time: 7.1229, Loss: 40.9901
  628. Epoch: 3, Batch: 46, Global Step: 427, Step Time: 4.8091, Loss: 40.8896
  629. Epoch: 3, Batch: 47, Global Step: 428, Step Time: 5.2222, Loss: 40.8892
  630. Epoch: 3, Batch: 48, Global Step: 429, Step Time: 4.1766, Loss: 40.9634
  631. Epoch: 3, Batch: 49, Global Step: 430, Step Time: 3.4245, Loss: 40.8671
  632. Epoch: 3, Batch: 50, Global Step: 431, Step Time: 5.4247, Loss: 40.8428
  633. Epoch: 3, Batch: 51, Global Step: 432, Step Time: 7.0072, Loss: 41.0375
  634. Epoch: 3, Batch: 52, Global Step: 433, Step Time: 5.8636, Loss: 41.1360
  635. Epoch: 3, Batch: 53, Global Step: 434, Step Time: 6.7931, Loss: 41.1323
  636. Epoch: 3, Batch: 54, Global Step: 435, Step Time: 5.8230, Loss: 41.0606
  637. Epoch: 3, Batch: 55, Global Step: 436, Step Time: 3.9484, Loss: 41.0080
  638. Epoch: 3, Batch: 56, Global Step: 437, Step Time: 3.6862, Loss: 40.8862
  639. Epoch: 3, Batch: 57, Global Step: 438, Step Time: 4.0293, Loss: 40.9309
  640. Epoch: 3, Batch: 58, Global Step: 439, Step Time: 3.6587, Loss: 40.8591
  641. Epoch: 3, Batch: 59, Global Step: 440, Step Time: 3.0570, Loss: 40.7906
  642. Epoch: 3, Batch: 60, Global Step: 441, Step Time: 3.7774, Loss: 40.7546
  643. Epoch: 3, Batch: 61, Global Step: 442, Step Time: 4.0834, Loss: 40.7041
  644. Epoch: 3, Batch: 62, Global Step: 443, Step Time: 4.1510, Loss: 40.6227
  645. Epoch: 3, Batch: 63, Global Step: 444, Step Time: 4.4490, Loss: 40.7505
  646. Epoch: 3, Batch: 64, Global Step: 445, Step Time: 7.1891, Loss: 40.8022
  647. Epoch: 3, Batch: 65, Global Step: 446, Step Time: 4.3835, Loss: 40.7237
  648. Epoch: 3, Batch: 66, Global Step: 447, Step Time: 3.0571, Loss: 40.6236
  649. Epoch: 3, Batch: 67, Global Step: 448, Step Time: 4.3249, Loss: 40.6350
  650. Epoch: 3, Batch: 68, Global Step: 449, Step Time: 4.8336, Loss: 40.6165
  651. Epoch: 3, Batch: 69, Global Step: 450, Step Time: 5.5771, Loss: 40.5662
  652. Epoch: 3, Batch: 70, Global Step: 451, Step Time: 8.1294, Loss: 40.6580
  653. Epoch: 3, Batch: 71, Global Step: 452, Step Time: 3.7150, Loss: 40.6211
  654. Epoch: 3, Batch: 72, Global Step: 453, Step Time: 3.8494, Loss: 40.5543
  655. Epoch: 3, Batch: 73, Global Step: 454, Step Time: 3.6925, Loss: 40.5195
  656. Epoch: 3, Batch: 74, Global Step: 455, Step Time: 4.3829, Loss: 40.4866
  657. Epoch: 3, Batch: 75, Global Step: 456, Step Time: 5.1236, Loss: 40.4986
  658. Epoch: 3, Batch: 76, Global Step: 457, Step Time: 6.0057, Loss: 40.5303
  659. Epoch: 3, Batch: 77, Global Step: 458, Step Time: 5.4515, Loss: 40.5587
  660. Epoch: 3, Batch: 78, Global Step: 459, Step Time: 4.7528, Loss: 40.4919
  661. Epoch: 3, Batch: 79, Global Step: 460, Step Time: 8.1887, Loss: 40.6097
  662. Epoch: 3, Batch: 80, Global Step: 461, Step Time: 5.1943, Loss: 40.5896
  663. Epoch: 3, Batch: 81, Global Step: 462, Step Time: 4.3864, Loss: 40.5920
  664. Epoch: 3, Batch: 82, Global Step: 463, Step Time: 5.6021, Loss: 40.6378
  665. Epoch: 3, Batch: 83, Global Step: 464, Step Time: 4.4776, Loss: 40.6386
  666. Epoch: 3, Batch: 84, Global Step: 465, Step Time: 7.5677, Loss: 40.8759
  667. Epoch: 3, Batch: 85, Global Step: 466, Step Time: 4.2053, Loss: 40.8447
  668. Epoch: 3, Batch: 86, Global Step: 467, Step Time: 5.9381, Loss: 40.9884
  669. Epoch: 3, Batch: 87, Global Step: 468, Step Time: 5.3676, Loss: 40.9278
  670. Epoch: 3, Batch: 88, Global Step: 469, Step Time: 5.5453, Loss: 40.9050
  671. Epoch: 3, Batch: 89, Global Step: 470, Step Time: 5.6899, Loss: 40.9571
  672. Epoch: 3, Batch: 90, Global Step: 471, Step Time: 3.9734, Loss: 40.9674
  673. Epoch: 3, Batch: 91, Global Step: 472, Step Time: 3.7486, Loss: 40.9224
  674. Epoch: 3, Batch: 92, Global Step: 473, Step Time: 5.4860, Loss: 40.9628
  675. Epoch: 3, Batch: 93, Global Step: 474, Step Time: 4.6673, Loss: 41.0000
  676. Epoch: 3, Batch: 94, Global Step: 475, Step Time: 3.3006, Loss: 40.9648
  677. Epoch: 3, Batch: 95, Global Step: 476, Step Time: 4.0935, Loss: 40.9412
  678. Epoch: 3, Batch: 96, Global Step: 477, Step Time: 3.7583, Loss: 40.9338
  679. Epoch: 3, Batch: 97, Global Step: 478, Step Time: 5.5984, Loss: 40.9950
  680. Epoch: 3, Batch: 98, Global Step: 479, Step Time: 3.3683, Loss: 40.9204
  681. Epoch: 3, Batch: 99, Global Step: 480, Step Time: 6.1148, Loss: 40.9304
  682. Epoch: 3, Batch: 100, Global Step: 481, Step Time: 6.2695, Loss: 40.9160
  683. Epoch: 3, Batch: 101, Global Step: 482, Step Time: 3.9469, Loss: 40.9030
  684. Epoch: 3, Batch: 102, Global Step: 483, Step Time: 6.5052, Loss: 40.9011
  685. Epoch: 3, Batch: 103, Global Step: 484, Step Time: 3.7842, Loss: 40.8358
  686. Epoch: 3, Batch: 104, Global Step: 485, Step Time: 3.3073, Loss: 40.8458
  687. Epoch: 3, Batch: 105, Global Step: 486, Step Time: 6.0331, Loss: 40.8285
  688. Epoch: 3, Batch: 106, Global Step: 487, Step Time: 6.2645, Loss: 40.8339
  689. Epoch: 3, Batch: 107, Global Step: 488, Step Time: 4.4007, Loss: 40.8512
  690. Epoch: 3, Batch: 108, Global Step: 489, Step Time: 4.6935, Loss: 40.8206
  691. Epoch: 3, Batch: 109, Global Step: 490, Step Time: 4.7305, Loss: 40.8329
  692. Epoch: 3, Batch: 110, Global Step: 491, Step Time: 3.1571, Loss: 40.8091
  693. Epoch: 3, Batch: 111, Global Step: 492, Step Time: 4.4828, Loss: 40.8027
  694. Epoch: 3, Batch: 112, Global Step: 493, Step Time: 3.5989, Loss: 40.7734
  695. Epoch: 3, Batch: 113, Global Step: 494, Step Time: 6.2359, Loss: 40.8034
  696. Epoch: 3, Batch: 114, Global Step: 495, Step Time: 6.2920, Loss: 40.8621
  697. Epoch: 3, Batch: 115, Global Step: 496, Step Time: 3.6816, Loss: 40.8228
  698. Epoch: 3, Batch: 116, Global Step: 497, Step Time: 3.5079, Loss: 40.8082
  699. Epoch: 3, Batch: 117, Global Step: 498, Step Time: 3.7194, Loss: 40.7774
  700. Epoch: 3, Batch: 118, Global Step: 499, Step Time: 5.0434, Loss: 40.7816
  701. Epoch: 3, Batch: 119, Global Step: 500, Step Time: 6.0545, Loss: 40.7949
  702. Epoch: 3, Batch: 120, Global Step: 501, Step Time: 4.1683, Loss: 40.7948
  703. Epoch: 3, Batch: 121, Global Step: 502, Step Time: 4.4666, Loss: 40.7283
  704. Epoch: 3, Batch: 122, Global Step: 503, Step Time: 5.2393, Loss: 40.7329
  705. Epoch: 3, Batch: 123, Global Step: 504, Step Time: 6.0687, Loss: 40.8844
  706. Epoch: 3, Batch: 124, Global Step: 505, Step Time: 4.5529, Loss: 40.8526
  707. Epoch: 3, Batch: 125, Global Step: 506, Step Time: 3.9933, Loss: 40.8582
  708. Epoch: 3, Batch: 126, Global Step: 507, Step Time: 3.7523, Loss: 40.7867
  709. EPOCH RESULTS: Loss: 40.7867
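The per-step lines and per-epoch summaries above follow a fixed format (`Epoch: …, Batch: …, Global Step: …, Step Time: …, Loss: …`), so a saved copy of this log can be turned into a loss curve with a short script. The sketch below is not part of the rnnt-speech-recognition repo; it is a minimal standalone parser, assuming only the exact field layout shown in this log.

```python
import re

# Matches the per-step lines in this training log, e.g.
# "Epoch: 3, Batch: 126, Global Step: 507, Step Time: 3.7523, Loss: 40.7867"
STEP_RE = re.compile(
    r"Epoch: (\d+), Batch: (\d+), Global Step: (\d+), "
    r"Step Time: ([\d.]+), Loss: ([\d.]+)"
)

def parse_steps(log_text):
    """Return one dict per training step found in log_text."""
    return [
        {
            "epoch": int(m.group(1)),
            "batch": int(m.group(2)),
            "global_step": int(m.group(3)),
            "step_time": float(m.group(4)),
            "loss": float(m.group(5)),
        }
        for m in STEP_RE.finditer(log_text)
    ]

# Small excerpt from the log above, used as a self-contained demo.
sample = """\
Epoch: 3, Batch: 125, Global Step: 506, Step Time: 3.9933, Loss: 40.8582
Epoch: 3, Batch: 126, Global Step: 507, Step Time: 3.7523, Loss: 40.7867
EPOCH RESULTS: Loss: 40.7867
"""

steps = parse_steps(sample)
print(len(steps), steps[-1]["loss"])  # 2 40.7867
```

The same pattern applied to the full paste yields the running-average loss per global step, which makes the slow epoch-over-epoch decline (51.80 → 41.95 → 41.20 → 40.79) easy to plot.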
  710. Epoch: 4, Batch: 0, Global Step: 508, Step Time: 4.6945, Loss: 39.3347
  711. Epoch: 4, Batch: 1, Global Step: 509, Step Time: 3.2919, Loss: 38.2704
  712. Epoch: 4, Batch: 2, Global Step: 510, Step Time: 3.3792, Loss: 36.4792
  713. Epoch: 4, Batch: 3, Global Step: 511, Step Time: 3.3930, Loss: 36.4652
  714. Epoch: 4, Batch: 4, Global Step: 512, Step Time: 3.4752, Loss: 36.3953
  715. Epoch: 4, Batch: 5, Global Step: 513, Step Time: 4.8011, Loss: 36.7786
  716. Epoch: 4, Batch: 6, Global Step: 514, Step Time: 3.4422, Loss: 36.2172
Epoch: 4, Batch: 7, Global Step: 515, Step Time: 3.2879, Loss: 36.3910
Epoch: 4, Batch: 8, Global Step: 516, Step Time: 4.2492, Loss: 36.6179
Epoch: 4, Batch: 9, Global Step: 517, Step Time: 4.1665, Loss: 37.2365
Epoch: 4, Batch: 10, Global Step: 518, Step Time: 4.8060, Loss: 38.0573
Epoch: 4, Batch: 11, Global Step: 519, Step Time: 4.9946, Loss: 38.5460
Epoch: 4, Batch: 12, Global Step: 520, Step Time: 5.7167, Loss: 38.4700
Epoch: 4, Batch: 13, Global Step: 521, Step Time: 3.6798, Loss: 38.4499
Epoch: 4, Batch: 14, Global Step: 522, Step Time: 4.4752, Loss: 38.4444
Epoch: 4, Batch: 15, Global Step: 523, Step Time: 5.0038, Loss: 38.7144
Epoch: 4, Batch: 16, Global Step: 524, Step Time: 5.5255, Loss: 38.7631
Epoch: 4, Batch: 17, Global Step: 525, Step Time: 5.3819, Loss: 39.1387
Epoch: 4, Batch: 18, Global Step: 526, Step Time: 4.5187, Loss: 38.9537
Epoch: 4, Batch: 19, Global Step: 527, Step Time: 4.4579, Loss: 39.2234
Epoch: 4, Batch: 20, Global Step: 528, Step Time: 6.9045, Loss: 39.0755
Epoch: 4, Batch: 21, Global Step: 529, Step Time: 4.7414, Loss: 38.9342
Epoch: 4, Batch: 22, Global Step: 530, Step Time: 5.8800, Loss: 38.9442
Epoch: 4, Batch: 23, Global Step: 531, Step Time: 4.0209, Loss: 38.6537
Epoch: 4, Batch: 24, Global Step: 532, Step Time: 4.5596, Loss: 39.6711
Epoch: 4, Batch: 25, Global Step: 533, Step Time: 4.0366, Loss: 39.7669
Epoch: 4, Batch: 26, Global Step: 534, Step Time: 4.2482, Loss: 39.8627
Epoch: 4, Batch: 27, Global Step: 535, Step Time: 3.4252, Loss: 39.7549
Epoch: 4, Batch: 28, Global Step: 536, Step Time: 3.5296, Loss: 39.6543
Epoch: 4, Batch: 29, Global Step: 537, Step Time: 3.9224, Loss: 39.6403
Epoch: 4, Batch: 30, Global Step: 538, Step Time: 4.2836, Loss: 39.6362
Epoch: 4, Batch: 31, Global Step: 539, Step Time: 3.2784, Loss: 39.5304
Epoch: 4, Batch: 32, Global Step: 540, Step Time: 4.0151, Loss: 39.5012
Epoch: 4, Batch: 33, Global Step: 541, Step Time: 3.9914, Loss: 39.4198
Epoch: 4, Batch: 34, Global Step: 542, Step Time: 4.0776, Loss: 39.4347
Epoch: 4, Batch: 35, Global Step: 543, Step Time: 3.5370, Loss: 39.5751
Epoch: 4, Batch: 36, Global Step: 544, Step Time: 3.9421, Loss: 39.5667
Epoch: 4, Batch: 37, Global Step: 545, Step Time: 5.0012, Loss: 39.6490
Epoch: 4, Batch: 38, Global Step: 546, Step Time: 5.4343, Loss: 39.8744
Epoch: 4, Batch: 39, Global Step: 547, Step Time: 4.5527, Loss: 39.9376
Epoch: 4, Batch: 40, Global Step: 548, Step Time: 2.8472, Loss: 39.9158
Epoch: 4, Batch: 41, Global Step: 549, Step Time: 4.4099, Loss: 40.2800
Epoch: 4, Batch: 42, Global Step: 550, Step Time: 4.3679, Loss: 40.3022
Epoch: 4, Batch: 43, Global Step: 551, Step Time: 5.2652, Loss: 40.4452
Epoch: 4, Batch: 44, Global Step: 552, Step Time: 5.7846, Loss: 40.5202
Epoch: 4, Batch: 45, Global Step: 553, Step Time: 7.1070, Loss: 40.6644
Epoch: 4, Batch: 46, Global Step: 554, Step Time: 4.9905, Loss: 40.5674
Epoch: 4, Batch: 47, Global Step: 555, Step Time: 5.4778, Loss: 40.5672
Epoch: 4, Batch: 48, Global Step: 556, Step Time: 4.1951, Loss: 40.6402
Epoch: 4, Batch: 49, Global Step: 557, Step Time: 3.5624, Loss: 40.5426
Epoch: 4, Batch: 50, Global Step: 558, Step Time: 5.6276, Loss: 40.5176
Epoch: 4, Batch: 51, Global Step: 559, Step Time: 7.2970, Loss: 40.7126
Epoch: 4, Batch: 52, Global Step: 560, Step Time: 6.1194, Loss: 40.8131
Epoch: 4, Batch: 53, Global Step: 561, Step Time: 6.7657, Loss: 40.8091
Epoch: 4, Batch: 54, Global Step: 562, Step Time: 5.7007, Loss: 40.7380
Epoch: 4, Batch: 55, Global Step: 563, Step Time: 4.0386, Loss: 40.6835
Epoch: 4, Batch: 56, Global Step: 564, Step Time: 3.8458, Loss: 40.5629
Epoch: 4, Batch: 57, Global Step: 565, Step Time: 4.0288, Loss: 40.6056
Epoch: 4, Batch: 58, Global Step: 566, Step Time: 3.6421, Loss: 40.5363
Epoch: 4, Batch: 59, Global Step: 567, Step Time: 3.0216, Loss: 40.4695
Epoch: 4, Batch: 60, Global Step: 568, Step Time: 3.7197, Loss: 40.4346
Epoch: 4, Batch: 61, Global Step: 569, Step Time: 4.0198, Loss: 40.3867
Epoch: 4, Batch: 62, Global Step: 570, Step Time: 4.2515, Loss: 40.3054
Epoch: 4, Batch: 63, Global Step: 571, Step Time: 4.6415, Loss: 40.4303
Epoch: 4, Batch: 64, Global Step: 572, Step Time: 7.2387, Loss: 40.4825
Epoch: 4, Batch: 65, Global Step: 573, Step Time: 4.2974, Loss: 40.4075
Epoch: 4, Batch: 66, Global Step: 574, Step Time: 3.0626, Loss: 40.3084
Epoch: 4, Batch: 67, Global Step: 575, Step Time: 4.3321, Loss: 40.3202
Epoch: 4, Batch: 68, Global Step: 576, Step Time: 4.8531, Loss: 40.3033
Epoch: 4, Batch: 69, Global Step: 577, Step Time: 5.4512, Loss: 40.2538
Epoch: 4, Batch: 70, Global Step: 578, Step Time: 7.8554, Loss: 40.3471
Epoch: 4, Batch: 71, Global Step: 579, Step Time: 3.6831, Loss: 40.3090
Epoch: 4, Batch: 72, Global Step: 580, Step Time: 3.6881, Loss: 40.2437
Epoch: 4, Batch: 73, Global Step: 581, Step Time: 3.6078, Loss: 40.2090
Epoch: 4, Batch: 74, Global Step: 582, Step Time: 4.3581, Loss: 40.1767
Epoch: 4, Batch: 75, Global Step: 583, Step Time: 5.2119, Loss: 40.1874
Epoch: 4, Batch: 76, Global Step: 584, Step Time: 5.9491, Loss: 40.2191
Epoch: 4, Batch: 77, Global Step: 585, Step Time: 5.4209, Loss: 40.2481
Epoch: 4, Batch: 78, Global Step: 586, Step Time: 4.7501, Loss: 40.1824
Epoch: 4, Batch: 79, Global Step: 587, Step Time: 7.9794, Loss: 40.2996
Epoch: 4, Batch: 80, Global Step: 588, Step Time: 5.0695, Loss: 40.2784
Epoch: 4, Batch: 81, Global Step: 589, Step Time: 4.3918, Loss: 40.2806
Epoch: 4, Batch: 82, Global Step: 590, Step Time: 5.7804, Loss: 40.3262
Epoch: 4, Batch: 83, Global Step: 591, Step Time: 4.6972, Loss: 40.3263
Epoch: 4, Batch: 84, Global Step: 592, Step Time: 7.5264, Loss: 40.5605
Epoch: 4, Batch: 85, Global Step: 593, Step Time: 4.1923, Loss: 40.5310
Epoch: 4, Batch: 86, Global Step: 594, Step Time: 5.7295, Loss: 40.6744
Epoch: 4, Batch: 87, Global Step: 595, Step Time: 5.2320, Loss: 40.6147
Epoch: 4, Batch: 88, Global Step: 596, Step Time: 5.4313, Loss: 40.5932
Epoch: 4, Batch: 89, Global Step: 597, Step Time: 5.5692, Loss: 40.6455
Epoch: 4, Batch: 90, Global Step: 598, Step Time: 3.9225, Loss: 40.6562
Epoch: 4, Batch: 91, Global Step: 599, Step Time: 3.8354, Loss: 40.6103
Epoch: 4, Batch: 92, Global Step: 600, Step Time: 5.4306, Loss: 40.6519
Epoch: 4, Batch: 93, Global Step: 601, Step Time: 4.6369, Loss: 40.6904
Epoch: 4, Batch: 94, Global Step: 602, Step Time: 3.2261, Loss: 40.6547
Epoch: 4, Batch: 95, Global Step: 603, Step Time: 4.1552, Loss: 40.6315
Epoch: 4, Batch: 96, Global Step: 604, Step Time: 3.9397, Loss: 40.6250
Epoch: 4, Batch: 97, Global Step: 605, Step Time: 5.6728, Loss: 40.6872
Epoch: 4, Batch: 98, Global Step: 606, Step Time: 3.5423, Loss: 40.6143
Epoch: 4, Batch: 99, Global Step: 607, Step Time: 6.0517, Loss: 40.6242
Epoch: 4, Batch: 100, Global Step: 608, Step Time: 6.0865, Loss: 40.6100
Epoch: 4, Batch: 101, Global Step: 609, Step Time: 4.0632, Loss: 40.5962
Epoch: 4, Batch: 102, Global Step: 610, Step Time: 6.7907, Loss: 40.5952
Epoch: 4, Batch: 103, Global Step: 611, Step Time: 3.7193, Loss: 40.5312
Epoch: 4, Batch: 104, Global Step: 612, Step Time: 3.4022, Loss: 40.5423
Epoch: 4, Batch: 105, Global Step: 613, Step Time: 6.1923, Loss: 40.5261
Epoch: 4, Batch: 106, Global Step: 614, Step Time: 6.4254, Loss: 40.5323
Epoch: 4, Batch: 107, Global Step: 615, Step Time: 4.5694, Loss: 40.5508
Epoch: 4, Batch: 108, Global Step: 616, Step Time: 4.8162, Loss: 40.5212
Epoch: 4, Batch: 109, Global Step: 617, Step Time: 4.9086, Loss: 40.5330
Epoch: 4, Batch: 110, Global Step: 618, Step Time: 3.1865, Loss: 40.5094
Epoch: 4, Batch: 111, Global Step: 619, Step Time: 4.4120, Loss: 40.5036
Epoch: 4, Batch: 112, Global Step: 620, Step Time: 3.5717, Loss: 40.4751
Epoch: 4, Batch: 113, Global Step: 621, Step Time: 6.0481, Loss: 40.5060
Epoch: 4, Batch: 114, Global Step: 622, Step Time: 6.0876, Loss: 40.5661
Epoch: 4, Batch: 115, Global Step: 623, Step Time: 3.5687, Loss: 40.5256
Epoch: 4, Batch: 116, Global Step: 624, Step Time: 3.4468, Loss: 40.5111
Epoch: 4, Batch: 117, Global Step: 625, Step Time: 3.8451, Loss: 40.4814
Epoch: 4, Batch: 118, Global Step: 626, Step Time: 5.1796, Loss: 40.4850
Epoch: 4, Batch: 119, Global Step: 627, Step Time: 6.2548, Loss: 40.4972
Epoch: 4, Batch: 120, Global Step: 628, Step Time: 4.0511, Loss: 40.4976
Epoch: 4, Batch: 121, Global Step: 629, Step Time: 4.5272, Loss: 40.4316
Epoch: 4, Batch: 122, Global Step: 630, Step Time: 5.4158, Loss: 40.4372
Epoch: 4, Batch: 123, Global Step: 631, Step Time: 6.3174, Loss: 40.5897
Epoch: 4, Batch: 124, Global Step: 632, Step Time: 4.7191, Loss: 40.5565
Epoch: 4, Batch: 125, Global Step: 633, Step Time: 4.1113, Loss: 40.5610
Epoch: 4, Batch: 126, Global Step: 634, Step Time: 3.8520, Loss: 40.4895
EPOCH RESULTS: Loss: 40.4895
Epoch: 5, Batch: 0, Global Step: 635, Step Time: 4.5296, Loss: 39.4557
Epoch: 5, Batch: 1, Global Step: 636, Step Time: 3.3489, Loss: 38.0812
Epoch: 5, Batch: 2, Global Step: 637, Step Time: 3.4735, Loss: 36.2207
Epoch: 5, Batch: 3, Global Step: 638, Step Time: 3.4867, Loss: 36.1746
Epoch: 5, Batch: 4, Global Step: 639, Step Time: 3.4556, Loss: 36.1053
Epoch: 5, Batch: 5, Global Step: 640, Step Time: 4.8442, Loss: 36.5097
Epoch: 5, Batch: 6, Global Step: 641, Step Time: 3.3300, Loss: 35.9586
Epoch: 5, Batch: 7, Global Step: 642, Step Time: 3.1790, Loss: 36.1227
Epoch: 5, Batch: 8, Global Step: 643, Step Time: 4.0435, Loss: 36.3463
Epoch: 5, Batch: 9, Global Step: 644, Step Time: 4.0518, Loss: 36.9728
Epoch: 5, Batch: 10, Global Step: 645, Step Time: 4.5953, Loss: 37.8046
Epoch: 5, Batch: 11, Global Step: 646, Step Time: 4.7922, Loss: 38.2984
Epoch: 5, Batch: 12, Global Step: 647, Step Time: 5.8151, Loss: 38.2349
Epoch: 5, Batch: 13, Global Step: 648, Step Time: 3.7665, Loss: 38.2007
Epoch: 5, Batch: 14, Global Step: 649, Step Time: 4.4099, Loss: 38.1959
Epoch: 5, Batch: 15, Global Step: 650, Step Time: 4.8700, Loss: 38.4704
Epoch: 5, Batch: 16, Global Step: 651, Step Time: 5.3930, Loss: 38.5168
Epoch: 5, Batch: 17, Global Step: 652, Step Time: 5.2601, Loss: 38.8948
Epoch: 5, Batch: 18, Global Step: 653, Step Time: 4.4492, Loss: 38.7024
Epoch: 5, Batch: 19, Global Step: 654, Step Time: 4.4914, Loss: 38.9672
Epoch: 5, Batch: 20, Global Step: 655, Step Time: 7.1655, Loss: 38.8229
Epoch: 5, Batch: 21, Global Step: 656, Step Time: 4.7056, Loss: 38.6873
Epoch: 5, Batch: 22, Global Step: 657, Step Time: 5.6717, Loss: 38.6928
Epoch: 5, Batch: 23, Global Step: 658, Step Time: 4.0191, Loss: 38.4057
Epoch: 5, Batch: 24, Global Step: 659, Step Time: 4.5023, Loss: 39.4059
Epoch: 5, Batch: 25, Global Step: 660, Step Time: 4.1057, Loss: 39.5036
Epoch: 5, Batch: 26, Global Step: 661, Step Time: 4.1929, Loss: 39.5988
Epoch: 5, Batch: 27, Global Step: 662, Step Time: 3.3957, Loss: 39.4806
Epoch: 5, Batch: 28, Global Step: 663, Step Time: 3.4259, Loss: 39.3775
Epoch: 5, Batch: 29, Global Step: 664, Step Time: 3.9314, Loss: 39.3645
Epoch: 5, Batch: 30, Global Step: 665, Step Time: 4.2014, Loss: 39.3656
Epoch: 5, Batch: 31, Global Step: 666, Step Time: 3.2602, Loss: 39.2586
Epoch: 5, Batch: 32, Global Step: 667, Step Time: 3.9278, Loss: 39.2247
Epoch: 5, Batch: 33, Global Step: 668, Step Time: 4.0045, Loss: 39.1408
Epoch: 5, Batch: 34, Global Step: 669, Step Time: 4.0399, Loss: 39.1509
Epoch: 5, Batch: 35, Global Step: 670, Step Time: 3.4517, Loss: 39.2948
Epoch: 5, Batch: 36, Global Step: 671, Step Time: 3.7758, Loss: 39.2902
Epoch: 5, Batch: 37, Global Step: 672, Step Time: 4.9892, Loss: 39.3738
Epoch: 5, Batch: 38, Global Step: 673, Step Time: 5.4311, Loss: 39.5912
Epoch: 5, Batch: 39, Global Step: 674, Step Time: 4.6011, Loss: 39.6528
Epoch: 5, Batch: 40, Global Step: 675, Step Time: 2.8838, Loss: 39.6369
Epoch: 5, Batch: 41, Global Step: 676, Step Time: 4.3937, Loss: 39.9939
Epoch: 5, Batch: 42, Global Step: 677, Step Time: 4.1995, Loss: 40.0171
Epoch: 5, Batch: 43, Global Step: 678, Step Time: 5.0622, Loss: 40.1589
Epoch: 5, Batch: 44, Global Step: 679, Step Time: 5.5027, Loss: 40.2354
Epoch: 5, Batch: 45, Global Step: 680, Step Time: 7.1283, Loss: 40.3825
Epoch: 5, Batch: 46, Global Step: 681, Step Time: 4.8539, Loss: 40.2843
Epoch: 5, Batch: 47, Global Step: 682, Step Time: 5.1772, Loss: 40.2848
Epoch: 5, Batch: 48, Global Step: 683, Step Time: 4.2845, Loss: 40.3569
Epoch: 5, Batch: 49, Global Step: 684, Step Time: 3.5046, Loss: 40.2635
Epoch: 5, Batch: 50, Global Step: 685, Step Time: 5.4453, Loss: 40.2376
Epoch: 5, Batch: 51, Global Step: 686, Step Time: 7.0326, Loss: 40.4313
Epoch: 5, Batch: 52, Global Step: 687, Step Time: 5.8734, Loss: 40.5313
Epoch: 5, Batch: 53, Global Step: 688, Step Time: 6.8393, Loss: 40.5272
Epoch: 5, Batch: 54, Global Step: 689, Step Time: 5.8674, Loss: 40.4569
Epoch: 5, Batch: 55, Global Step: 690, Step Time: 3.9595, Loss: 40.4009
Epoch: 5, Batch: 56, Global Step: 691, Step Time: 3.7506, Loss: 40.2801
Epoch: 5, Batch: 57, Global Step: 692, Step Time: 4.0063, Loss: 40.3236
Epoch: 5, Batch: 58, Global Step: 693, Step Time: 3.6675, Loss: 40.2559
Epoch: 5, Batch: 59, Global Step: 694, Step Time: 3.0469, Loss: 40.1900
Epoch: 5, Batch: 60, Global Step: 695, Step Time: 3.8538, Loss: 40.1551
Epoch: 5, Batch: 61, Global Step: 696, Step Time: 4.1357, Loss: 40.1099
Epoch: 5, Batch: 62, Global Step: 697, Step Time: 4.1806, Loss: 40.0296
Epoch: 5, Batch: 63, Global Step: 698, Step Time: 4.4495, Loss: 40.1542
Epoch: 5, Batch: 64, Global Step: 699, Step Time: 7.2466, Loss: 40.2070
Epoch: 5, Batch: 65, Global Step: 700, Step Time: 4.4216, Loss: 40.1340
Epoch: 5, Batch: 66, Global Step: 701, Step Time: 3.1489, Loss: 40.0357
Epoch: 5, Batch: 67, Global Step: 702, Step Time: 4.2800, Loss: 40.0487
Epoch: 5, Batch: 68, Global Step: 703, Step Time: 4.9214, Loss: 40.0333
Epoch: 5, Batch: 69, Global Step: 704, Step Time: 5.6402, Loss: 39.9854
Epoch: 5, Batch: 70, Global Step: 705, Step Time: 8.1849, Loss: 40.0797
Epoch: 5, Batch: 71, Global Step: 706, Step Time: 3.7536, Loss: 40.0401
Epoch: 5, Batch: 72, Global Step: 707, Step Time: 3.8384, Loss: 39.9770
Epoch: 5, Batch: 73, Global Step: 708, Step Time: 3.7289, Loss: 39.9421
Epoch: 5, Batch: 74, Global Step: 709, Step Time: 4.4986, Loss: 39.9112
Epoch: 5, Batch: 75, Global Step: 710, Step Time: 5.1233, Loss: 39.9209
Epoch: 5, Batch: 76, Global Step: 711, Step Time: 6.0340, Loss: 39.9532
Epoch: 5, Batch: 77, Global Step: 712, Step Time: 5.5593, Loss: 39.9827
Epoch: 5, Batch: 78, Global Step: 713, Step Time: 4.8978, Loss: 39.9183
Epoch: 5, Batch: 79, Global Step: 714, Step Time: 8.2508, Loss: 40.0369
Epoch: 5, Batch: 80, Global Step: 715, Step Time: 5.2629, Loss: 40.0155
Epoch: 5, Batch: 81, Global Step: 716, Step Time: 4.2960, Loss: 40.0172
Epoch: 5, Batch: 82, Global Step: 717, Step Time: 5.6771, Loss: 40.0642
Epoch: 5, Batch: 83, Global Step: 718, Step Time: 4.5326, Loss: 40.0645
Epoch: 5, Batch: 84, Global Step: 719, Step Time: 7.6526, Loss: 40.2982
Epoch: 5, Batch: 85, Global Step: 720, Step Time: 4.2710, Loss: 40.2691
Epoch: 5, Batch: 86, Global Step: 721, Step Time: 5.9533, Loss: 40.4123
Epoch: 5, Batch: 87, Global Step: 722, Step Time: 5.4142, Loss: 40.3540
Epoch: 5, Batch: 88, Global Step: 723, Step Time: 5.5949, Loss: 40.3335
Epoch: 5, Batch: 89, Global Step: 724, Step Time: 5.6939, Loss: 40.3861
Epoch: 5, Batch: 90, Global Step: 725, Step Time: 4.0418, Loss: 40.3968
Epoch: 5, Batch: 91, Global Step: 726, Step Time: 3.7593, Loss: 40.3497
Epoch: 5, Batch: 92, Global Step: 727, Step Time: 5.4732, Loss: 40.3921
Epoch: 5, Batch: 93, Global Step: 728, Step Time: 4.6951, Loss: 40.4312
Epoch: 5, Batch: 94, Global Step: 729, Step Time: 3.3222, Loss: 40.3954
Epoch: 5, Batch: 95, Global Step: 730, Step Time: 4.1388, Loss: 40.3725
Epoch: 5, Batch: 96, Global Step: 731, Step Time: 3.7842, Loss: 40.3665
Epoch: 5, Batch: 97, Global Step: 732, Step Time: 5.6442, Loss: 40.4296
Epoch: 5, Batch: 98, Global Step: 733, Step Time: 3.3683, Loss: 40.3583
Epoch: 5, Batch: 99, Global Step: 734, Step Time: 6.1097, Loss: 40.3683
Epoch: 5, Batch: 100, Global Step: 735, Step Time: 6.2673, Loss: 40.3546
Epoch: 5, Batch: 101, Global Step: 736, Step Time: 3.9651, Loss: 40.3403
Epoch: 5, Batch: 102, Global Step: 737, Step Time: 6.5327, Loss: 40.3395
Epoch: 5, Batch: 103, Global Step: 738, Step Time: 3.8314, Loss: 40.2771
Epoch: 5, Batch: 104, Global Step: 739, Step Time: 3.3282, Loss: 40.2896
Epoch: 5, Batch: 105, Global Step: 740, Step Time: 6.0329, Loss: 40.2741
Epoch: 5, Batch: 106, Global Step: 741, Step Time: 6.2464, Loss: 40.2804
Epoch: 5, Batch: 107, Global Step: 742, Step Time: 4.4459, Loss: 40.3001
Epoch: 5, Batch: 108, Global Step: 743, Step Time: 4.7050, Loss: 40.2708
Epoch: 5, Batch: 109, Global Step: 744, Step Time: 4.7313, Loss: 40.2823
Epoch: 5, Batch: 110, Global Step: 745, Step Time: 3.1054, Loss: 40.2592
Epoch: 5, Batch: 111, Global Step: 746, Step Time: 4.4474, Loss: 40.2539
Epoch: 5, Batch: 112, Global Step: 747, Step Time: 3.6131, Loss: 40.2260
Epoch: 5, Batch: 113, Global Step: 748, Step Time: 6.2172, Loss: 40.2578
Epoch: 5, Batch: 114, Global Step: 749, Step Time: 6.2961, Loss: 40.3190
Epoch: 5, Batch: 115, Global Step: 750, Step Time: 3.6992, Loss: 40.2780
Epoch: 5, Batch: 116, Global Step: 751, Step Time: 3.5153, Loss: 40.2635
Epoch: 5, Batch: 117, Global Step: 752, Step Time: 3.7803, Loss: 40.2346
Epoch: 5, Batch: 118, Global Step: 753, Step Time: 5.0115, Loss: 40.2379
Epoch: 5, Batch: 119, Global Step: 754, Step Time: 5.9995, Loss: 40.2496
Epoch: 5, Batch: 120, Global Step: 755, Step Time: 4.1478, Loss: 40.2506
Epoch: 5, Batch: 121, Global Step: 756, Step Time: 4.4858, Loss: 40.1848
Epoch: 5, Batch: 122, Global Step: 757, Step Time: 5.1883, Loss: 40.1909
Epoch: 5, Batch: 123, Global Step: 758, Step Time: 6.1060, Loss: 40.3435
Epoch: 5, Batch: 124, Global Step: 759, Step Time: 4.5187, Loss: 40.3098
Epoch: 5, Batch: 125, Global Step: 760, Step Time: 3.9797, Loss: 40.3134
Epoch: 5, Batch: 126, Global Step: 761, Step Time: 3.7327, Loss: 40.2418
EPOCH RESULTS: Loss: 40.2418
Epoch: 6, Batch: 0, Global Step: 762, Step Time: 4.6307, Loss: 39.3392
Epoch: 6, Batch: 1, Global Step: 763, Step Time: 3.2270, Loss: 37.8582
Epoch: 6, Batch: 2, Global Step: 764, Step Time: 3.3736, Loss: 35.9733
Epoch: 6, Batch: 3, Global Step: 765, Step Time: 3.3524, Loss: 35.9034
Epoch: 6, Batch: 4, Global Step: 766, Step Time: 3.4545, Loss: 35.8268
Epoch: 6, Batch: 5, Global Step: 767, Step Time: 4.7198, Loss: 36.2399
Epoch: 6, Batch: 6, Global Step: 768, Step Time: 3.4225, Loss: 35.7114
Epoch: 6, Batch: 7, Global Step: 769, Step Time: 3.1639, Loss: 35.8829
Epoch: 6, Batch: 8, Global Step: 770, Step Time: 4.2266, Loss: 36.1110
Epoch: 6, Batch: 9, Global Step: 771, Step Time: 4.0879, Loss: 36.7366
Epoch: 6, Batch: 10, Global Step: 772, Step Time: 4.7262, Loss: 37.5733
Epoch: 6, Batch: 11, Global Step: 773, Step Time: 4.9599, Loss: 38.0735
Epoch: 6, Batch: 12, Global Step: 774, Step Time: 5.6642, Loss: 38.0280
Epoch: 6, Batch: 13, Global Step: 775, Step Time: 3.6583, Loss: 37.9834
Epoch: 6, Batch: 14, Global Step: 776, Step Time: 4.4548, Loss: 37.9755
Epoch: 6, Batch: 15, Global Step: 777, Step Time: 5.0444, Loss: 38.2516
Epoch: 6, Batch: 16, Global Step: 778, Step Time: 5.5628, Loss: 38.2961
Epoch: 6, Batch: 17, Global Step: 779, Step Time: 5.3939, Loss: 38.6761
Epoch: 6, Batch: 18, Global Step: 780, Step Time: 4.5743, Loss: 38.4800
Epoch: 6, Batch: 19, Global Step: 781, Step Time: 4.4356, Loss: 38.7411
Epoch: 6, Batch: 20, Global Step: 782, Step Time: 6.9495, Loss: 38.5963
Epoch: 6, Batch: 21, Global Step: 783, Step Time: 4.7571, Loss: 38.4648
Epoch: 6, Batch: 22, Global Step: 784, Step Time: 5.8883, Loss: 38.4694
Epoch: 6, Batch: 23, Global Step: 785, Step Time: 3.9652, Loss: 38.1863
Epoch: 6, Batch: 24, Global Step: 786, Step Time: 4.5282, Loss: 39.1640
Epoch: 6, Batch: 25, Global Step: 787, Step Time: 4.0308, Loss: 39.2610
Epoch: 6, Batch: 26, Global Step: 788, Step Time: 4.2082, Loss: 39.3569
Epoch: 6, Batch: 27, Global Step: 789, Step Time: 3.3849, Loss: 39.2347
Epoch: 6, Batch: 28, Global Step: 790, Step Time: 3.5125, Loss: 39.1321
Epoch: 6, Batch: 29, Global Step: 791, Step Time: 3.8556, Loss: 39.1178
Epoch: 6, Batch: 30, Global Step: 792, Step Time: 4.2755, Loss: 39.1186
Epoch: 6, Batch: 31, Global Step: 793, Step Time: 3.2197, Loss: 39.0122
Epoch: 6, Batch: 32, Global Step: 794, Step Time: 3.9674, Loss: 38.9775
Epoch: 6, Batch: 33, Global Step: 795, Step Time: 3.9557, Loss: 38.8931
Epoch: 6, Batch: 34, Global Step: 796, Step Time: 4.0820, Loss: 38.8987
Epoch: 6, Batch: 35, Global Step: 797, Step Time: 3.4377, Loss: 39.0417
Epoch: 6, Batch: 36, Global Step: 798, Step Time: 3.8916, Loss: 39.0379
Epoch: 6, Batch: 37, Global Step: 799, Step Time: 4.9556, Loss: 39.1232
Epoch: 6, Batch: 38, Global Step: 800, Step Time: 5.4447, Loss: 39.3350
Epoch: 6, Batch: 39, Global Step: 801, Step Time: 4.4940, Loss: 39.3946
Epoch: 6, Batch: 40, Global Step: 802, Step Time: 2.8465, Loss: 39.3793
Epoch: 6, Batch: 41, Global Step: 803, Step Time: 4.3733, Loss: 39.7279
Epoch: 6, Batch: 42, Global Step: 804, Step Time: 4.2816, Loss: 39.7508
Epoch: 6, Batch: 43, Global Step: 805, Step Time: 5.2374, Loss: 39.8916
Epoch: 6, Batch: 44, Global Step: 806, Step Time: 5.7693, Loss: 39.9683
Epoch: 6, Batch: 45, Global Step: 807, Step Time: 7.0167, Loss: 40.1148
Epoch: 6, Batch: 46, Global Step: 808, Step Time: 4.9269, Loss: 40.0167
Epoch: 6, Batch: 47, Global Step: 809, Step Time: 5.4036, Loss: 40.0181
Epoch: 6, Batch: 48, Global Step: 810, Step Time: 4.1781, Loss: 40.0933
Epoch: 6, Batch: 49, Global Step: 811, Step Time: 3.5316, Loss: 39.9968
Epoch: 6, Batch: 50, Global Step: 812, Step Time: 5.6251, Loss: 39.9710
Epoch: 6, Batch: 51, Global Step: 813, Step Time: 7.2826, Loss: 40.1684
Epoch: 6, Batch: 52, Global Step: 814, Step Time: 6.0910, Loss: 40.2703
Epoch: 6, Batch: 53, Global Step: 815, Step Time: 6.7296, Loss: 40.2658
Epoch: 6, Batch: 54, Global Step: 816, Step Time: 5.6884, Loss: 40.1972
Epoch: 6, Batch: 55, Global Step: 817, Step Time: 4.0481, Loss: 40.1397
Epoch: 6, Batch: 56, Global Step: 818, Step Time: 3.8101, Loss: 40.0237
Epoch: 6, Batch: 57, Global Step: 819, Step Time: 3.9490, Loss: 40.0665
Epoch: 6, Batch: 58, Global Step: 820, Step Time: 3.5981, Loss: 40.0011
Epoch: 6, Batch: 59, Global Step: 821, Step Time: 3.0275, Loss: 39.9376
Epoch: 6, Batch: 60, Global Step: 822, Step Time: 3.7190, Loss: 39.9053
Epoch: 6, Batch: 61, Global Step: 823, Step Time: 3.9669, Loss: 39.8620
Epoch: 6, Batch: 62, Global Step: 824, Step Time: 4.2459, Loss: 39.7827
Epoch: 6, Batch: 63, Global Step: 825, Step Time: 4.5952, Loss: 39.9074
Epoch: 6, Batch: 64, Global Step: 826, Step Time: 7.1378, Loss: 39.9609
Epoch: 6, Batch: 65, Global Step: 827, Step Time: 4.2754, Loss: 39.8927
Epoch: 6, Batch: 66, Global Step: 828, Step Time: 3.0387, Loss: 39.7975
Epoch: 6, Batch: 67, Global Step: 829, Step Time: 4.3526, Loss: 39.8103
Epoch: 6, Batch: 68, Global Step: 830, Step Time: 4.8026, Loss: 39.7960
Epoch: 6, Batch: 69, Global Step: 831, Step Time: 5.4714, Loss: 39.7469
Epoch: 6, Batch: 70, Global Step: 832, Step Time: 7.8709, Loss: 39.8433
Epoch: 6, Batch: 71, Global Step: 833, Step Time: 3.6094, Loss: 39.8038
Epoch: 6, Batch: 72, Global Step: 834, Step Time: 3.7001, Loss: 39.7415
Epoch: 6, Batch: 73, Global Step: 835, Step Time: 3.6160, Loss: 39.7070
Epoch: 6, Batch: 74, Global Step: 836, Step Time: 4.2850, Loss: 39.6771
Epoch: 6, Batch: 75, Global Step: 837, Step Time: 5.2544, Loss: 39.6859
Epoch: 6, Batch: 76, Global Step: 838, Step Time: 5.9051, Loss: 39.7187
Epoch: 6, Batch: 77, Global Step: 839, Step Time: 5.3784, Loss: 39.7491
Epoch: 6, Batch: 78, Global Step: 840, Step Time: 4.6961, Loss: 39.6856
Epoch: 6, Batch: 79, Global Step: 841, Step Time: 7.9783, Loss: 39.8032
Epoch: 6, Batch: 80, Global Step: 842, Step Time: 5.0728, Loss: 39.7807
Epoch: 6, Batch: 81, Global Step: 843, Step Time: 4.4107, Loss: 39.7822
Epoch: 6, Batch: 82, Global Step: 844, Step Time: 5.8079, Loss: 39.8283
Epoch: 6, Batch: 83, Global Step: 845, Step Time: 4.6378, Loss: 39.8289
Epoch: 6, Batch: 84, Global Step: 846, Step Time: 7.5235, Loss: 40.0575
Epoch: 6, Batch: 85, Global Step: 847, Step Time: 4.1219, Loss: 40.0306
Epoch: 6, Batch: 86, Global Step: 848, Step Time: 5.7813, Loss: 40.1726
Epoch: 6, Batch: 87, Global Step: 849, Step Time: 5.3706, Loss: 40.1150
Epoch: 6, Batch: 88, Global Step: 850, Step Time: 5.4283, Loss: 40.0971
Epoch: 6, Batch: 89, Global Step: 851, Step Time: 5.5974, Loss: 40.1497
Epoch: 6, Batch: 90, Global Step: 852, Step Time: 3.9205, Loss: 40.1618
Epoch: 6, Batch: 91, Global Step: 853, Step Time: 3.8916, Loss: 40.1164
Epoch: 6, Batch: 92, Global Step: 854, Step Time: 5.3771, Loss: 40.1597
Epoch: 6, Batch: 93, Global Step: 855, Step Time: 4.5575, Loss: 40.2007
Epoch: 6, Batch: 94, Global Step: 856, Step Time: 3.2471, Loss: 40.1645
Epoch: 6, Batch: 95, Global Step: 857, Step Time: 4.1536, Loss: 40.1419
Epoch: 6, Batch: 96, Global Step: 858, Step Time: 3.9110, Loss: 40.1369
Epoch: 6, Batch: 97, Global Step: 859, Step Time: 5.5408, Loss: 40.2013
Epoch: 6, Batch: 98, Global Step: 860, Step Time: 3.4201, Loss: 40.1310
Epoch: 6, Batch: 99, Global Step: 861, Step Time: 6.0238, Loss: 40.1419
Epoch: 6, Batch: 100, Global Step: 862, Step Time: 6.1508, Loss: 40.1288
Epoch: 6, Batch: 101, Global Step: 863, Step Time: 4.0266, Loss: 40.1149
Epoch: 6, Batch: 102, Global Step: 864, Step Time: 6.7469, Loss: 40.1148
Epoch: 6, Batch: 103, Global Step: 865, Step Time: 3.7517, Loss: 40.0534
Epoch: 6, Batch: 104, Global Step: 866, Step Time: 3.4029, Loss: 40.0665
Epoch: 6, Batch: 105, Global Step: 867, Step Time: 6.1968, Loss: 40.0522
Epoch: 6, Batch: 106, Global Step: 868, Step Time: 6.4600, Loss: 40.0594
Epoch: 6, Batch: 107, Global Step: 869, Step Time: 4.5990, Loss: 40.0795
Epoch: 6, Batch: 108, Global Step: 870, Step Time: 4.8470, Loss: 40.0515
Epoch: 6, Batch: 109, Global Step: 871, Step Time: 4.9509, Loss: 40.0622
Epoch: 6, Batch: 110, Global Step: 872, Step Time: 3.1104, Loss: 40.0392
Epoch: 6, Batch: 111, Global Step: 873, Step Time: 4.4001, Loss: 40.0344
Epoch: 6, Batch: 112, Global Step: 874, Step Time: 3.5457, Loss: 40.0071
Epoch: 6, Batch: 113, Global Step: 875, Step Time: 6.0096, Loss: 40.0406
Epoch: 6, Batch: 114, Global Step: 876, Step Time: 6.0970, Loss: 40.1031
Epoch: 6, Batch: 115, Global Step: 877, Step Time: 3.6439, Loss: 40.0615
Epoch: 6, Batch: 116, Global Step: 878, Step Time: 3.4823, Loss: 40.0468
Epoch: 6, Batch: 117, Global Step: 879, Step Time: 3.8070, Loss: 40.0187
Epoch: 6, Batch: 118, Global Step: 880, Step Time: 5.1396, Loss: 40.0220
Epoch: 6, Batch: 119, Global Step: 881, Step Time: 6.2310, Loss: 40.0333
Epoch: 6, Batch: 120, Global Step: 882, Step Time: 4.0266, Loss: 40.0346
Epoch: 6, Batch: 121, Global Step: 883, Step Time: 4.5244, Loss: 39.9690
Epoch: 6, Batch: 122, Global Step: 884, Step Time: 5.3778, Loss: 39.9753
Epoch: 6, Batch: 123, Global Step: 885, Step Time: 6.2243, Loss: 40.1272
Epoch: 6, Batch: 124, Global Step: 886, Step Time: 4.6396, Loss: 40.0933
Epoch: 6, Batch: 125, Global Step: 887, Step Time: 4.0557, Loss: 40.0959
Epoch: 6, Batch: 126, Global Step: 888, Step Time: 3.8169, Loss: 40.0239
EPOCH RESULTS: Loss: 40.0239
Epoch: 7, Batch: 0, Global Step: 889, Step Time: 4.5063, Loss: 39.1382
Epoch: 7, Batch: 1, Global Step: 890, Step Time: 3.3300, Loss: 37.6319
Epoch: 7, Batch: 2, Global Step: 891, Step Time: 3.4642, Loss: 35.7552
Epoch: 7, Batch: 3, Global Step: 892, Step Time: 3.4936, Loss: 35.6730
Epoch: 7, Batch: 4, Global Step: 893, Step Time: 3.4120, Loss: 35.5839
Epoch: 7, Batch: 5, Global Step: 894, Step Time: 4.8580, Loss: 35.9999
Epoch: 7, Batch: 6, Global Step: 895, Step Time: 3.3466, Loss: 35.4963
Epoch: 7, Batch: 7, Global Step: 896, Step Time: 3.1047, Loss: 35.6825
Epoch: 7, Batch: 8, Global Step: 897, Step Time: 4.0515, Loss: 35.9189
Epoch: 7, Batch: 9, Global Step: 898, Step Time: 4.0264, Loss: 36.5420
Epoch: 7, Batch: 10, Global Step: 899, Step Time: 4.6669, Loss: 37.3808
Epoch: 7, Batch: 11, Global Step: 900, Step Time: 4.8019, Loss: 37.8814
Epoch: 7, Batch: 12, Global Step: 901, Step Time: 5.7870, Loss: 37.8489
Epoch: 7, Batch: 13, Global Step: 902, Step Time: 3.7514, Loss: 37.7955
Epoch: 7, Batch: 14, Global Step: 903, Step Time: 4.3674, Loss: 37.7843
Epoch: 7, Batch: 15, Global Step: 904, Step Time: 4.8865, Loss: 38.0607
Epoch: 7, Batch: 16, Global Step: 905, Step Time: 5.3246, Loss: 38.0983
Epoch: 7, Batch: 17, Global Step: 906, Step Time: 5.2444, Loss: 38.4759
Epoch: 7, Batch: 18, Global Step: 907, Step Time: 4.4179, Loss: 38.2775
Epoch: 7, Batch: 19, Global Step: 908, Step Time: 4.4847, Loss: 38.5358
Epoch: 7, Batch: 20, Global Step: 909, Step Time: 7.1356, Loss: 38.3879
Epoch: 7, Batch: 21, Global Step: 910, Step Time: 4.6293, Loss: 38.2592
Epoch: 7, Batch: 22, Global Step: 911, Step Time: 5.6441, Loss: 38.2643
Epoch: 7, Batch: 23, Global Step: 912, Step Time: 3.9730, Loss: 37.9830
Epoch: 7, Batch: 24, Global Step: 913, Step Time: 4.4684, Loss: 38.9331
Epoch: 7, Batch: 25, Global Step: 914, Step Time: 3.9873, Loss: 39.0306
Epoch: 7, Batch: 26, Global Step: 915, Step Time: 4.0889, Loss: 39.1264
Epoch: 7, Batch: 27, Global Step: 916, Step Time: 3.3779, Loss: 39.0035
Epoch: 7, Batch: 28, Global Step: 917, Step Time: 3.4305, Loss: 38.9044
Epoch: 7, Batch: 29, Global Step: 918, Step Time: 3.9619, Loss: 38.8906
Epoch: 7, Batch: 30, Global Step: 919, Step Time: 4.1982, Loss: 38.8841
Epoch: 7, Batch: 31, Global Step: 920, Step Time: 3.2336, Loss: 38.7777
Epoch: 7, Batch: 32, Global Step: 921, Step Time: 3.9079, Loss: 38.7443
Epoch: 7, Batch: 33, Global Step: 922, Step Time: 4.0012, Loss: 38.6617
Epoch: 7, Batch: 34, Global Step: 923, Step Time: 4.0557, Loss: 38.6647
Epoch: 7, Batch: 35, Global Step: 924, Step Time: 3.5094, Loss: 38.8042
Epoch: 7, Batch: 36, Global Step: 925, Step Time: 3.7928, Loss: 38.7980
Epoch: 7, Batch: 37, Global Step: 926, Step Time: 5.0427, Loss: 38.8823
Epoch: 7, Batch: 38, Global Step: 927, Step Time: 5.3824, Loss: 39.0913
Epoch: 7, Batch: 39, Global Step: 928, Step Time: 4.4900, Loss: 39.1505
Epoch: 7, Batch: 40, Global Step: 929, Step Time: 2.8222, Loss: 39.1339
Epoch: 7, Batch: 41, Global Step: 930, Step Time: 4.3316, Loss: 39.4730
Epoch: 7, Batch: 42, Global Step: 931, Step Time: 4.2123, Loss: 39.4951
Epoch: 7, Batch: 43, Global Step: 932, Step Time: 5.0715, Loss: 39.6349
Epoch: 7, Batch: 44, Global Step: 933, Step Time: 5.5915, Loss: 39.7128
Epoch: 7, Batch: 45, Global Step: 934, Step Time: 7.1045, Loss: 39.8572
Epoch: 7, Batch: 46, Global Step: 935, Step Time: 4.8695, Loss: 39.7590
Epoch: 7, Batch: 47, Global Step: 936, Step Time: 5.2516, Loss: 39.7595
Epoch: 7, Batch: 48, Global Step: 937, Step Time: 4.2272, Loss: 39.8302
Epoch: 7, Batch: 49, Global Step: 938, Step Time: 3.4751, Loss: 39.7362
Epoch: 7, Batch: 50, Global Step: 939, Step Time: 5.4541, Loss: 39.7095
Epoch: 7, Batch: 51, Global Step: 940, Step Time: 7.0751, Loss: 39.9013
Epoch: 7, Batch: 52, Global Step: 941, Step Time: 5.8622, Loss: 40.0035
Epoch: 7, Batch: 53, Global Step: 942, Step Time: 6.8041, Loss: 39.9972
Epoch: 7, Batch: 54, Global Step: 943, Step Time: 5.7998, Loss: 39.9282
Epoch: 7, Batch: 55, Global Step: 944, Step Time: 3.9705, Loss: 39.8683
Epoch: 7, Batch: 56, Global Step: 945, Step Time: 3.7410, Loss: 39.7504
Epoch: 7, Batch: 57, Global Step: 946, Step Time: 4.0100, Loss: 39.7953
Epoch: 7, Batch: 58, Global Step: 947, Step Time: 3.6288, Loss: 39.7337
Epoch: 7, Batch: 59, Global Step: 948, Step Time: 3.0062, Loss: 39.6703
Epoch: 7, Batch: 60, Global Step: 949, Step Time: 3.7494, Loss: 39.6381
Epoch: 7, Batch: 61, Global Step: 950, Step Time: 4.0067, Loss: 39.5972
Epoch: 7, Batch: 62, Global Step: 951, Step Time: 4.1400, Loss: 39.5196
Epoch: 7, Batch: 63, Global Step: 952, Step Time: 4.3938, Loss: 39.6452
Epoch: 7, Batch: 64, Global Step: 953, Step Time: 7.1789, Loss: 39.7011
Epoch: 7, Batch: 65, Global Step: 954, Step Time: 4.2732, Loss: 39.6325
Epoch: 7, Batch: 66, Global Step: 955, Step Time: 3.0339, Loss: 39.5384
Epoch: 7, Batch: 67, Global Step: 956, Step Time: 4.1837, Loss: 39.5526
Epoch: 7, Batch: 68, Global Step: 957, Step Time: 4.8117, Loss: 39.5397
Epoch: 7, Batch: 69, Global Step: 958, Step Time: 5.5191, Loss: 39.4910
Epoch: 7, Batch: 70, Global Step: 959, Step Time: 7.9773, Loss: 39.5890
Epoch: 7, Batch: 71, Global Step: 960, Step Time: 3.6549, Loss: 39.5488
Epoch: 7, Batch: 72, Global Step: 961, Step Time: 3.7362, Loss: 39.4879
Epoch: 7, Batch: 73, Global Step: 962, Step Time: 3.6559, Loss: 39.4542
Epoch: 7, Batch: 74, Global Step: 963, Step Time: 4.4149, Loss: 39.4259
Epoch: 7, Batch: 75, Global Step: 964, Step Time: 5.1139, Loss: 39.4342
Epoch: 7, Batch: 76, Global Step: 965, Step Time: 5.9875, Loss: 39.4684
Epoch: 7, Batch: 77, Global Step: 966, Step Time: 5.5364, Loss: 39.4990
Epoch: 7, Batch: 78, Global Step: 967, Step Time: 4.8281, Loss: 39.4355
  1173. Epoch: 7, Batch: 79, Global Step: 968, Step Time: 8.1809, Loss: 39.5527
  1174. Epoch: 7, Batch: 80, Global Step: 969, Step Time: 5.2648, Loss: 39.5297
  1175. Epoch: 7, Batch: 81, Global Step: 970, Step Time: 4.3387, Loss: 39.5306
  1176. Epoch: 7, Batch: 82, Global Step: 971, Step Time: 5.6359, Loss: 39.5757
  1177. Epoch: 7, Batch: 83, Global Step: 972, Step Time: 4.5119, Loss: 39.5763
  1178. Epoch: 7, Batch: 84, Global Step: 973, Step Time: 7.6130, Loss: 39.8003
  1179. Epoch: 7, Batch: 85, Global Step: 974, Step Time: 4.2537, Loss: 39.7753
  1180. Epoch: 7, Batch: 86, Global Step: 975, Step Time: 5.9289, Loss: 39.9167
  1181. Epoch: 7, Batch: 87, Global Step: 976, Step Time: 5.4182, Loss: 39.8578
  1182. Epoch: 7, Batch: 88, Global Step: 977, Step Time: 5.5786, Loss: 39.8405
  1183. Epoch: 7, Batch: 89, Global Step: 978, Step Time: 5.7307, Loss: 39.8935
  1184. Epoch: 7, Batch: 90, Global Step: 979, Step Time: 3.9623, Loss: 39.9056
  1185. Epoch: 7, Batch: 91, Global Step: 980, Step Time: 3.7342, Loss: 39.8609
  1186. Epoch: 7, Batch: 92, Global Step: 981, Step Time: 5.4361, Loss: 39.9039
  1187. Epoch: 7, Batch: 93, Global Step: 982, Step Time: 4.6905, Loss: 39.9452
  1188. Epoch: 7, Batch: 94, Global Step: 983, Step Time: 3.3364, Loss: 39.9097
  1189. Epoch: 7, Batch: 95, Global Step: 984, Step Time: 4.0578, Loss: 39.8866
  1190. Epoch: 7, Batch: 96, Global Step: 985, Step Time: 3.7557, Loss: 39.8826
  1191. Epoch: 7, Batch: 97, Global Step: 986, Step Time: 5.5269, Loss: 39.9470
  1192. Epoch: 7, Batch: 98, Global Step: 987, Step Time: 3.3547, Loss: 39.8768
  1193. Epoch: 7, Batch: 99, Global Step: 988, Step Time: 6.1087, Loss: 39.8884
  1194. Epoch: 7, Batch: 100, Global Step: 989, Step Time: 6.2433, Loss: 39.8757
  1195. Epoch: 7, Batch: 101, Global Step: 990, Step Time: 3.9604, Loss: 39.8622
  1196. Epoch: 7, Batch: 102, Global Step: 991, Step Time: 6.5689, Loss: 39.8613
  1197. Epoch: 7, Batch: 103, Global Step: 992, Step Time: 3.7641, Loss: 39.8009
  1198. Epoch: 7, Batch: 104, Global Step: 993, Step Time: 3.2945, Loss: 39.8150
  1199. Epoch: 7, Batch: 105, Global Step: 994, Step Time: 5.9657, Loss: 39.8024
  1200. Epoch: 7, Batch: 106, Global Step: 995, Step Time: 6.2516, Loss: 39.8108
  1201. Epoch: 7, Batch: 107, Global Step: 996, Step Time: 4.4007, Loss: 39.8318
  1202. Epoch: 7, Batch: 108, Global Step: 997, Step Time: 4.6625, Loss: 39.8027
  1203. Epoch: 7, Batch: 109, Global Step: 998, Step Time: 4.7998, Loss: 39.8126
  1204. Epoch: 7, Batch: 110, Global Step: 999, Step Time: 3.0784, Loss: 39.7917
  1205. Performing evaluation.
  1206. VALIDATION RESULTS: Time: 105.8352, Loss: 73.7802, Accuracy: 0.5767, WER: 1.0000
  1207. Saving checkpoint ./model/checkpoint_1000_73.7802.hdf5
  1208. Epoch: 7, Batch: 111, Global Step: 1000, Step Time: 4.4647, Loss: 39.7883
  1209. Epoch: 7, Batch: 112, Global Step: 1001, Step Time: 3.5597, Loss: 39.7620
  1210. Epoch: 7, Batch: 113, Global Step: 1002, Step Time: 6.0396, Loss: 39.7958
  1211. Epoch: 7, Batch: 114, Global Step: 1003, Step Time: 6.0586, Loss: 39.8585
  1212. Epoch: 7, Batch: 115, Global Step: 1004, Step Time: 3.6106, Loss: 39.8181
  1213. Epoch: 7, Batch: 116, Global Step: 1005, Step Time: 3.4242, Loss: 39.8051
  1214. Epoch: 7, Batch: 117, Global Step: 1006, Step Time: 3.8342, Loss: 39.7775
  1215. Epoch: 7, Batch: 118, Global Step: 1007, Step Time: 5.1918, Loss: 39.7803
  1216. Epoch: 7, Batch: 119, Global Step: 1008, Step Time: 6.2207, Loss: 39.7918
  1217. Epoch: 7, Batch: 120, Global Step: 1009, Step Time: 4.0539, Loss: 39.7938
  1218. Epoch: 7, Batch: 121, Global Step: 1010, Step Time: 4.5231, Loss: 39.7284
  1219. Epoch: 7, Batch: 122, Global Step: 1011, Step Time: 5.3932, Loss: 39.7346
  1220. Epoch: 7, Batch: 123, Global Step: 1012, Step Time: 6.3059, Loss: 39.8830
  1221. Epoch: 7, Batch: 124, Global Step: 1013, Step Time: 4.6802, Loss: 39.8496
  1222. Epoch: 7, Batch: 125, Global Step: 1014, Step Time: 4.0830, Loss: 39.8519
  1223. Epoch: 7, Batch: 126, Global Step: 1015, Step Time: 3.8679, Loss: 39.7805
  1224. EPOCH RESULTS: Loss: 39.7805
Epoch: 8, Batch: 0, Global Step: 1016, Step Time: 4.5218, Loss: 38.6200
Epoch: 8, Batch: 1, Global Step: 1017, Step Time: 3.3437, Loss: 37.2526
Epoch: 8, Batch: 2, Global Step: 1018, Step Time: 3.4732, Loss: 35.4412
Epoch: 8, Batch: 3, Global Step: 1019, Step Time: 3.4844, Loss: 35.4654
Epoch: 8, Batch: 4, Global Step: 1020, Step Time: 3.4390, Loss: 35.4352
Epoch: 8, Batch: 5, Global Step: 1021, Step Time: 4.8376, Loss: 35.8373
Epoch: 8, Batch: 6, Global Step: 1022, Step Time: 3.3810, Loss: 35.3473
Epoch: 8, Batch: 7, Global Step: 1023, Step Time: 3.1180, Loss: 35.5463
Epoch: 8, Batch: 8, Global Step: 1024, Step Time: 4.0861, Loss: 35.7804
Epoch: 8, Batch: 9, Global Step: 1025, Step Time: 4.0263, Loss: 36.3979
Epoch: 8, Batch: 10, Global Step: 1026, Step Time: 4.6707, Loss: 37.2465
Epoch: 8, Batch: 11, Global Step: 1027, Step Time: 4.8361, Loss: 37.7461
Epoch: 8, Batch: 12, Global Step: 1028, Step Time: 5.7642, Loss: 37.7088
Epoch: 8, Batch: 13, Global Step: 1029, Step Time: 3.7970, Loss: 37.6557
Epoch: 8, Batch: 14, Global Step: 1030, Step Time: 4.3874, Loss: 37.6471
Epoch: 8, Batch: 15, Global Step: 1031, Step Time: 4.8734, Loss: 37.9317
Epoch: 8, Batch: 16, Global Step: 1032, Step Time: 5.3806, Loss: 37.9614
Epoch: 8, Batch: 17, Global Step: 1033, Step Time: 5.2250, Loss: 38.3270
Epoch: 8, Batch: 18, Global Step: 1034, Step Time: 4.4684, Loss: 38.1270
Epoch: 8, Batch: 19, Global Step: 1035, Step Time: 4.5199, Loss: 38.3789
Epoch: 8, Batch: 20, Global Step: 1036, Step Time: 7.0949, Loss: 38.2310
Epoch: 8, Batch: 21, Global Step: 1037, Step Time: 4.6516, Loss: 38.0992
Epoch: 8, Batch: 22, Global Step: 1038, Step Time: 5.6728, Loss: 38.0984
Epoch: 8, Batch: 23, Global Step: 1039, Step Time: 3.9761, Loss: 37.8197
Epoch: 8, Batch: 24, Global Step: 1040, Step Time: 4.5097, Loss: 38.7450
Epoch: 8, Batch: 25, Global Step: 1041, Step Time: 4.0609, Loss: 38.8405
Epoch: 8, Batch: 26, Global Step: 1042, Step Time: 4.1558, Loss: 38.9346
Epoch: 8, Batch: 27, Global Step: 1043, Step Time: 3.4150, Loss: 38.8085
Epoch: 8, Batch: 28, Global Step: 1044, Step Time: 3.4330, Loss: 38.7098
Epoch: 8, Batch: 29, Global Step: 1045, Step Time: 3.9712, Loss: 38.6934
Epoch: 8, Batch: 30, Global Step: 1046, Step Time: 4.2153, Loss: 38.6815
Epoch: 8, Batch: 31, Global Step: 1047, Step Time: 3.2925, Loss: 38.5767
Epoch: 8, Batch: 32, Global Step: 1048, Step Time: 3.9251, Loss: 38.5439
Epoch: 8, Batch: 33, Global Step: 1049, Step Time: 4.0543, Loss: 38.4624
Epoch: 8, Batch: 34, Global Step: 1050, Step Time: 4.0505, Loss: 38.4635
Epoch: 8, Batch: 35, Global Step: 1051, Step Time: 3.5023, Loss: 38.6008
Epoch: 8, Batch: 36, Global Step: 1052, Step Time: 3.8059, Loss: 38.5933
Epoch: 8, Batch: 37, Global Step: 1053, Step Time: 5.0269, Loss: 38.6742
Epoch: 8, Batch: 38, Global Step: 1054, Step Time: 5.4318, Loss: 38.8777
Epoch: 8, Batch: 39, Global Step: 1055, Step Time: 4.5343, Loss: 38.9369
Epoch: 8, Batch: 40, Global Step: 1056, Step Time: 2.8797, Loss: 38.9182
Epoch: 8, Batch: 41, Global Step: 1057, Step Time: 4.3130, Loss: 39.2478
Epoch: 8, Batch: 42, Global Step: 1058, Step Time: 4.2431, Loss: 39.2698
Epoch: 8, Batch: 43, Global Step: 1059, Step Time: 5.1166, Loss: 39.4087
Epoch: 8, Batch: 44, Global Step: 1060, Step Time: 5.5983, Loss: 39.4863
Epoch: 8, Batch: 45, Global Step: 1061, Step Time: 7.1514, Loss: 39.6275
Epoch: 8, Batch: 46, Global Step: 1062, Step Time: 4.8589, Loss: 39.5302
Epoch: 8, Batch: 47, Global Step: 1063, Step Time: 5.2503, Loss: 39.5308
Epoch: 8, Batch: 48, Global Step: 1064, Step Time: 4.3216, Loss: 39.6014
Epoch: 8, Batch: 49, Global Step: 1065, Step Time: 3.4582, Loss: 39.5068
Epoch: 8, Batch: 50, Global Step: 1066, Step Time: 5.4892, Loss: 39.4795
Epoch: 8, Batch: 51, Global Step: 1067, Step Time: 7.1000, Loss: 39.6693
Epoch: 8, Batch: 52, Global Step: 1068, Step Time: 5.9067, Loss: 39.7720
Epoch: 8, Batch: 53, Global Step: 1069, Step Time: 6.8319, Loss: 39.7641
Epoch: 8, Batch: 54, Global Step: 1070, Step Time: 5.8533, Loss: 39.6951
Epoch: 8, Batch: 55, Global Step: 1071, Step Time: 4.0001, Loss: 39.6332
Epoch: 8, Batch: 56, Global Step: 1072, Step Time: 3.7135, Loss: 39.5144
Epoch: 8, Batch: 57, Global Step: 1073, Step Time: 4.0418, Loss: 39.5599
Epoch: 8, Batch: 58, Global Step: 1074, Step Time: 3.7219, Loss: 39.5005
Epoch: 8, Batch: 59, Global Step: 1075, Step Time: 3.0921, Loss: 39.4362
Epoch: 8, Batch: 60, Global Step: 1076, Step Time: 3.8213, Loss: 39.4045
Epoch: 8, Batch: 61, Global Step: 1077, Step Time: 4.1089, Loss: 39.3640
Epoch: 8, Batch: 62, Global Step: 1078, Step Time: 4.1361, Loss: 39.2866
Epoch: 8, Batch: 63, Global Step: 1079, Step Time: 4.4703, Loss: 39.4152
Epoch: 8, Batch: 64, Global Step: 1080, Step Time: 7.2686, Loss: 39.4727
Epoch: 8, Batch: 65, Global Step: 1081, Step Time: 4.3916, Loss: 39.4014
Epoch: 8, Batch: 66, Global Step: 1082, Step Time: 3.0991, Loss: 39.3081
Epoch: 8, Batch: 67, Global Step: 1083, Step Time: 4.2945, Loss: 39.3207
Epoch: 8, Batch: 68, Global Step: 1084, Step Time: 4.8981, Loss: 39.3080
Epoch: 8, Batch: 69, Global Step: 1085, Step Time: 5.6380, Loss: 39.2580
Epoch: 8, Batch: 70, Global Step: 1086, Step Time: 8.1407, Loss: 39.3569
Epoch: 8, Batch: 71, Global Step: 1087, Step Time: 3.7241, Loss: 39.3175
Epoch: 8, Batch: 72, Global Step: 1088, Step Time: 3.8103, Loss: 39.2564
Epoch: 8, Batch: 73, Global Step: 1089, Step Time: 3.7095, Loss: 39.2235
Epoch: 8, Batch: 74, Global Step: 1090, Step Time: 4.4108, Loss: 39.1962
Epoch: 8, Batch: 75, Global Step: 1091, Step Time: 5.1347, Loss: 39.2036
Epoch: 8, Batch: 76, Global Step: 1092, Step Time: 6.0301, Loss: 39.2397
Epoch: 8, Batch: 77, Global Step: 1093, Step Time: 5.5764, Loss: 39.2709
Epoch: 8, Batch: 78, Global Step: 1094, Step Time: 4.8643, Loss: 39.2078
Epoch: 8, Batch: 79, Global Step: 1095, Step Time: 8.2896, Loss: 39.3249
Epoch: 8, Batch: 80, Global Step: 1096, Step Time: 5.2361, Loss: 39.3017
Epoch: 8, Batch: 81, Global Step: 1097, Step Time: 4.3674, Loss: 39.3020
Epoch: 8, Batch: 82, Global Step: 1098, Step Time: 5.6721, Loss: 39.3470
Epoch: 8, Batch: 83, Global Step: 1099, Step Time: 4.5462, Loss: 39.3476
Epoch: 8, Batch: 84, Global Step: 1100, Step Time: 7.6622, Loss: 39.5693
Epoch: 8, Batch: 85, Global Step: 1101, Step Time: 4.2278, Loss: 39.5437
Epoch: 8, Batch: 86, Global Step: 1102, Step Time: 6.0157, Loss: 39.6845
Epoch: 8, Batch: 87, Global Step: 1103, Step Time: 5.4929, Loss: 39.6251
Epoch: 8, Batch: 88, Global Step: 1104, Step Time: 5.6041, Loss: 39.6077
Epoch: 8, Batch: 89, Global Step: 1105, Step Time: 5.7622, Loss: 39.6608
Epoch: 8, Batch: 90, Global Step: 1106, Step Time: 4.0126, Loss: 39.6719
Epoch: 8, Batch: 91, Global Step: 1107, Step Time: 3.8435, Loss: 39.6262
Epoch: 8, Batch: 92, Global Step: 1108, Step Time: 5.5439, Loss: 39.6690
Epoch: 8, Batch: 93, Global Step: 1109, Step Time: 4.7043, Loss: 39.7104
Epoch: 8, Batch: 94, Global Step: 1110, Step Time: 3.2984, Loss: 39.6753
Epoch: 8, Batch: 95, Global Step: 1111, Step Time: 4.1792, Loss: 39.6510
Epoch: 8, Batch: 96, Global Step: 1112, Step Time: 3.8038, Loss: 39.6468
Epoch: 8, Batch: 97, Global Step: 1113, Step Time: 5.7049, Loss: 39.7106
Epoch: 8, Batch: 98, Global Step: 1114, Step Time: 3.3656, Loss: 39.6410
Epoch: 8, Batch: 99, Global Step: 1115, Step Time: 6.1684, Loss: 39.6519
Epoch: 8, Batch: 100, Global Step: 1116, Step Time: 6.3028, Loss: 39.6389
Epoch: 8, Batch: 101, Global Step: 1117, Step Time: 4.0098, Loss: 39.6252
Epoch: 8, Batch: 102, Global Step: 1118, Step Time: 6.5956, Loss: 39.6235
Epoch: 8, Batch: 103, Global Step: 1119, Step Time: 3.8640, Loss: 39.5645
Epoch: 8, Batch: 104, Global Step: 1120, Step Time: 3.3868, Loss: 39.5795
Epoch: 8, Batch: 105, Global Step: 1121, Step Time: 6.0349, Loss: 39.5672
Epoch: 8, Batch: 106, Global Step: 1122, Step Time: 6.3363, Loss: 39.5751
Epoch: 8, Batch: 107, Global Step: 1123, Step Time: 4.4853, Loss: 39.5958
Epoch: 8, Batch: 108, Global Step: 1124, Step Time: 4.7010, Loss: 39.5661
Epoch: 8, Batch: 109, Global Step: 1125, Step Time: 4.7559, Loss: 39.5754
Epoch: 8, Batch: 110, Global Step: 1126, Step Time: 3.1009, Loss: 39.5555
Epoch: 8, Batch: 111, Global Step: 1127, Step Time: 4.4782, Loss: 39.5518
Epoch: 8, Batch: 112, Global Step: 1128, Step Time: 3.6143, Loss: 39.5255
Epoch: 8, Batch: 113, Global Step: 1129, Step Time: 6.2479, Loss: 39.5590
Epoch: 8, Batch: 114, Global Step: 1130, Step Time: 6.2926, Loss: 39.6219
Epoch: 8, Batch: 115, Global Step: 1131, Step Time: 3.6887, Loss: 39.5817
Epoch: 8, Batch: 116, Global Step: 1132, Step Time: 3.5075, Loss: 39.5688
Epoch: 8, Batch: 117, Global Step: 1133, Step Time: 3.7316, Loss: 39.5410
Epoch: 8, Batch: 118, Global Step: 1134, Step Time: 5.1138, Loss: 39.5434
Epoch: 8, Batch: 119, Global Step: 1135, Step Time: 6.0783, Loss: 39.5550
Epoch: 8, Batch: 120, Global Step: 1136, Step Time: 4.1491, Loss: 39.5572
Epoch: 8, Batch: 121, Global Step: 1137, Step Time: 4.4835, Loss: 39.4914
Epoch: 8, Batch: 122, Global Step: 1138, Step Time: 5.2121, Loss: 39.4970
Epoch: 8, Batch: 123, Global Step: 1139, Step Time: 6.1222, Loss: 39.6433
Epoch: 8, Batch: 124, Global Step: 1140, Step Time: 4.5329, Loss: 39.6105
Epoch: 8, Batch: 125, Global Step: 1141, Step Time: 4.0046, Loss: 39.6121
Epoch: 8, Batch: 126, Global Step: 1142, Step Time: 3.7605, Loss: 39.5407
EPOCH RESULTS: Loss: 39.5407
Epoch: 9, Batch: 0, Global Step: 1143, Step Time: 4.7070, Loss: 38.4706
Epoch: 9, Batch: 1, Global Step: 1144, Step Time: 3.2815, Loss: 37.0353
Epoch: 9, Batch: 2, Global Step: 1145, Step Time: 3.3972, Loss: 35.1960
Epoch: 9, Batch: 3, Global Step: 1146, Step Time: 3.4332, Loss: 35.1824
Epoch: 9, Batch: 4, Global Step: 1147, Step Time: 3.4741, Loss: 35.1663
Epoch: 9, Batch: 5, Global Step: 1148, Step Time: 4.8255, Loss: 35.5628
Epoch: 9, Batch: 6, Global Step: 1149, Step Time: 3.4274, Loss: 35.0878
Epoch: 9, Batch: 7, Global Step: 1150, Step Time: 3.2277, Loss: 35.3061
Epoch: 9, Batch: 8, Global Step: 1151, Step Time: 4.1895, Loss: 35.5557
Epoch: 9, Batch: 9, Global Step: 1152, Step Time: 4.1446, Loss: 36.1693
Epoch: 9, Batch: 10, Global Step: 1153, Step Time: 4.7792, Loss: 37.0188
Epoch: 9, Batch: 11, Global Step: 1154, Step Time: 4.9501, Loss: 37.5196
Epoch: 9, Batch: 12, Global Step: 1155, Step Time: 5.6885, Loss: 37.4855
Epoch: 9, Batch: 13, Global Step: 1156, Step Time: 3.7187, Loss: 37.4277
Epoch: 9, Batch: 14, Global Step: 1157, Step Time: 4.4639, Loss: 37.4172
Epoch: 9, Batch: 15, Global Step: 1158, Step Time: 5.0526, Loss: 37.7079
Epoch: 9, Batch: 16, Global Step: 1159, Step Time: 5.5753, Loss: 37.7329
Epoch: 9, Batch: 17, Global Step: 1160, Step Time: 5.3830, Loss: 38.0898
Epoch: 9, Batch: 18, Global Step: 1161, Step Time: 4.6021, Loss: 37.8912
Epoch: 9, Batch: 19, Global Step: 1162, Step Time: 4.4325, Loss: 38.1430
Epoch: 9, Batch: 20, Global Step: 1163, Step Time: 6.8974, Loss: 37.9944
Epoch: 9, Batch: 21, Global Step: 1164, Step Time: 4.7521, Loss: 37.8657
Epoch: 9, Batch: 22, Global Step: 1165, Step Time: 5.8849, Loss: 37.8639
Epoch: 9, Batch: 23, Global Step: 1166, Step Time: 3.9664, Loss: 37.5846
Epoch: 9, Batch: 24, Global Step: 1167, Step Time: 4.5087, Loss: 38.4894
Epoch: 9, Batch: 25, Global Step: 1168, Step Time: 4.0225, Loss: 38.5874
Epoch: 9, Batch: 26, Global Step: 1169, Step Time: 4.2922, Loss: 38.6802
Epoch: 9, Batch: 27, Global Step: 1170, Step Time: 3.4663, Loss: 38.5555
Epoch: 9, Batch: 28, Global Step: 1171, Step Time: 3.5356, Loss: 38.4606
Epoch: 9, Batch: 29, Global Step: 1172, Step Time: 3.9000, Loss: 38.4469
Epoch: 9, Batch: 30, Global Step: 1173, Step Time: 4.2714, Loss: 38.4348
Epoch: 9, Batch: 31, Global Step: 1174, Step Time: 3.2217, Loss: 38.3331
Epoch: 9, Batch: 32, Global Step: 1175, Step Time: 3.9889, Loss: 38.3017
Epoch: 9, Batch: 33, Global Step: 1176, Step Time: 3.9504, Loss: 38.2214
Epoch: 9, Batch: 34, Global Step: 1177, Step Time: 4.0355, Loss: 38.2199
Epoch: 9, Batch: 35, Global Step: 1178, Step Time: 3.4618, Loss: 38.3565
Epoch: 9, Batch: 36, Global Step: 1179, Step Time: 3.8927, Loss: 38.3504
Epoch: 9, Batch: 37, Global Step: 1180, Step Time: 4.9985, Loss: 38.4329
Epoch: 9, Batch: 38, Global Step: 1181, Step Time: 5.4771, Loss: 38.6330
Epoch: 9, Batch: 39, Global Step: 1182, Step Time: 4.4584, Loss: 38.6899
Epoch: 9, Batch: 40, Global Step: 1183, Step Time: 2.8663, Loss: 38.6737
Epoch: 9, Batch: 41, Global Step: 1184, Step Time: 4.3631, Loss: 39.0014
Epoch: 9, Batch: 42, Global Step: 1185, Step Time: 4.3356, Loss: 39.0270
Epoch: 9, Batch: 43, Global Step: 1186, Step Time: 5.2270, Loss: 39.1683
Epoch: 9, Batch: 44, Global Step: 1187, Step Time: 5.7889, Loss: 39.2486
Epoch: 9, Batch: 45, Global Step: 1188, Step Time: 7.0492, Loss: 39.3896
Epoch: 9, Batch: 46, Global Step: 1189, Step Time: 4.9660, Loss: 39.2961
Epoch: 9, Batch: 47, Global Step: 1190, Step Time: 5.4352, Loss: 39.2994
Epoch: 9, Batch: 48, Global Step: 1191, Step Time: 4.1816, Loss: 39.3722
Epoch: 9, Batch: 49, Global Step: 1192, Step Time: 3.5235, Loss: 39.2774
Epoch: 9, Batch: 50, Global Step: 1193, Step Time: 5.6118, Loss: 39.2518
Epoch: 9, Batch: 51, Global Step: 1194, Step Time: 7.2714, Loss: 39.4424
Epoch: 9, Batch: 52, Global Step: 1195, Step Time: 6.0926, Loss: 39.5473
Epoch: 9, Batch: 53, Global Step: 1196, Step Time: 6.8219, Loss: 39.5393
Epoch: 9, Batch: 54, Global Step: 1197, Step Time: 5.7139, Loss: 39.4718
Epoch: 9, Batch: 55, Global Step: 1198, Step Time: 4.0364, Loss: 39.4093
Epoch: 9, Batch: 56, Global Step: 1199, Step Time: 3.8447, Loss: 39.2898
Epoch: 9, Batch: 57, Global Step: 1200, Step Time: 3.9931, Loss: 39.3370
Epoch: 9, Batch: 58, Global Step: 1201, Step Time: 3.5897, Loss: 39.2793
Epoch: 9, Batch: 59, Global Step: 1202, Step Time: 3.0466, Loss: 39.2150
Epoch: 9, Batch: 60, Global Step: 1203, Step Time: 3.7465, Loss: 39.1843
Epoch: 9, Batch: 61, Global Step: 1204, Step Time: 4.0122, Loss: 39.1453
Epoch: 9, Batch: 62, Global Step: 1205, Step Time: 4.2263, Loss: 39.0685
Epoch: 9, Batch: 63, Global Step: 1206, Step Time: 4.6223, Loss: 39.1999
Epoch: 9, Batch: 64, Global Step: 1207, Step Time: 7.1783, Loss: 39.2582
Epoch: 9, Batch: 65, Global Step: 1208, Step Time: 4.3041, Loss: 39.1849
Epoch: 9, Batch: 66, Global Step: 1209, Step Time: 3.0462, Loss: 39.0917
Epoch: 9, Batch: 67, Global Step: 1210, Step Time: 4.3050, Loss: 39.1035
Epoch: 9, Batch: 68, Global Step: 1211, Step Time: 4.8321, Loss: 39.0915
Epoch: 9, Batch: 69, Global Step: 1212, Step Time: 5.4606, Loss: 39.0406
Epoch: 9, Batch: 70, Global Step: 1213, Step Time: 7.8779, Loss: 39.1402
Epoch: 9, Batch: 71, Global Step: 1214, Step Time: 3.6234, Loss: 39.1028
Epoch: 9, Batch: 72, Global Step: 1215, Step Time: 3.6952, Loss: 39.0418
Epoch: 9, Batch: 73, Global Step: 1216, Step Time: 3.6116, Loss: 39.0097
Epoch: 9, Batch: 74, Global Step: 1217, Step Time: 4.2914, Loss: 38.9834
Epoch: 9, Batch: 75, Global Step: 1218, Step Time: 5.2253, Loss: 38.9905
Epoch: 9, Batch: 76, Global Step: 1219, Step Time: 5.9257, Loss: 39.0281
Epoch: 9, Batch: 77, Global Step: 1220, Step Time: 5.3923, Loss: 39.0600
Epoch: 9, Batch: 78, Global Step: 1221, Step Time: 4.7457, Loss: 38.9970
Epoch: 9, Batch: 79, Global Step: 1222, Step Time: 7.9492, Loss: 39.1138
Epoch: 9, Batch: 80, Global Step: 1223, Step Time: 5.1030, Loss: 39.0907
Epoch: 9, Batch: 81, Global Step: 1224, Step Time: 4.4182, Loss: 39.0909
Epoch: 9, Batch: 82, Global Step: 1225, Step Time: 5.8198, Loss: 39.1364
Epoch: 9, Batch: 83, Global Step: 1226, Step Time: 4.6779, Loss: 39.1366
Epoch: 9, Batch: 84, Global Step: 1227, Step Time: 7.5471, Loss: 39.3542
Epoch: 9, Batch: 85, Global Step: 1228, Step Time: 4.1125, Loss: 39.3293
Epoch: 9, Batch: 86, Global Step: 1229, Step Time: 5.7424, Loss: 39.4709
Epoch: 9, Batch: 87, Global Step: 1230, Step Time: 5.2518, Loss: 39.4117
Epoch: 9, Batch: 88, Global Step: 1231, Step Time: 5.4626, Loss: 39.3943
Epoch: 9, Batch: 89, Global Step: 1232, Step Time: 5.6251, Loss: 39.4479
Epoch: 9, Batch: 90, Global Step: 1233, Step Time: 3.9208, Loss: 39.4587
Epoch: 9, Batch: 91, Global Step: 1234, Step Time: 3.8314, Loss: 39.4110
Epoch: 9, Batch: 92, Global Step: 1235, Step Time: 5.4085, Loss: 39.4540
Epoch: 9, Batch: 93, Global Step: 1236, Step Time: 4.6269, Loss: 39.4947
Epoch: 9, Batch: 94, Global Step: 1237, Step Time: 3.2103, Loss: 39.4606
Epoch: 9, Batch: 95, Global Step: 1238, Step Time: 4.1502, Loss: 39.4362
Epoch: 9, Batch: 96, Global Step: 1239, Step Time: 3.8863, Loss: 39.4315
Epoch: 9, Batch: 97, Global Step: 1240, Step Time: 5.5341, Loss: 39.4954
Epoch: 9, Batch: 98, Global Step: 1241, Step Time: 3.5138, Loss: 39.4266
Epoch: 9, Batch: 99, Global Step: 1242, Step Time: 6.0745, Loss: 39.4374
Epoch: 9, Batch: 100, Global Step: 1243, Step Time: 6.1503, Loss: 39.4241
Epoch: 9, Batch: 101, Global Step: 1244, Step Time: 4.0330, Loss: 39.4099
Epoch: 9, Batch: 102, Global Step: 1245, Step Time: 6.8051, Loss: 39.4077
Epoch: 9, Batch: 103, Global Step: 1246, Step Time: 3.7242, Loss: 39.3502
Epoch: 9, Batch: 104, Global Step: 1247, Step Time: 3.4049, Loss: 39.3659
Epoch: 9, Batch: 105, Global Step: 1248, Step Time: 6.2528, Loss: 39.3538
Epoch: 9, Batch: 106, Global Step: 1249, Step Time: 6.4617, Loss: 39.3613
Epoch: 9, Batch: 107, Global Step: 1250, Step Time: 4.5652, Loss: 39.3816
Epoch: 9, Batch: 108, Global Step: 1251, Step Time: 4.8073, Loss: 39.3518
Epoch: 9, Batch: 109, Global Step: 1252, Step Time: 4.8745, Loss: 39.3612
Epoch: 9, Batch: 110, Global Step: 1253, Step Time: 3.1499, Loss: 39.3423
Epoch: 9, Batch: 111, Global Step: 1254, Step Time: 4.4049, Loss: 39.3380
Epoch: 9, Batch: 112, Global Step: 1255, Step Time: 3.5149, Loss: 39.3119
Epoch: 9, Batch: 113, Global Step: 1256, Step Time: 6.0153, Loss: 39.3446
Epoch: 9, Batch: 114, Global Step: 1257, Step Time: 6.1082, Loss: 39.4074
Epoch: 9, Batch: 115, Global Step: 1258, Step Time: 3.5519, Loss: 39.3671
Epoch: 9, Batch: 116, Global Step: 1259, Step Time: 3.3954, Loss: 39.3541
Epoch: 9, Batch: 117, Global Step: 1260, Step Time: 3.8078, Loss: 39.3261
Epoch: 9, Batch: 118, Global Step: 1261, Step Time: 5.1546, Loss: 39.3282
Epoch: 9, Batch: 119, Global Step: 1262, Step Time: 6.1892, Loss: 39.3400
Epoch: 9, Batch: 120, Global Step: 1263, Step Time: 4.0361, Loss: 39.3428
Epoch: 9, Batch: 121, Global Step: 1264, Step Time: 4.5249, Loss: 39.2769
Epoch: 9, Batch: 122, Global Step: 1265, Step Time: 5.3918, Loss: 39.2826
Epoch: 9, Batch: 123, Global Step: 1266, Step Time: 6.2330, Loss: 39.4269
Epoch: 9, Batch: 124, Global Step: 1267, Step Time: 4.6309, Loss: 39.3945
Epoch: 9, Batch: 125, Global Step: 1268, Step Time: 4.0163, Loss: 39.3957
Epoch: 9, Batch: 126, Global Step: 1269, Step Time: 3.8020, Loss: 39.3240
EPOCH RESULTS: Loss: 39.3240
Performing evaluation.
VALIDATION RESULTS: Time: 113.8872, Loss: 70.5400, Accuracy: 0.6138, WER: 1.0000
Saving checkpoint ./model/checkpoint_1270_70.5400.hdf5