a guest
Sep 14th, 2021
2021-09-13 15:24:04.008331: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
WARNING:absl:metadata_connection_config is not provided by IR.
INFO:absl:tensorflow_ranking is not available: No module named 'tensorflow_ranking'
INFO:absl:tensorflow_text is not available: No module named 'tensorflow_text'
INFO:apache_beam.typehints.native_type_compatibility:Using Any for unsupported type: typing.MutableMapping[str, typing.Any]
[previous message repeated 73 more times]
INFO:absl:tensorflow_text is not available: No module named 'tensorflow_text'
INFO:root:Component ImportExampleGen is running.
INFO:absl:Running launcher for node_info {
  type {
    name: "tfx.components.example_gen.import_example_gen.component.ImportExampleGen"
  }
  id: "ImportExampleGen"
}
contexts {
  contexts {
    type {
      name: "pipeline"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1"
      }
    }
  }
  contexts {
    type {
      name: "pipeline_run"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1-c8t8c"
      }
    }
  }
  contexts {
    type {
      name: "node"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1.ImportExampleGen"
      }
    }
  }
}
outputs {
  outputs {
    key: "examples"
    value {
      artifact_spec {
        type {
          name: "Examples"
          properties {
            key: "span"
            value: INT
          }
          properties {
            key: "split_names"
            value: STRING
          }
          properties {
            key: "version"
            value: INT
          }
        }
      }
    }
  }
}
parameters {
  parameters {
    key: "input_base"
    value {
      field_value {
        string_value: "gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/"
      }
    }
  }
  parameters {
    key: "input_config"
    value {
      field_value {
        string_value: "{\n \"splits\": [\n {\n \"name\": \"train\",\n \"pattern\": \"train/*\"\n },\n {\n \"name\": \"eval\",\n \"pattern\": \"eval/*\"\n }\n ]\n}"
      }
    }
  }
  parameters {
    key: "output_config"
    value {
      field_value {
        string_value: "{}"
      }
    }
  }
  parameters {
    key: "output_data_format"
    value {
      field_value {
        int_value: 6
      }
    }
  }
  parameters {
    key: "output_file_format"
    value {
      field_value {
        int_value: 5
      }
    }
  }
}
execution_options {
  caching_options {
  }
}
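The `input_config` parameter in the dump above is a JSON-serialized split configuration. Decoding it (a quick stdlib-only check, not part of the pipeline itself) shows the two declared splits:

```python
import json

# The exact string logged as the input_config parameter above.
raw = ('{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n'
       ' {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}')

cfg = json.loads(raw)
print([s["name"] for s in cfg["splits"]])  # ['train', 'eval']
```

So ImportExampleGen reads `train/*` and `eval/*` under `input_base` as its two splits.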
INFO:absl:MetadataStore with gRPC connection initialized
INFO:root:Adding KFP pod name kubeflow-pipelines-1-c8t8c-2322982872 to execution
INFO:absl:select span and version = (0, None)
INFO:absl:latest span and version = (0, None)
INFO:absl:select span and version = (0, None)
INFO:absl:latest span and version = (0, None)
INFO:absl:MetadataStore with gRPC connection initialized
INFO:absl:Going to run a new execution 3
INFO:absl:Going to run a new execution: ExecutionInfo(execution_id=3, input_dict={}, output_dict=defaultdict(<class 'list'>, {'examples': [Artifact(artifact: uri: "gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/examples/3"
custom_properties {
  key: "input_fingerprint"
  value {
    string_value: "split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912"
  }
}
custom_properties {
  key: "name"
  value {
    string_value: "kubeflow-pipelines-1:kubeflow-pipelines-1-c8t8c:ImportExampleGen:examples:0"
  }
}
custom_properties {
  key: "span"
  value {
    int_value: 0
  }
}
, artifact_type: name: "Examples"
properties {
  key: "span"
  value: INT
}
properties {
  key: "split_names"
  value: STRING
}
properties {
  key: "version"
  value: INT
}
)]}), exec_properties={'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912'}, execution_output_uri='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/executor_execution/3/executor_output.pb', stateful_working_dir='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/stateful_working_dir/kubeflow-pipelines-1-c8t8c', tmp_dir='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/executor_execution/3/.temp/', pipeline_node=node_info {
  type {
    name: "tfx.components.example_gen.import_example_gen.component.ImportExampleGen"
  }
  id: "ImportExampleGen"
}
contexts {
  contexts {
    type {
      name: "pipeline"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1"
      }
    }
  }
  contexts {
    type {
      name: "pipeline_run"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1-c8t8c"
      }
    }
  }
  contexts {
    type {
      name: "node"
    }
    name {
      field_value {
        string_value: "kubeflow-pipelines-1.ImportExampleGen"
      }
    }
  }
}
outputs {
  outputs {
    key: "examples"
    value {
      artifact_spec {
        type {
          name: "Examples"
          properties {
            key: "span"
            value: INT
          }
          properties {
            key: "split_names"
            value: STRING
          }
          properties {
            key: "version"
            value: INT
          }
        }
      }
    }
  }
}
parameters {
  parameters {
    key: "input_base"
    value {
      field_value {
        string_value: "gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/"
      }
    }
  }
  parameters {
    key: "input_config"
    value {
      field_value {
        string_value: "{\n \"splits\": [\n {\n \"name\": \"train\",\n \"pattern\": \"train/*\"\n },\n {\n \"name\": \"eval\",\n \"pattern\": \"eval/*\"\n }\n ]\n}"
      }
    }
  }
  parameters {
    key: "output_config"
    value {
      field_value {
        string_value: "{}"
      }
    }
  }
  parameters {
    key: "output_data_format"
    value {
      field_value {
        int_value: 6
      }
    }
  }
  parameters {
    key: "output_file_format"
    value {
      field_value {
        int_value: 5
      }
    }
  }
}
execution_options {
  caching_options {
  }
}
, pipeline_info=id: "kubeflow-pipelines-1"
, pipeline_run_id='kubeflow-pipelines-1-c8t8c')
INFO:absl:Generating examples.
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:absl:Reading input TFRecord data gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/train/*.
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.33609795570373535 seconds.
INFO:absl:Reading input TFRecord data gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/eval/*.
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.14176344871520996 seconds.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.31.0
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f4b40db47a0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f4b40db48c0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f4b40db4d40> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f4b40db4dd0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f4b40db4f80> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f4b40daf050> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f4b40daf170> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f4b40daf200> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f4b40daf290> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f4b40daf320> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f4b40daf560> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f4b40daf4d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f4b40daf5f0> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f4b3d71a290> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_InputToRecord-train-_ImportSerializedRecord-output_file_format-5-input_config-_6)+(ref_AppliedPTransform_InputToRecord-train-_ImportSerializedRecord-output_file_format-5-input_config-_7))+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.1442732810974121 seconds.
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.16584086418151855 seconds.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((ref_PCollection_PCollection_2_split/Read)+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_InputToRecord-train-ToTFExample_10))+(ref_AppliedPTransform_WriteSplit-train-MaybeSerialize_21))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-AddRandomKeys_23))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-ReshufflePerKey-Map-reify_timestamps-_25))+(WriteSplit[train]/Shuffle/ReshufflePerKey/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_InputToRecord-eval-_ImportSerializedRecord-output_file_format-5-input_config-n_15)+(ref_AppliedPTransform_InputToRecord-eval-_ImportSerializedRecord-output_file_format-5-input_config-n_16))+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_6_split/Write)
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.1833667755126953 seconds.
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.15505599975585938 seconds.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((ref_PCollection_PCollection_6_split/Read)+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_InputToRecord-eval-ToTFExample_19))+(ref_AppliedPTransform_WriteSplit-eval-MaybeSerialize_46))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-AddRandomKeys_48))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-ReshufflePerKey-Map-reify_timestamps-_50))+(WriteSplit[eval]/Shuffle/ReshufflePerKey/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-Impulse_58)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2979-_59))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-Map-decode-_61))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-InitializeWrite_62))+(ref_PCollection_PCollection_34/Write))+(ref_PCollection_PCollection_35/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((WriteSplit[eval]/Shuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-ReshufflePerKey-FlatMap-restore_timestamps-_52))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-RemoveRandomKeys_53))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-WindowInto-WindowIntoFn-_63))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-WriteBundles_64))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-Pair_65))+(WriteSplit[eval]/Write/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteSplit[eval]/Write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-Extract_67))+(ref_PCollection_PCollection_40/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_34/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-PreFinalize_68))+(ref_PCollection_PCollection_41/Write)
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 0 files in 0.15652847290039062 seconds.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_34/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-FinalizeWrite_69)
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.15037989616394043 seconds.
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 0 files in 0.15564775466918945 seconds.
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.71 seconds.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-Impulse_33)+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2979-_34))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-Map-decode-_36))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-InitializeWrite_37))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((WriteSplit[train]/Shuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-train-Shuffle-ReshufflePerKey-FlatMap-restore_timestamps-_27))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-RemoveRandomKeys_28))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-WindowInto-WindowIntoFn-_38))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-WriteBundles_39))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-Pair_40))+(WriteSplit[train]/Write/Write/WriteImpl/GroupByKey/Write)
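The `input_fingerprint` string that recurs throughout the log packs per-split statistics into a newline-separated, comma-delimited format. A small stdlib-only parser (a hypothetical helper for reading such logs, not part of TFX) makes the fields easier to inspect:

```python
def parse_fingerprint(fp: str) -> dict:
    """Parse a TFX input_fingerprint string such as
    'split:train,num_files:1,total_bytes:...' into {split: {field: value}}."""
    result = {}
    for line in fp.splitlines():
        # Each line is a comma-separated list of key:value pairs for one split.
        fields = dict(kv.split(":", 1) for kv in line.split(","))
        split = fields.pop("split")
        result[split] = {k: int(v) for k, v in fields.items()}
    return result

# The exact fingerprint recorded in the log above.
fp = ("split:train,num_files:1,total_bytes:662869926,"
      "xor_checksum:1631516430,sum_checksum:1631516430\n"
      "split:eval,num_files:1,total_bytes:78922753,"
      "xor_checksum:1631281912,sum_checksum:1631281912")

stats = parse_fingerprint(fp)
print(stats["train"]["total_bytes"])  # 662869926
```

Read this way, the run ingested one ~663 MB train file and one ~79 MB eval file; TFX compares this fingerprint across runs to decide whether a cached ImportExampleGen execution can be reused.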