- 2021-09-13 15:24:04.008331: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
- WARNING:absl:metadata_connection_config is not provided by IR.
- INFO:absl:tensorflow_ranking is not available: No module named 'tensorflow_ranking'
- INFO:absl:tensorflow_text is not available: No module named 'tensorflow_text'
- INFO:apache_beam.typehints.native_type_compatibility:Using Any for unsupported type: typing.MutableMapping[str, typing.Any]
- (previous message repeated 73 more times)
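The repeated message above comes from Beam's native-type conversion: parameterized typing constructs that it cannot map to a Beam type (here `typing.MutableMapping[str, typing.Any]`) are replaced with `Any`, which disables type checking for that value but is otherwise harmless. A stdlib-only analogue of that fallback, to illustrate the idea (a hypothetical helper, not Beam's actual implementation):

```python
import typing

# Illustrative fallback in the spirit of Beam's
# native_type_compatibility: map a small set of supported typing
# constructs through unchanged, and degrade anything else to Any.
SUPPORTED_ORIGINS = {list, dict, tuple, set}

def to_supported_type(hint):
    """Return the hint if it looks supported, else typing.Any."""
    if hint in (int, str, bytes, float, bool):
        return hint
    if typing.get_origin(hint) in SUPPORTED_ORIGINS:
        return hint
    # Unsupported construct -> Any (what the log line reports).
    return typing.Any

print(to_supported_type(typing.List[int]))
print(to_supported_type(typing.MutableMapping[str, typing.Any]))
```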
- INFO:absl:tensorflow_text is not available: No module named 'tensorflow_text'
- INFO:root:Component ImportExampleGen is running.
- INFO:absl:Running launcher for node_info {
- type {
- name: "tfx.components.example_gen.import_example_gen.component.ImportExampleGen"
- }
- id: "ImportExampleGen"
- }
- contexts {
- contexts {
- type {
- name: "pipeline"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1"
- }
- }
- }
- contexts {
- type {
- name: "pipeline_run"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1-c8t8c"
- }
- }
- }
- contexts {
- type {
- name: "node"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1.ImportExampleGen"
- }
- }
- }
- }
- outputs {
- outputs {
- key: "examples"
- value {
- artifact_spec {
- type {
- name: "Examples"
- properties {
- key: "span"
- value: INT
- }
- properties {
- key: "split_names"
- value: STRING
- }
- properties {
- key: "version"
- value: INT
- }
- }
- }
- }
- }
- }
- parameters {
- parameters {
- key: "input_base"
- value {
- field_value {
- string_value: "gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/"
- }
- }
- }
- parameters {
- key: "input_config"
- value {
- field_value {
- string_value: "{\n \"splits\": [\n {\n \"name\": \"train\",\n \"pattern\": \"train/*\"\n },\n {\n \"name\": \"eval\",\n \"pattern\": \"eval/*\"\n }\n ]\n}"
- }
- }
- }
- parameters {
- key: "output_config"
- value {
- field_value {
- string_value: "{}"
- }
- }
- }
- parameters {
- key: "output_data_format"
- value {
- field_value {
- int_value: 6
- }
- }
- }
- parameters {
- key: "output_file_format"
- value {
- field_value {
- int_value: 5
- }
- }
- }
- }
- execution_options {
- caching_options {
- }
- }
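The `input_config` parameter in the node dump above is a JSON-serialized split configuration: one `train` split reading `train/*` and one `eval` split reading `eval/*` under `input_base`. A plain-Python sketch of that structure, using only the values visible in the log:

```python
import json

# Splits exactly as shown in the ImportExampleGen parameters above.
input_config = {
    "splits": [
        {"name": "train", "pattern": "train/*"},
        {"name": "eval", "pattern": "eval/*"},
    ]
}

# TFX passes the configuration to the component as a JSON string,
# which is why it appears string-escaped inside the proto dump.
input_config_json = json.dumps(input_config, indent=1)
print(input_config_json)
```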
- INFO:absl:MetadataStore with gRPC connection initialized
- INFO:root:Adding KFP pod name kubeflow-pipelines-1-c8t8c-2322982872 to execution
- INFO:absl:select span and version = (0, None)
- INFO:absl:latest span and version = (0, None)
- INFO:absl:select span and version = (0, None)
- INFO:absl:latest span and version = (0, None)
- INFO:absl:MetadataStore with gRPC connection initialized
- INFO:absl:Going to run a new execution 3
- INFO:absl:Going to run a new execution: ExecutionInfo(execution_id=3, input_dict={}, output_dict=defaultdict(<class 'list'>, {'examples': [Artifact(artifact: uri: "gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/examples/3"
- custom_properties {
- key: "input_fingerprint"
- value {
- string_value: "split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912"
- }
- }
- custom_properties {
- key: "name"
- value {
- string_value: "kubeflow-pipelines-1:kubeflow-pipelines-1-c8t8c:ImportExampleGen:examples:0"
- }
- }
- custom_properties {
- key: "span"
- value {
- int_value: 0
- }
- }
- , artifact_type: name: "Examples"
- properties {
- key: "span"
- value: INT
- }
- properties {
- key: "split_names"
- value: STRING
- }
- properties {
- key: "version"
- value: INT
- }
- )]}), exec_properties={'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912'}, execution_output_uri='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/executor_execution/3/executor_output.pb', stateful_working_dir='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/stateful_working_dir/kubeflow-pipelines-1-c8t8c', tmp_dir='gs://ml-pipeline-proto-kubeflowpipelines-default/tfx_pipeline_output/kubeflow-pipelines-1/ImportExampleGen/.system/executor_execution/3/.temp/', pipeline_node=node_info {
- type {
- name: "tfx.components.example_gen.import_example_gen.component.ImportExampleGen"
- }
- id: "ImportExampleGen"
- }
- contexts {
- contexts {
- type {
- name: "pipeline"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1"
- }
- }
- }
- contexts {
- type {
- name: "pipeline_run"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1-c8t8c"
- }
- }
- }
- contexts {
- type {
- name: "node"
- }
- name {
- field_value {
- string_value: "kubeflow-pipelines-1.ImportExampleGen"
- }
- }
- }
- }
- outputs {
- outputs {
- key: "examples"
- value {
- artifact_spec {
- type {
- name: "Examples"
- properties {
- key: "span"
- value: INT
- }
- properties {
- key: "split_names"
- value: STRING
- }
- properties {
- key: "version"
- value: INT
- }
- }
- }
- }
- }
- }
- parameters {
- parameters {
- key: "input_base"
- value {
- field_value {
- string_value: "gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/"
- }
- }
- }
- parameters {
- key: "input_config"
- value {
- field_value {
- string_value: "{\n \"splits\": [\n {\n \"name\": \"train\",\n \"pattern\": \"train/*\"\n },\n {\n \"name\": \"eval\",\n \"pattern\": \"eval/*\"\n }\n ]\n}"
- }
- }
- }
- parameters {
- key: "output_config"
- value {
- field_value {
- string_value: "{}"
- }
- }
- }
- parameters {
- key: "output_data_format"
- value {
- field_value {
- int_value: 6
- }
- }
- }
- parameters {
- key: "output_file_format"
- value {
- field_value {
- int_value: 5
- }
- }
- }
- }
- execution_options {
- caching_options {
- }
- }
- , pipeline_info=id: "kubeflow-pipelines-1"
- , pipeline_run_id='kubeflow-pipelines-1-c8t8c')
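The `input_fingerprint` custom property recorded on the output artifact above packs per-split file statistics into a single newline-separated string. A small parser for it (a sketch; the format is taken directly from the fingerprint string in this log):

```python
# The fingerprint string exactly as it appears in the execution above.
fingerprint = (
    "split:train,num_files:1,total_bytes:662869926,"
    "xor_checksum:1631516430,sum_checksum:1631516430\n"
    "split:eval,num_files:1,total_bytes:78922753,"
    "xor_checksum:1631281912,sum_checksum:1631281912"
)

def parse_fingerprint(fp):
    """Parse one 'key:value,key:value,...' line per split into dicts."""
    splits = []
    for line in fp.splitlines():
        splits.append(dict(pair.split(":", 1) for pair in line.split(",")))
    return splits

for split in parse_fingerprint(fingerprint):
    print(split["split"], split["num_files"], split["total_bytes"])
```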
- INFO:absl:Generating examples.
- INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
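The "Missing pipeline option (runner)" notice means no `--runner` flag was supplied in the Beam pipeline args (they show up as `'_beam_pipeline_args': []` in the transform names later in this log), so Beam falls back to DirectRunner. Pinning the runner explicitly would look roughly like this (a sketch; `--runner` is a standard Beam flag, and the Dataflow project/region/bucket values below are placeholders, not taken from this log):

```python
# Standard Beam pipeline flags, typically passed to a TFX pipeline
# as beam_pipeline_args. Empty args -> DirectRunner fallback.
direct_runner_args = ["--runner=DirectRunner"]

# Hypothetical Dataflow equivalent; all values are placeholders.
dataflow_args = [
    "--runner=DataflowRunner",
    "--project=my-gcp-project",
    "--region=us-central1",
    "--temp_location=gs://my-bucket/tmp",
]

print(direct_runner_args[0])
```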
- INFO:absl:Reading input TFRecord data gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/train/*.
- INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
- INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.33609795570373535 seconds.
- INFO:absl:Reading input TFRecord data gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/eval/*.
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.14176344871520996 seconds.
- WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
- INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.31.0
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function annotate_downstream_side_inputs at 0x7f4b40db47a0> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function fix_side_input_pcoll_coders at 0x7f4b40db48c0> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7f4b40db4d40> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7f4b40db4dd0> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_sdf at 0x7f4b40db4f80> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function expand_gbk at 0x7f4b40daf050> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sink_flattens at 0x7f4b40daf170> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function greedily_fuse at 0x7f4b40daf200> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function read_to_impulse at 0x7f4b40daf290> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function impulse_to_input at 0x7f4b40daf320> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7f4b40daf560> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function setup_timer_mapping at 0x7f4b40daf4d0> ====================
- INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function populate_data_channel_coders at 0x7f4b40daf5f0> ====================
- INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
- INFO:apache_beam.runners.portability.fn_api_runner.worker_handlers:Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f4b3d71a290> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_InputToRecord-train-_ImportSerializedRecord-output_file_format-5-input_config-_6)+(ref_AppliedPTransform_InputToRecord-train-_ImportSerializedRecord-output_file_format-5-input_config-_7))+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.1442732810974121 seconds.
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.16584086418151855 seconds.
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((ref_PCollection_PCollection_2_split/Read)+(InputToRecord[train]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_InputToRecord-train-ToTFExample_10))+(ref_AppliedPTransform_WriteSplit-train-MaybeSerialize_21))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-AddRandomKeys_23))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-ReshufflePerKey-Map-reify_timestamps-_25))+(WriteSplit[train]/Shuffle/ReshufflePerKey/GroupByKey/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((ref_AppliedPTransform_InputToRecord-eval-_ImportSerializedRecord-output_file_format-5-input_config-n_15)+(ref_AppliedPTransform_InputToRecord-eval-_ImportSerializedRecord-output_file_format-5-input_config-n_16))+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_6_split/Write)
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.1833667755126953 seconds.
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 2 files in 0.15505599975585938 seconds.
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((ref_PCollection_PCollection_6_split/Read)+(InputToRecord[eval]/_ImportSerializedRecord({'output_file_format': 5, 'input_config': '{\n "splits": [\n {\n "name": "train",\n "pattern": "train/*"\n },\n {\n "name": "eval",\n "pattern": "eval/*"\n }\n ]\n}', 'input_base': 'gs://ml-pipeline-proto-kubeflowpipelines-default/input_dataset/', 'output_data_format': 6, 'output_config': '{}', 'span': 0, 'version': None, 'input_fingerprint': 'split:train,num_files:1,total_bytes:662869926,xor_checksum:1631516430,sum_checksum:1631516430\nsplit:eval,num_files:1,total_bytes:78922753,xor_checksum:1631281912,sum_checksum:1631281912', '_beam_pipeline_args': []})/ReadFromTFRecord/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_InputToRecord-eval-ToTFExample_19))+(ref_AppliedPTransform_WriteSplit-eval-MaybeSerialize_46))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-AddRandomKeys_48))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-ReshufflePerKey-Map-reify_timestamps-_50))+(WriteSplit[eval]/Shuffle/ReshufflePerKey/GroupByKey/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-Impulse_58)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2979-_59))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-DoOnce-Map-decode-_61))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-InitializeWrite_62))+(ref_PCollection_PCollection_34/Write))+(ref_PCollection_PCollection_35/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((WriteSplit[eval]/Shuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-ReshufflePerKey-FlatMap-restore_timestamps-_52))+(ref_AppliedPTransform_WriteSplit-eval-Shuffle-RemoveRandomKeys_53))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-WindowInto-WindowIntoFn-_63))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-WriteBundles_64))+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-Pair_65))+(WriteSplit[eval]/Write/Write/WriteImpl/GroupByKey/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteSplit[eval]/Write/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-Extract_67))+(ref_PCollection_PCollection_40/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_34/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-PreFinalize_68))+(ref_PCollection_PCollection_41/Write)
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 0 files in 0.15652847290039062 seconds.
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_34/Read)+(ref_AppliedPTransform_WriteSplit-eval-Write-Write-WriteImpl-FinalizeWrite_69)
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.15037989616394043 seconds.
- INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
- INFO:apache_beam.io.gcp.gcsio:Finished listing 0 files in 0.15564775466918945 seconds.
- INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
- INFO:apache_beam.io.filebasedsink:Renamed 1 shards in 0.71 seconds.
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-Impulse_33)+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-FlatMap-lambda-at-core-py-2979-_34))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-DoOnce-Map-decode-_36))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-InitializeWrite_37))+(ref_PCollection_PCollection_17/Write))+(ref_PCollection_PCollection_18/Write)
- INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((((((WriteSplit[train]/Shuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteSplit-train-Shuffle-ReshufflePerKey-FlatMap-restore_timestamps-_27))+(ref_AppliedPTransform_WriteSplit-train-Shuffle-RemoveRandomKeys_28))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-WindowInto-WindowIntoFn-_38))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-WriteBundles_39))+(ref_AppliedPTransform_WriteSplit-train-Write-Write-WriteImpl-Pair_40))+(WriteSplit[train]/Write/Write/WriteImpl/GroupByKey/Write)