*** Reading local file: /opt/airflow/logs/spark/spark-job/2021-07-25T19:24:21.538079+00:00/1.log
[2021-07-25 19:24:23,514] {taskinstance.py:877} INFO - Dependencies all met for <TaskInstance: spark.spark-job 2021-07-25T19:24:21.538079+00:00 [queued]>
[2021-07-25 19:24:23,556] {taskinstance.py:877} INFO - Dependencies all met for <TaskInstance: spark.spark-job 2021-07-25T19:24:21.538079+00:00 [queued]>
[2021-07-25 19:24:23,557] {taskinstance.py:1068} INFO -
--------------------------------------------------------------------------------
[2021-07-25 19:24:23,557] {taskinstance.py:1069} INFO - Starting attempt 1 of 4
[2021-07-25 19:24:23,557] {taskinstance.py:1070} INFO -
--------------------------------------------------------------------------------
[2021-07-25 19:24:23,576] {taskinstance.py:1089} INFO - Executing <Task(SparkSubmitOperator): spark-job> on 2021-07-25T19:24:21.538079+00:00
[2021-07-25 19:24:23,585] {standard_task_runner.py:52} INFO - Started process 204 to run task
[2021-07-25 19:24:23,604] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'spark', 'spark-job', '2021-07-25T19:24:21.538079+00:00', '--job-id', '76', '--pool', 'default_pool', '--raw', '--subdir', 'DAGS_FOLDER/dag.py', '--cfg-path', '/tmp/tmpgfkq2tbw', '--error-file', '/tmp/tmp8bd07z_t']
[2021-07-25 19:24:23,604] {standard_task_runner.py:77} INFO - Job 76: Subtask spark-job
[2021-07-25 19:24:23,730] {logging_mixin.py:104} INFO - Running <TaskInstance: spark.spark-job 2021-07-25T19:24:21.538079+00:00 [running]> on host 1506e808d44e
[2021-07-25 19:24:23,844] {taskinstance.py:1283} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_OWNER=daniel
AIRFLOW_CTX_DAG_ID=spark
AIRFLOW_CTX_TASK_ID=spark-job
AIRFLOW_CTX_EXECUTION_DATE=2021-07-25T19:24:21.538079+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-07-25T19:24:21.538079+00:00
[2021-07-25 19:24:23,863] {base.py:78} INFO - Using connection to: id: spark_default. Host: spark://spark, Port: 8080, Schema: , Login: airflow, Password: XXXXXXXX, extra: None
[2021-07-25 19:24:23,871] {spark_submit.py:364} INFO - Spark-Submit cmd: spark-submit --master spark://spark:8080 --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.2,postgresql:postgresql:9.1-901-1.jdbc4 --name arrow-spark /opt/airflow/dags/send_to_postgres.py
[2021-07-25 19:24:29,256] {spark_submit.py:526} INFO - WARNING: An illegal reflective access operation has occurred
[2021-07-25 19:24:29,257] {spark_submit.py:526} INFO - WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/airflow/.local/lib/python3.6/site-packages/pyspark/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
[2021-07-25 19:24:29,257] {spark_submit.py:526} INFO - WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
[2021-07-25 19:24:29,257] {spark_submit.py:526} INFO - WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
[2021-07-25 19:24:29,261] {spark_submit.py:526} INFO - WARNING: All illegal access operations will be denied in a future release
[2021-07-25 19:24:30,009] {spark_submit.py:526} INFO - :: loading settings :: url = jar:file:/home/airflow/.local/lib/python3.6/site-packages/pyspark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
[2021-07-25 19:24:30,404] {spark_submit.py:526} INFO - Ivy Default Cache set to: /home/airflow/.ivy2/cache
[2021-07-25 19:24:30,405] {spark_submit.py:526} INFO - The jars for the packages stored in: /home/airflow/.ivy2/jars
[2021-07-25 19:24:30,420] {spark_submit.py:526} INFO - org.apache.spark#spark-sql-kafka-0-10_2.12 added as a dependency
[2021-07-25 19:24:30,420] {spark_submit.py:526} INFO - postgresql#postgresql added as a dependency
[2021-07-25 19:24:30,424] {spark_submit.py:526} INFO - :: resolving dependencies :: org.apache.spark#spark-submit-parent-171e76cd-df32-41b8-b08c-fe50e77f059a;1.0
[2021-07-25 19:24:30,424] {spark_submit.py:526} INFO - confs: [default]
[2021-07-25 19:24:31,124] {spark_submit.py:526} INFO - found org.apache.spark#spark-sql-kafka-0-10_2.12;3.1.2 in central
[2021-07-25 19:24:31,507] {spark_submit.py:526} INFO - found org.apache.spark#spark-token-provider-kafka-0-10_2.12;3.1.2 in central
[2021-07-25 19:24:31,775] {spark_submit.py:526} INFO - found org.apache.kafka#kafka-clients;2.6.0 in central
[2021-07-25 19:24:31,879] {spark_submit.py:526} INFO - found com.github.luben#zstd-jni;1.4.8-1 in central
[2021-07-25 19:24:32,043] {spark_submit.py:526} INFO - found org.lz4#lz4-java;1.7.1 in central
[2021-07-25 19:24:32,123] {spark_submit.py:526} INFO - found org.xerial.snappy#snappy-java;1.1.8.2 in central
[2021-07-25 19:24:32,186] {spark_submit.py:526} INFO - found org.slf4j#slf4j-api;1.7.30 in central
[2021-07-25 19:24:32,244] {spark_submit.py:526} INFO - found org.spark-project.spark#unused;1.0.0 in central
[2021-07-25 19:24:32,302] {spark_submit.py:526} INFO - found org.apache.commons#commons-pool2;2.6.2 in central
[2021-07-25 19:24:32,347] {spark_submit.py:526} INFO - found postgresql#postgresql;9.1-901-1.jdbc4 in central
[2021-07-25 19:24:32,426] {spark_submit.py:526} INFO - :: resolution report :: resolve 1957ms :: artifacts dl 44ms
[2021-07-25 19:24:32,427] {spark_submit.py:526} INFO - :: modules in use:
[2021-07-25 19:24:32,432] {spark_submit.py:526} INFO - com.github.luben#zstd-jni;1.4.8-1 from central in [default]
[2021-07-25 19:24:32,433] {spark_submit.py:526} INFO - org.apache.commons#commons-pool2;2.6.2 from central in [default]
[2021-07-25 19:24:32,433] {spark_submit.py:526} INFO - org.apache.kafka#kafka-clients;2.6.0 from central in [default]
[2021-07-25 19:24:32,434] {spark_submit.py:526} INFO - org.apache.spark#spark-sql-kafka-0-10_2.12;3.1.2 from central in [default]
[2021-07-25 19:24:32,434] {spark_submit.py:526} INFO - org.apache.spark#spark-token-provider-kafka-0-10_2.12;3.1.2 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - org.lz4#lz4-java;1.7.1 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - org.slf4j#slf4j-api;1.7.30 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - org.spark-project.spark#unused;1.0.0 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - org.xerial.snappy#snappy-java;1.1.8.2 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - postgresql#postgresql;9.1-901-1.jdbc4 from central in [default]
[2021-07-25 19:24:32,436] {spark_submit.py:526} INFO - ---------------------------------------------------------------------
[2021-07-25 19:24:32,437] {spark_submit.py:526} INFO - | | modules || artifacts |
[2021-07-25 19:24:32,437] {spark_submit.py:526} INFO - | conf | number| search|dwnlded|evicted|| number|dwnlded|
[2021-07-25 19:24:32,437] {spark_submit.py:526} INFO - ---------------------------------------------------------------------
[2021-07-25 19:24:32,437] {spark_submit.py:526} INFO - | default | 10 | 0 | 0 | 0 || 10 | 0 |
[2021-07-25 19:24:32,437] {spark_submit.py:526} INFO - ---------------------------------------------------------------------
[2021-07-25 19:24:32,458] {spark_submit.py:526} INFO - :: retrieving :: org.apache.spark#spark-submit-parent-171e76cd-df32-41b8-b08c-fe50e77f059a
[2021-07-25 19:24:32,459] {spark_submit.py:526} INFO - confs: [default]
[2021-07-25 19:24:32,501] {spark_submit.py:526} INFO - 0 artifacts copied, 10 already retrieved (0kB/40ms)
[2021-07-25 19:24:33,725] {spark_submit.py:526} INFO - 21/07/25 19:24:33 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2021-07-25 19:24:37,433] {spark_submit.py:526} INFO - Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
[2021-07-25 19:24:37,461] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SparkContext: Running Spark version 3.1.1
[2021-07-25 19:24:37,557] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceUtils: ==============================================================
[2021-07-25 19:24:37,557] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceUtils: No custom resources configured for spark.driver.
[2021-07-25 19:24:37,561] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceUtils: ==============================================================
[2021-07-25 19:24:37,561] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SparkContext: Submitted application: myApp
[2021-07-25 19:24:37,639] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
[2021-07-25 19:24:37,667] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceProfile: Limiting resource is cpu
[2021-07-25 19:24:37,671] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO ResourceProfileManager: Added ResourceProfile id: 0
[2021-07-25 19:24:37,847] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SecurityManager: Changing view acls to: airflow
[2021-07-25 19:24:37,847] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SecurityManager: Changing modify acls to: airflow
[2021-07-25 19:24:37,848] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SecurityManager: Changing view acls groups to:
[2021-07-25 19:24:37,848] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SecurityManager: Changing modify acls groups to:
[2021-07-25 19:24:37,848] {spark_submit.py:526} INFO - 21/07/25 19:24:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(airflow); groups with view permissions: Set(); users with modify permissions: Set(airflow); groups with modify permissions: Set()
[2021-07-25 19:24:38,744] {spark_submit.py:526} INFO - 21/07/25 19:24:38 INFO Utils: Successfully started service 'sparkDriver' on port 37961.
[2021-07-25 19:24:38,875] {spark_submit.py:526} INFO - 21/07/25 19:24:38 INFO SparkEnv: Registering MapOutputTracker
[2021-07-25 19:24:38,976] {spark_submit.py:526} INFO - 21/07/25 19:24:38 INFO SparkEnv: Registering BlockManagerMaster
[2021-07-25 19:24:39,025] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
[2021-07-25 19:24:39,026] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
[2021-07-25 19:24:39,044] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
[2021-07-25 19:24:39,075] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-1c8a518d-1059-4948-924b-ac7a612f3f78
[2021-07-25 19:24:39,134] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO MemoryStore: MemoryStore started with capacity 434.4 MiB
[2021-07-25 19:24:39,177] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkEnv: Registering OutputCommitCoordinator
[2021-07-25 19:24:39,732] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO Utils: Successfully started service 'SparkUI' on port 4040.
[2021-07-25 19:24:39,914] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://1506e808d44e:4040
[2021-07-25 19:24:39,957] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar at spark://1506e808d44e:37961/jars/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar with timestamp 1627241077411
[2021-07-25 19:24:39,958] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/postgresql_postgresql-9.1-901-1.jdbc4.jar at spark://1506e808d44e:37961/jars/postgresql_postgresql-9.1-901-1.jdbc4.jar with timestamp 1627241077411
[2021-07-25 19:24:39,958] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar at spark://1506e808d44e:37961/jars/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar with timestamp 1627241077411
[2021-07-25 19:24:39,958] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.apache.kafka_kafka-clients-2.6.0.jar at spark://1506e808d44e:37961/jars/org.apache.kafka_kafka-clients-2.6.0.jar with timestamp 1627241077411
[2021-07-25 19:24:39,958] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.apache.commons_commons-pool2-2.6.2.jar at spark://1506e808d44e:37961/jars/org.apache.commons_commons-pool2-2.6.2.jar with timestamp 1627241077411
[2021-07-25 19:24:39,958] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at spark://1506e808d44e:37961/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1627241077411
[2021-07-25 19:24:39,961] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/com.github.luben_zstd-jni-1.4.8-1.jar at spark://1506e808d44e:37961/jars/com.github.luben_zstd-jni-1.4.8-1.jar with timestamp 1627241077411
[2021-07-25 19:24:39,963] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.lz4_lz4-java-1.7.1.jar at spark://1506e808d44e:37961/jars/org.lz4_lz4-java-1.7.1.jar with timestamp 1627241077411
[2021-07-25 19:24:39,963] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.8.2.jar at spark://1506e808d44e:37961/jars/org.xerial.snappy_snappy-java-1.1.8.2.jar with timestamp 1627241077411
[2021-07-25 19:24:39,963] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added JAR file:///home/airflow/.ivy2/jars/org.slf4j_slf4j-api-1.7.30.jar at spark://1506e808d44e:37961/jars/org.slf4j_slf4j-api-1.7.30.jar with timestamp 1627241077411
[2021-07-25 19:24:39,965] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar at spark://1506e808d44e:37961/files/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar with timestamp 1627241077411
[2021-07-25 19:24:39,971] {spark_submit.py:526} INFO - 21/07/25 19:24:39 INFO Utils: Copying /home/airflow/.ivy2/jars/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.apache.spark_spark-sql-kafka-0-10_2.12-3.1.2.jar
[2021-07-25 19:24:40,038] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/postgresql_postgresql-9.1-901-1.jdbc4.jar at spark://1506e808d44e:37961/files/postgresql_postgresql-9.1-901-1.jdbc4.jar with timestamp 1627241077411
[2021-07-25 19:24:40,038] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/postgresql_postgresql-9.1-901-1.jdbc4.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/postgresql_postgresql-9.1-901-1.jdbc4.jar
[2021-07-25 19:24:40,060] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar at spark://1506e808d44e:37961/files/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar with timestamp 1627241077411
[2021-07-25 19:24:40,060] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.apache.spark_spark-token-provider-kafka-0-10_2.12-3.1.2.jar
[2021-07-25 19:24:40,074] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.apache.kafka_kafka-clients-2.6.0.jar at spark://1506e808d44e:37961/files/org.apache.kafka_kafka-clients-2.6.0.jar with timestamp 1627241077411
[2021-07-25 19:24:40,074] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.apache.kafka_kafka-clients-2.6.0.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.apache.kafka_kafka-clients-2.6.0.jar
[2021-07-25 19:24:40,136] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.apache.commons_commons-pool2-2.6.2.jar at spark://1506e808d44e:37961/files/org.apache.commons_commons-pool2-2.6.2.jar with timestamp 1627241077411
[2021-07-25 19:24:40,137] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.apache.commons_commons-pool2-2.6.2.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.apache.commons_commons-pool2-2.6.2.jar
[2021-07-25 19:24:40,161] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at spark://1506e808d44e:37961/files/org.spark-project.spark_unused-1.0.0.jar with timestamp 1627241077411
[2021-07-25 19:24:40,161] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.spark-project.spark_unused-1.0.0.jar
[2021-07-25 19:24:40,203] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/com.github.luben_zstd-jni-1.4.8-1.jar at spark://1506e808d44e:37961/files/com.github.luben_zstd-jni-1.4.8-1.jar with timestamp 1627241077411
[2021-07-25 19:24:40,203] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/com.github.luben_zstd-jni-1.4.8-1.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/com.github.luben_zstd-jni-1.4.8-1.jar
[2021-07-25 19:24:40,253] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.lz4_lz4-java-1.7.1.jar at spark://1506e808d44e:37961/files/org.lz4_lz4-java-1.7.1.jar with timestamp 1627241077411
[2021-07-25 19:24:40,254] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.lz4_lz4-java-1.7.1.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.lz4_lz4-java-1.7.1.jar
[2021-07-25 19:24:40,281] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.8.2.jar at spark://1506e808d44e:37961/files/org.xerial.snappy_snappy-java-1.1.8.2.jar with timestamp 1627241077411
[2021-07-25 19:24:40,281] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.8.2.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.xerial.snappy_snappy-java-1.1.8.2.jar
[2021-07-25 19:24:40,313] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO SparkContext: Added file file:///home/airflow/.ivy2/jars/org.slf4j_slf4j-api-1.7.30.jar at spark://1506e808d44e:37961/files/org.slf4j_slf4j-api-1.7.30.jar with timestamp 1627241077411
[2021-07-25 19:24:40,313] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO Utils: Copying /home/airflow/.ivy2/jars/org.slf4j_slf4j-api-1.7.30.jar to /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333/userFiles-5865980b-a26d-4fc9-bd51-d874c5f857f4/org.slf4j_slf4j-api-1.7.30.jar
[2021-07-25 19:24:40,983] {spark_submit.py:526} INFO - 21/07/25 19:24:40 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:8080...
[2021-07-25 19:24:41,105] {spark_submit.py:526} INFO - 21/07/25 19:24:41 INFO TransportClientFactory: Successfully created connection to spark/172.19.0.3:8080 after 70 ms (0 ms spent in bootstraps)
[2021-07-25 19:24:41,156] {spark_submit.py:526} INFO - 21/07/25 19:24:41 WARN TransportChannelHandler: Exception in connection from spark/172.19.0.3:8080
[2021-07-25 19:24:41,156] {spark_submit.py:526} INFO - java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:24:41,156] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:24:41,156] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:24:41,157] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:24:41,158] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:24:41,158] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:24:41,158] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:24:41,158] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:24:41,160] {spark_submit.py:526} INFO - 21/07/25 19:24:41 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from spark/172.19.0.3:8080 is closed
[2021-07-25 19:24:41,164] {spark_submit.py:526} INFO - 21/07/25 19:24:41 WARN StandaloneAppClient$ClientEndpoint: Could not connect to spark:8080: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:24:41,165] {spark_submit.py:526} INFO - 21/07/25 19:24:41 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master spark:8080
[2021-07-25 19:24:41,165] {spark_submit.py:526} INFO - org.apache.spark.SparkException: Exception thrown in awaitResult:
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:107)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:24:41,166] {spark_submit.py:526} INFO - Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:24:41,167] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:24:41,168] {spark_submit.py:526} INFO - ... 1 more
[2021-07-25 19:25:00,980] {spark_submit.py:526} INFO - 21/07/25 19:25:00 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:8080...
[2021-07-25 19:25:00,987] {spark_submit.py:526} INFO - 21/07/25 19:25:00 INFO TransportClientFactory: Found inactive connection to spark/172.19.0.3:8080, creating a new one.
[2021-07-25 19:25:00,987] {spark_submit.py:526} INFO - 21/07/25 19:25:00 INFO TransportClientFactory: Successfully created connection to spark/172.19.0.3:8080 after 2 ms (0 ms spent in bootstraps)
[2021-07-25 19:25:01,013] {spark_submit.py:526} INFO - 21/07/25 19:25:01 WARN TransportChannelHandler: Exception in connection from spark/172.19.0.3:8080
[2021-07-25 19:25:01,015] {spark_submit.py:526} INFO - java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:01,015] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:25:01,015] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:25:01,015] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:25:01,016] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:25:01,017] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:25:01,017] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:25:01,017] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:25:01,018] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:25:01,018] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:25:01,018] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:01,018] {spark_submit.py:526} INFO - 21/07/25 19:25:01 WARN StandaloneAppClient$ClientEndpoint: Could not connect to spark:8080: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:01,027] {spark_submit.py:526} INFO - 21/07/25 19:25:01 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from spark/172.19.0.3:8080 is closed
[2021-07-25 19:25:01,030] {spark_submit.py:526} INFO - 21/07/25 19:25:01 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master spark:8080
[2021-07-25 19:25:01,033] {spark_submit.py:526} INFO - org.apache.spark.SparkException: Exception thrown in awaitResult:
[2021-07-25 19:25:01,033] {spark_submit.py:526} INFO - at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
[2021-07-25 19:25:01,033] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
[2021-07-25 19:25:01,033] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
[2021-07-25 19:25:01,034] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
[2021-07-25 19:25:01,034] {spark_submit.py:526} INFO - at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:107)
[2021-07-25 19:25:01,034] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[2021-07-25 19:25:01,034] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[2021-07-25 19:25:01,041] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[2021-07-25 19:25:01,041] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[2021-07-25 19:25:01,041] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:01,041] {spark_submit.py:526} INFO - Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:01,041] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:25:01,042] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:25:01,042] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:25:01,042] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:01,043] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:01,043] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:25:01,043] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:25:01,043] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:01,043] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:25:01,044] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:25:01,045] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:25:01,045] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:25:01,045] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:25:01,045] {spark_submit.py:526} INFO - ... 1 more
[2021-07-25 19:25:20,981] {spark_submit.py:526} INFO - 21/07/25 19:25:20 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://spark:8080...
[2021-07-25 19:25:20,988] {spark_submit.py:526} INFO - 21/07/25 19:25:20 INFO TransportClientFactory: Found inactive connection to spark/172.19.0.3:8080, creating a new one.
[2021-07-25 19:25:21,000] {spark_submit.py:526} INFO - 21/07/25 19:25:20 INFO TransportClientFactory: Successfully created connection to spark/172.19.0.3:8080 after 11 ms (0 ms spent in bootstraps)
[2021-07-25 19:25:21,049] {spark_submit.py:526} INFO - 21/07/25 19:25:21 WARN TransportChannelHandler: Exception in connection from spark/172.19.0.3:8080
[2021-07-25 19:25:21,051] {spark_submit.py:526} INFO - java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:21,051] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:25:21,051] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:25:21,052] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:25:21,052] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:21,052] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:21,060] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:25:21,061] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:25:21,061] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:21,061] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:21,063] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:25:21,068] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:25:21,072] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:25:21,075] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:25:21,075] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:25:21,075] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:25:21,076] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:25:21,076] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:25:21,076] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:25:21,076] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:21,076] {spark_submit.py:526} INFO - 21/07/25 19:25:21 WARN StandaloneAppClient$ClientEndpoint: Could not connect to spark:8080: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - 21/07/25 19:25:21 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from spark/172.19.0.3:8080 is closed
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - 21/07/25 19:25:21 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master spark:8080
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - org.apache.spark.SparkException: Exception thrown in awaitResult:
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
[2021-07-25 19:25:21,079] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:107)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:21,080] {spark_submit.py:526} INFO - Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[2021-07-25 19:25:21,081] {spark_submit.py:526} INFO - at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[2021-07-25 19:25:21,082] {spark_submit.py:526} INFO - at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[2021-07-25 19:25:21,082] {spark_submit.py:526} INFO - at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
[2021-07-25 19:25:21,082] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
[2021-07-25 19:25:21,085] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
[2021-07-25 19:25:21,085] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
[2021-07-25 19:25:21,086] {spark_submit.py:526} INFO - at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
[2021-07-25 19:25:21,086] {spark_submit.py:526} INFO - at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
[2021-07-25 19:25:21,086] {spark_submit.py:526} INFO - at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[2021-07-25 19:25:21,086] {spark_submit.py:526} INFO - at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[2021-07-25 19:25:21,086] {spark_submit.py:526} INFO - ... 1 more
[2021-07-25 19:25:40,981] {spark_submit.py:526} INFO - 21/07/25 19:25:40 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
[2021-07-25 19:25:40,981] {spark_submit.py:526} INFO - 21/07/25 19:25:40 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
[2021-07-25 19:25:40,997] {spark_submit.py:526} INFO - 21/07/25 19:25:40 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 33831.
[2021-07-25 19:25:40,998] {spark_submit.py:526} INFO - 21/07/25 19:25:40 INFO NettyBlockTransferService: Server created on 1506e808d44e:33831
[2021-07-25 19:25:41,000] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
[2021-07-25 19:25:41,006] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO SparkUI: Stopped Spark web UI at http://1506e808d44e:4040
[2021-07-25 19:25:41,018] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 1506e808d44e, 33831, None)
[2021-07-25 19:25:41,021] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO StandaloneSchedulerBackend: Shutting down all executors
[2021-07-25 19:25:41,030] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
[2021-07-25 19:25:41,031] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManagerMasterEndpoint: Registering block manager 1506e808d44e:33831 with 434.4 MiB RAM, BlockManagerId(driver, 1506e808d44e, 33831, None)
[2021-07-25 19:25:41,044] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 1506e808d44e, 33831, None)
[2021-07-25 19:25:41,046] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 1506e808d44e, 33831, None)
[2021-07-25 19:25:41,047] {spark_submit.py:526} INFO - 21/07/25 19:25:41 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
[2021-07-25 19:25:41,080] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
[2021-07-25 19:25:41,103] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO MemoryStore: MemoryStore cleared
[2021-07-25 19:25:41,104] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManager: BlockManager stopped
[2021-07-25 19:25:41,115] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO BlockManagerMaster: BlockManagerMaster stopped
[2021-07-25 19:25:41,120] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[2021-07-25 19:25:41,137] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO SparkContext: Successfully stopped SparkContext
[2021-07-25 19:25:41,277] {spark_submit.py:526} INFO - 21/07/25 19:25:41 ERROR SparkContext: Error initializing SparkContext.
[2021-07-25 19:25:41,278] {spark_submit.py:526} INFO - java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
[2021-07-25 19:25:41,279] {spark_submit.py:526} INFO - at scala.Predef$.require(Predef.scala:281)
[2021-07-25 19:25:41,280] {spark_submit.py:526} INFO - at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:92)
[2021-07-25 19:25:41,280] {spark_submit.py:526} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:597)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.Gateway.invoke(Gateway.java:238)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
[2021-07-25 19:25:41,281] {spark_submit.py:526} INFO - at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO SparkContext: SparkContext already stopped.
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - Traceback (most recent call last):
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/opt/airflow/dags/send_to_postgres.py", line 9, in <module>
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - .appName("myApp") \
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/session.py", line 228, in getOrCreate
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 384, in getOrCreate
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 147, in __init__
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 209, in _do_init
[2021-07-25 19:25:41,282] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 321, in _initialize_context
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py", line 1569, in __call__
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - File "/home/airflow/.local/lib/python3.6/site-packages/pyspark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py", line 328, in get_return_value
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - : java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - at scala.Predef$.require(Predef.scala:281)
[2021-07-25 19:25:41,285] {spark_submit.py:526} INFO - at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:92)
[2021-07-25 19:25:41,287] {spark_submit.py:526} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:597)
[2021-07-25 19:25:41,287] {spark_submit.py:526} INFO - at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
[2021-07-25 19:25:41,287] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[2021-07-25 19:25:41,287] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[2021-07-25 19:25:41,288] {spark_submit.py:526} INFO - at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[2021-07-25 19:25:41,289] {spark_submit.py:526} INFO - at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
[2021-07-25 19:25:41,289] {spark_submit.py:526} INFO - at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
[2021-07-25 19:25:41,289] {spark_submit.py:526} INFO - at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO - at py4j.Gateway.invoke(Gateway.java:238)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO - at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO - at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO - at py4j.GatewayConnection.run(GatewayConnection.java:238)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO - at java.base/java.lang.Thread.run(Thread.java:829)
[2021-07-25 19:25:41,290] {spark_submit.py:526} INFO -
[2021-07-25 19:25:41,383] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO ShutdownHookManager: Shutdown hook called
[2021-07-25 19:25:41,385] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO ShutdownHookManager: Deleting directory /tmp/spark-6d0b4ce8-4750-45f0-8074-d06e159ca333
[2021-07-25 19:25:41,398] {spark_submit.py:526} INFO - 21/07/25 19:25:41 INFO ShutdownHookManager: Deleting directory /tmp/spark-abe64501-15cd-4f02-88aa-b3479552b073
[2021-07-25 19:25:41,482] {taskinstance.py:1482} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/apache/spark/operators/spark_submit.py", line 183, in execute
    self._hook.submit(self._application)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/providers/apache/spark/hooks/spark_submit.py", line 455, in submit
    self._mask_cmd(spark_submit_cmd), returncode
airflow.exceptions.AirflowException: Cannot execute: spark-submit --master spark://spark:8080 --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.2,postgresql:postgresql:9.1-901-1.jdbc4 --name arrow-spark /opt/airflow/dags/send_to_postgres.py. Error code is: 1.
[2021-07-25 19:25:41,487] {taskinstance.py:1532} INFO - Marking task as UP_FOR_RETRY. dag_id=spark, task_id=spark-job, execution_date=20210725T192421, start_date=20210725T192423, end_date=20210725T192541
[2021-07-25 19:25:41,540] {local_task_job.py:146} INFO - Task exited with return code 1
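
The repeated "java.lang.IllegalArgumentException: Too large frame: 5211883372140375593" above is the classic symptom of Spark's RPC frame decoder reading a reply from a service that does not speak Spark's wire protocol. Both the spark_default connection and the generated submit command target spark://spark:8080, but 8080 is the standalone master's default web UI (HTTP) port; the master's RPC endpoint listens on port 7077 by default. Below is a minimal fix sketch, assuming the connection id, DAG, task, and application names shown in this log and a master running on default ports (an assumption, not confirmed by the log; the master's web UI front page displays the actual spark:// URL to use):

# dag.py (sketch) -- the operator itself looks fine; the likely fix is in the
# spark_default connection, which currently resolves to spark://spark:8080.
# Repoint it at the master's RPC port, e.g. via the Airflow CLI:
#
#   airflow connections add spark_default --conn-type spark \
#       --conn-host spark://spark --conn-port 7077
#
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(dag_id="spark", start_date=datetime(2021, 7, 25), schedule_interval=None) as dag:
    submit = SparkSubmitOperator(
        task_id="spark-job",
        conn_id="spark_default",  # must now resolve to spark://spark:7077 (assumed default RPC port)
        application="/opt/airflow/dags/send_to_postgres.py",
        packages="org.apache.spark:spark-sql-kafka-0-10_2.12:3.1.2,"
                 "postgresql:postgresql:9.1-901-1.jdbc4",
        retries=3,  # matches "Starting attempt 1 of 4" above
    )

With the connection corrected, the generated command becomes spark-submit --master spark://spark:7077 ..., and the StandaloneAppClient handshake should succeed instead of choking on an HTTP response.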