SparkPi Driver Pod Log

a guest
May 2nd, 2020
vagrant@kmaster:~/spark-2.4.5-bin-hadoop2.7$ kubectl logs spark-pi-1587656135431-driver
++ id -u
+ myuid=0
++ id -g
+ mygid=0
+ set +e
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/bash
+ set -e
+ '[' -z root:x:0:0:root:/root:/bin/bash ']'
+ SPARK_K8S_CMD=driver
+ case "$SPARK_K8S_CMD" in
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ sed 's/[^=]*=\(.*\)/\1/g'
+ sort -t_ -k4 -n
+ grep SPARK_JAVA_OPT_
+ readarray -t SPARK_EXECUTOR_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ PYSPARK_ARGS=
+ '[' -n '' ']'
+ R_ARGS=
+ '[' -n '' ']'
+ '[' '' == 2 ']'
+ '[' '' == 3 ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=192.168.41.156 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.SparkPi spark-internal 10000000
20/04/23 15:35:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/23 15:36:00 INFO SparkContext: Running Spark version 2.4.5
20/04/23 15:36:00 INFO SparkContext: Submitted application: Spark Pi
20/04/23 15:36:00 INFO SecurityManager: Changing view acls to: root
20/04/23 15:36:00 INFO SecurityManager: Changing modify acls to: root
20/04/23 15:36:00 INFO SecurityManager: Changing view acls groups to:
20/04/23 15:36:00 INFO SecurityManager: Changing modify acls groups to:
20/04/23 15:36:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
20/04/23 15:36:00 INFO Utils: Successfully started service 'sparkDriver' on port 7078.
20/04/23 15:36:00 INFO SparkEnv: Registering MapOutputTracker
20/04/23 15:36:00 INFO SparkEnv: Registering BlockManagerMaster
20/04/23 15:36:00 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/23 15:36:00 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/23 15:36:00 INFO DiskBlockManager: Created local directory at /var/data/spark-6ef68f6e-878f-4924-bb80-4d7d8561989c/blockmgr-852d1b3b-dd24-4108-a6da-42446e8aef49
20/04/23 15:36:00 INFO MemoryStore: MemoryStore started with capacity 413.9 MB
20/04/23 15:36:00 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/23 15:36:01 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/23 15:36:01 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://spark-pi-1587656135431-driver-svc.default.svc:4040
20/04/23 15:36:01 INFO SparkContext: Added JAR file:///opt/spark/examples/jars/spark-examples_2.11-2.4.5.jar at spark://spark-pi-1587656135431-driver-svc.default.svc:7078/jars/spark-examples_2.11-2.4.5.jar with timestamp 1587656161394
20/04/23 15:36:02 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: External scheduler cannot be instantiated
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2794)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Operation: [get]  for kind: [Pod]  with name: [spark-pi-1587656135431-driver]  in namespace: [default]  failed.
    at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
    at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:72)
    at io.fabric8.kubernetes.client.dsl.base.BaseOperation.getMandatory(BaseOperation.java:237)
    at io.fabric8.kubernetes.client.dsl.base.BaseOperation.get(BaseOperation.java:170)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:57)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.<init>(ExecutorPodsAllocator.scala:55)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:89)
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
    ... 20 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
    at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
    at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
    at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:894)
    ... many more (redacted to fit body)
20/04/23 15:36:02 INFO SparkUI: Stopped Spark web UI at http://spark-pi-1587656135431-driver-svc.default.svc:4040
20/04/23 15:36:03 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/23 15:36:03 INFO MemoryStore: MemoryStore cleared
20/04/23 15:36:03 INFO BlockManager: BlockManager stopped
20/04/23 15:36:03 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/23 15:36:03 WARN MetricsSystem: Stopping a MetricsSystem that is not running
20/04/23 15:36:03 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/23 15:36:03 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: External scheduler cannot be instantiated
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2794)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    ... many more (redacted to fit body)
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Operation: [get]  for kind: [Pod]  with name: [spark-pi-1587656135431-driver]  in namespace: [default]  failed.
    at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:64)
    at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:72)
    at io.fabric8.kubernetes.client.dsl.base.BaseOperation.getMandatory(BaseOperation.java:237)
    at io.fabric8.kubernetes.client.dsl.base.BaseOperation.get(BaseOperation.java:170)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:57)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator$$anonfun$1.apply(ExecutorPodsAllocator.scala:55)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.scheduler.cluster.k8s.ExecutorPodsAllocator.<init>(ExecutorPodsAllocator.scala:55)
    at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:89)
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
    ... 20 more
Caused by: java.net.SocketException: Broken pipe (Write failed)
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
    at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
    at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
    at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:894)
    at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:865)
    at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
    at okio.Okio$1.write(Okio.java:79)
    ... many more (redacted to fit body)
20/04/23 15:36:03 INFO ShutdownHookManager: Shutdown hook called
20/04/23 15:36:03 INFO ShutdownHookManager: Deleting directory /var/data/spark-6ef68f6e-878f-4924-bb80-4d7d8561989c/spark-07474c84-2ebb-49a0-a71d-3d716c95839b
20/04/23 15:36:03 INFO ShutdownHookManager: Deleting directory /tmp/spark-901da352-f688-4010-a937-f87d169be25e
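Note on the failure: the driver dies because a fabric8 Kubernetes client call ("Operation: [get] for kind: [Pod] ... in namespace: [default] failed") throws while building the executor-pod allocator, with a "Broken pipe" SocketException underneath. A broken pipe during a TLS write usually points at connectivity or trust problems between the driver pod and the API server, but this same `get Pod` call also fails when the driver's service account lacks pod permissions. A minimal sketch of the RBAC setup that Spark on Kubernetes documents (the `spark` service account name is an assumption, not taken from this log):

```shell
# Hypothetical fix sketch: give the driver a service account allowed to
# manage pods in its namespace, instead of the unprivileged default one.
kubectl create serviceaccount spark -n default
kubectl create clusterrolebinding spark-role \
  --clusterrole=edit \
  --serviceaccount=default:spark
```

The driver is then told to use that account at submit time via `--conf spark.kubernetes.authenticate.driver.serviceAccountName=spark`. If RBAC is already in place, the next things to check are the `spark.kubernetes.master` URL, the API server's TLS certificate, and pod-network connectivity to the control plane, since any of these can surface as the same broken-pipe write failure.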