Data Fusion Error Log

2020-12-16 05:25:25,044 - INFO [SparkRunnerDataStreamsSparkStreaming:i.c.c.d.DataStreamsSparkLauncher@81] - Pipeline '0e14cfa7-3f5f-11eb-8cf4-926eb109850e' is started by user 'root' with arguments logical.start.time=1608096320988, system.profile.name=SYSTEM:dataproc
2020-12-16 05:25:28,419 - INFO [SparkRunnerDataStreamsSparkStreaming:i.c.c.d.DataStreamsSparkLauncher@145] - Pipeline '0e14cfa7-3f5f-11eb-8cf4-926eb109850e' running
2020-12-16 05:25:28,699 - DEBUG [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:i.c.c.a.r.s.s.AbstractSparkSubmitter@164] - Calling SparkSubmit for program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming 17473e01-3f5f-11eb-afa9-6e362a42dcf1: [--master, local[3], --conf, spark.app.name=DataStreamsSparkStreaming, --conf, spark.executor.memory=2048m, --conf, spark.driver.memory=2048m, --conf, spark.master=local[3], --conf, spark.local.dir=/data/preview/tmp/1608096322160-0, --conf, spark.driver.cores=1, --conf, spark.executor.id=0e14cfa7-3f5f-11eb-8cf4-926eb109850e, --conf, spark.ui.port=0, --conf, spark.executor.cores=1, --conf, spark.metrics.conf=/data/preview/tmp/1608096322160-0/metrics.properties, --conf, spark.app.id=0e14cfa7-3f5f-11eb-8cf4-926eb109850e, --conf, spark.streaming.backpressure.enabled=true, --conf, spark.maxRemoteBlockSizeFetchToMem=2147483135, --conf, spark.extraListeners=io.cdap.cdap.app.runtime.spark.DelegatingSparkListener, --conf, spark.rpc.netty.dispatcher.numThreads=3, --conf, spark.spark.streaming.blockInterval=2000, --conf, spark.executor.instances=3, --conf, spark.cdap.localized.resources=[], --class, io.cdap.cdap.app.runtime.spark.SparkMainWrapper, /data/preview/runtime/spark/cdapSparkJob.jar, --cdap.spark.program=program_run:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming.17473e01-3f5f-11eb-afa9-6e362a42dcf1, --cdap.user.main.class=io.cdap.cdap.datastreams.SparkStreamingPipelineDriver]
2020-12-16 05:25:31,042 - INFO [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:i.c.c.a.r.s.SparkMainWrapper$@78] - Launching user spark class class io.cdap.cdap.datastreams.SparkStreamingPipelineDriver
2020-12-16 05:25:33,186 - WARN [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:o.a.s.SparkConf@66] - In Spark 1.0 and later spark.local.dir will be overridden by the value set by the cluster manager (via SPARK_LOCAL_DIRS in mesos/standalone and LOCAL_DIRS in YARN).
2020-12-16 05:25:38,013 - DEBUG [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:i.c.c.a.r.s.SparkMetricsSink@51] - Using SparkMetricsSink for reporting metrics: {class=io.cdap.cdap.app.runtime.spark.SparkMetricsSink}
2020-12-16 05:25:39,018 - WARN [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:o.a.s.s.StreamingContext@66] - StreamingContext has not been started yet
2020-12-16 05:25:39,800 - DEBUG [spark-submitter-DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1:i.c.c.a.r.s.SparkRuntimeEnv$@360] - Shutting down Server and ThreadPool used by Spark org.apache.spark.SparkContext@7a7f4b02
2020-12-16 05:25:39,872 - INFO [SparkRunnerDataStreamsSparkStreaming:i.c.c.d.DataStreamsSparkLauncher@153] - Pipeline '0e14cfa7-3f5f-11eb-8cf4-926eb109850e' failed
2020-12-16 05:25:39,881 - DEBUG [SparkRunnerDataStreamsSparkStreaming:i.c.c.a.r.s.SparkRuntimeService@893] - Running Spark shutdown hook org.apache.spark.util.SparkShutdownHookManager$$anon$2@139a0270
2020-12-16 05:25:39,941 - DEBUG [SparkRunnerDataStreamsSparkStreaming:i.c.c.a.r.s.SparkRuntimeService@376] - Spark program completed: SparkRuntimeContext{id=program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming, runId=17473e01-3f5f-11eb-afa9-6e362a42dcf1}
2020-12-16 05:25:40,064 - ERROR [SparkRunnerDataStreamsSparkStreaming:i.c.c.i.a.r.ProgramControllerServiceAdapter@92] - Spark Program 'DataStreamsSparkStreaming' failed.
java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:294) ~[com.google.guava.guava-13.0.1.jar:na]
    at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:281) ~[com.google.guava.guava-13.0.1.jar:na]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116) ~[com.google.guava.guava-13.0.1.jar:na]
    at io.cdap.cdap.app.runtime.spark.SparkRuntimeService.run(SparkRuntimeService.java:346) ~[io.cdap.cdap.cdap-spark-core2_2.11-6.2.3.jar:na]
    at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:52) ~[com.google.guava.guava-13.0.1.jar:na]
    at io.cdap.cdap.app.runtime.spark.SparkRuntimeService$5$1.run(SparkRuntimeService.java:404) [io.cdap.cdap.cdap-spark-core2_2.11-6.2.3.jar:na]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_265]
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/Logging
    at java.lang.ClassLoader.defineClass1(Native Method) ~[na:1.8.0_265]
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[na:1.8.0_265]
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[na:1.8.0_265]
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468) ~[na:1.8.0_265]
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[na:1.8.0_265]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[na:1.8.0_265]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[na:1.8.0_265]
    at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_265]
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[na:1.8.0_265]
    at io.cdap.cdap.common.lang.InterceptableClassLoader.findClass(InterceptableClassLoader.java:44) ~[na:na]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_265]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_265]
    at org.apache.spark.streaming.twitter.TwitterUtils$.createStream(TwitterUtils.scala:44) ~[na:na]
    at org.apache.spark.streaming.twitter.TwitterUtils$.createStream(TwitterUtils.scala:98) ~[na:na]
    at org.apache.spark.streaming.twitter.TwitterUtils.createStream(TwitterUtils.scala) ~[na:na]
    at io.cdap.plugin.spark.TwitterStreamingSourceUtil.getJavaDStream(TwitterStreamingSourceUtil.java:78) ~[na:na]
    at io.cdap.plugin.spark.TwitterStreamingSource.getStream(TwitterStreamingSource.java:72) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource$2.call(WrappedStreamingSource.java:76) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource$2.call(WrappedStreamingSource.java:73) ~[na:na]
    at io.cdap.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[na:na]
    at io.cdap.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource.getStream(WrappedStreamingSource.java:73) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineRunner.getSource(SparkStreamingPipelineRunner.java:108) ~[na:na]
    at io.cdap.cdap.etl.spark.SparkPipelineRunner.runPipeline(SparkPipelineRunner.java:247) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.lambda$run$2243f9e4$1(SparkStreamingPipelineDriver.java:232) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.run(SparkStreamingPipelineDriver.java:244) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.run(SparkStreamingPipelineDriver.java:175) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.SparkMainWrapper$.main(SparkMainWrapper.scala:87) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.SparkMainWrapper.main(SparkMainWrapper.scala) ~[na:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_265]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_265]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_265]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_265]
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:56) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.submit(AbstractSparkSubmitter.java:170) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.access$000(AbstractSparkSubmitter.java:54) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter$4.run(AbstractSparkSubmitter.java:109) ~[na:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_265]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_265]
    ... 1 common frames omitted
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[na:1.8.0_265]
    at io.cdap.cdap.common.lang.InterceptableClassLoader.findClass(InterceptableClassLoader.java:44) ~[na:na]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_265]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_265]
    ... 47 common frames omitted
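
[Editor's note, not part of the original log] The root cause in the chain above is the missing class org.apache.spark.Logging. That trait was a Spark 1.x API and was removed in Spark 2.0, while this run uses CDAP's cdap-spark-core2_2.11-6.2.3, i.e. a Spark 2.x runtime. The failure surfaces inside TwitterUtils.createStream, which suggests the Twitter streaming source plugin bundles a spark-streaming-twitter build compiled against Spark 1.x. A minimal diagnostic sketch follows, assuming you can run a small Java class on the same classpath the pipeline preview uses; the class name SparkLoggingProbe is made up for illustration:

public class SparkLoggingProbe {
    public static void main(String[] args) {
        try {
            // Present only on Spark 1.x classpaths; removed in Spark 2.0.
            Class.forName("org.apache.spark.Logging");
            System.out.println("org.apache.spark.Logging found: Spark 1.x-style classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("org.apache.spark.Logging missing: Spark 2.x runtime; "
                    + "jars compiled against Spark 1.x (e.g. an old spark-streaming-twitter) will not link");
        }
    }
}

If the probe reports the class as missing (expected on this 6.2.3 runtime), the usual remedy is a source plugin built against the Spark 2.x streaming APIs rather than adding the Spark 1.x jar back onto the classpath.
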
2020-12-16 05:25:40,111 - ERROR [SparkRunnerDataStreamsSparkStreaming:i.c.c.i.a.r.ProgramControllerServiceAdapter@93] - Spark program 'DataStreamsSparkStreaming' failed with error: org.apache.spark.Logging. Please check the system logs for more details.
java.lang.ClassNotFoundException: org.apache.spark.Logging
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[na:1.8.0_265]
    at io.cdap.cdap.common.lang.InterceptableClassLoader.findClass(InterceptableClassLoader.java:44) ~[na:na]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_265]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_265]
    at java.lang.ClassLoader.defineClass1(Native Method) ~[na:1.8.0_265]
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756) ~[na:1.8.0_265]
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[na:1.8.0_265]
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468) ~[na:1.8.0_265]
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[na:1.8.0_265]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[na:1.8.0_265]
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[na:1.8.0_265]
    at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_265]
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[na:1.8.0_265]
    at io.cdap.cdap.common.lang.InterceptableClassLoader.findClass(InterceptableClassLoader.java:44) ~[na:na]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[na:1.8.0_265]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[na:1.8.0_265]
    at org.apache.spark.streaming.twitter.TwitterUtils$.createStream(TwitterUtils.scala:44) ~[na:na]
    at org.apache.spark.streaming.twitter.TwitterUtils$.createStream(TwitterUtils.scala:98) ~[na:na]
    at org.apache.spark.streaming.twitter.TwitterUtils.createStream(TwitterUtils.scala) ~[na:na]
    at io.cdap.plugin.spark.TwitterStreamingSourceUtil.getJavaDStream(TwitterStreamingSourceUtil.java:78) ~[na:na]
    at io.cdap.plugin.spark.TwitterStreamingSource.getStream(TwitterStreamingSource.java:72) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource$2.call(WrappedStreamingSource.java:76) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource$2.call(WrappedStreamingSource.java:73) ~[na:na]
    at io.cdap.cdap.etl.common.plugin.Caller$1.call(Caller.java:30) ~[na:na]
    at io.cdap.cdap.etl.common.plugin.StageLoggingCaller.call(StageLoggingCaller.java:40) ~[na:na]
    at io.cdap.cdap.etl.spark.plugin.WrappedStreamingSource.getStream(WrappedStreamingSource.java:73) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineRunner.getSource(SparkStreamingPipelineRunner.java:108) ~[na:na]
    at io.cdap.cdap.etl.spark.SparkPipelineRunner.runPipeline(SparkPipelineRunner.java:247) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.lambda$run$2243f9e4$1(SparkStreamingPipelineDriver.java:232) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.run(SparkStreamingPipelineDriver.java:244) ~[na:na]
    at io.cdap.cdap.datastreams.SparkStreamingPipelineDriver.run(SparkStreamingPipelineDriver.java:175) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.SparkMainWrapper$.main(SparkMainWrapper.scala:87) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.SparkMainWrapper.main(SparkMainWrapper.scala) ~[na:na]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_265]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_265]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_265]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_265]
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:56) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137) ~[na:na]
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.submit(AbstractSparkSubmitter.java:170) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.access$000(AbstractSparkSubmitter.java:54) ~[na:na]
    at io.cdap.cdap.app.runtime.spark.submit.AbstractSparkSubmitter$4.run(AbstractSparkSubmitter.java:109) ~[na:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_265]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_265]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_265]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_265]
2020-12-16 05:25:40,139 - DEBUG [pcontroller-program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1-2:i.c.c.i.a.p.DefaultPreviewRunner@284] - Setting preview status for program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming to RUN_FAILED
2020-12-16 05:25:40,154 - DEBUG [pcontroller-program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1-2:i.c.c.a.r.AbstractProgramRuntimeService@561] - Removing RuntimeInfo: Spark DataStreamsSparkStreaming 17473e01-3f5f-11eb-afa9-6e362a42dcf1
2020-12-16 05:25:40,154 - DEBUG [pcontroller-program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming-17473e01-3f5f-11eb-afa9-6e362a42dcf1-2:i.c.c.a.r.AbstractProgramRuntimeService@564] - RuntimeInfo removed: RuntimeInfo{programId=program:default.0e14cfa7-3f5f-11eb-8cf4-926eb109850e.-SNAPSHOT.spark.DataStreamsSparkStreaming, twillRunId=null}
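
[Editor's note, not part of the original log] For context on where the linkage error is raised: the trace shows io.cdap.plugin.spark.TwitterStreamingSourceUtil.getJavaDStream calling TwitterUtils.createStream, and it is that call which forces loading of the Spark 1.x-only Logging trait. The standalone sketch below mimics the same call shape; it is a hypothetical reconstruction, not the plugin's actual source, and assumes a spark-streaming-twitter jar and twitter4j on the classpath with OAuth credentials supplied via twitter4j system properties:

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.twitter.TwitterUtils;
import twitter4j.Status;

public class TwitterStreamSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("twitter-stream-sketch");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(2));
        // On a Spark 2.x runtime this call fails with
        // NoClassDefFoundError: org/apache/spark/Logging if the
        // spark-streaming-twitter jar was compiled against Spark 1.x.
        JavaDStream<Status> tweets = TwitterUtils.createStream(jssc);
        tweets.map(Status::getText).print();
        jssc.start();
        jssc.awaitTermination();
    }
}

The same call succeeds when the twitter streaming jar matches the Spark major version of the runtime, which is consistent with the pipeline failing only at source creation time in the preview above.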