stacktrace amazonkafka
by aironman, May 24th, 2016
16/05/24 16:55:44 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
16/05/24 16:55:44 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (KafkaRDD[74] at createDirectStream at AmazonKafkaConnectorWithMongo.scala:100)
16/05/24 16:55:44 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/05/24 16:55:44 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,ANY, 2024 bytes)
16/05/24 16:55:44 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/05/24 16:55:44 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/05/24 16:55:44 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

16/05/24 16:55:44 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
16/05/24 16:55:44 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/05/24 16:55:44 INFO TaskSchedulerImpl: Cancelling stage 0
16/05/24 16:55:44 INFO DAGScheduler: ResultStage 0 (runJob at KafkaRDD.scala:98) failed in 0,020 s
16/05/24 16:55:44 INFO DAGScheduler: Job 0 failed: runJob at KafkaRDD.scala:98, took 0,031171 s
16/05/24 16:55:44 INFO JobScheduler: Finished job streaming job 1464101744000 ms.0 from job set of time 1464101744000 ms
16/05/24 16:55:44 INFO JobScheduler: Total delay: 0,051 s for time 1464101744000 ms (execution: 0,042 s)
16/05/24 16:55:44 INFO KafkaRDD: Removing RDD 73 from persistence list
16/05/24 16:55:44 INFO BlockManager: Removing RDD 73
16/05/24 16:55:44 INFO ReceivedBlockTracker: Deleting batches ArrayBuffer()
16/05/24 16:55:44 INFO InputInfoTracker: remove old batch metadata: 1464101740000 ms
16/05/24 16:55:44 ERROR JobScheduler: Error running job streaming job 1464101744000 ms.0
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
    at org.apache.spark.streaming.kafka.KafkaRDD.take(KafkaRDD.scala:98)
    at example.spark.AmazonKafkaConnector$$anonfun$main$1.apply(AmazonKafkaConnectorWithMongo.scala:117)
    at example.spark.AmazonKafkaConnector$$anonfun$main$1.apply(AmazonKafkaConnectorWithMongo.scala:113)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    ... 3 more
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
    at org.apache.spark.streaming.kafka.KafkaRDD.take(KafkaRDD.scala:98)
    at example.spark.AmazonKafkaConnector$$anonfun$main$1.apply(AmazonKafkaConnectorWithMongo.scala:117)
    at example.spark.AmazonKafkaConnector$$anonfun$main$1.apply(AmazonKafkaConnectorWithMongo.scala:113)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:661)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:426)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:49)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:49)
    at scala.util.Try$.apply(Try.scala:161)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:224)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:224)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:223)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: org.apache.spark.util.SerializableConfiguration cannot be cast to [B
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    ... 3 more
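
The frames above pin the failure to an action running on the KafkaRDD inside foreachRDD: org.apache.spark.streaming.kafka.KafkaRDD.take (KafkaRDD.scala:98), invoked from example.spark.AmazonKafkaConnector at AmazonKafkaConnectorWithMongo.scala:113-117, against the direct stream created at line 100. Note what the cast failure means: [B is JVM shorthand for byte[], and ResultTask.runTask expects the broadcast holding the serialized task binary to contain a byte[]; instead the block it fetched holds a SerializableConfiguration, i.e. the task resolved the wrong broadcast. The paste does not include the source, so the following is only a minimal Scala sketch of the shape the trace describes; the broker list, topic name, batch interval, and take count are all assumptions.

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.streaming.{Seconds, StreamingContext}

object AmazonKafkaConnector {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("AmazonKafkaConnectorWithMongo").setMaster("local[2]")
    val ssc  = new StreamingContext(conf, Seconds(2))        // batch interval assumed

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")  // broker assumed
    val topics      = Set("amazonRatingsTopic")                        // topic name assumed

    // Line ~100 in the trace: receiver-less direct stream over Kafka.
    val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // Lines ~113-117 in the trace: an action (take) runs on each KafkaRDD.
    // take() is legal here and runs a Spark job; it is that job's task
    // which throws the ClassCastException while deserializing its broadcast.
    messages.foreachRDD { rdd =>
      rdd.take(10).foreach(println)
    }

    ssc.start()
    ssc.awaitTermination()
  }
}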
16/05/24 16:55:44 INFO StreamingContext: Invoking stop(stopGracefully=false) from shutdown hook
16/05/24 16:55:44 INFO JobGenerator: Stopping JobGenerator immediately
16/05/24 16:55:44 INFO RecurringTimer: Stopped timer for JobGenerator after time 1464101744000
16/05/24 16:55:44 INFO JobGenerator: Stopped JobGenerator
16/05/24 16:55:44 INFO JobScheduler: Stopped JobScheduler
16/05/24 16:55:44 INFO StreamingContext: StreamingContext stopped successfully
16/05/24 16:55:44 INFO SparkContext: Invoking stop() from shutdown hook
16/05/24 16:55:44 INFO SparkUI: Stopped Spark web UI at http://localhost:4041
16/05/24 16:55:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/24 16:55:44 INFO MemoryStore: MemoryStore cleared
16/05/24 16:55:44 INFO BlockManager: BlockManager stopped
16/05/24 16:55:44 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/24 16:55:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/24 16:55:44 INFO SparkContext: Successfully stopped SparkContext
16/05/24 16:55:44 INFO SparkContext: Invoking stop() from shutdown hook
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/24 16:55:44 INFO SparkUI: Stopped Spark web UI at http://192.168.1.35:4040
16/05/24 16:55:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/05/24 16:55:44 INFO MemoryStore: MemoryStore cleared
16/05/24 16:55:44 INFO BlockManager: BlockManager stopped
16/05/24 16:55:44 INFO BlockManagerMaster: BlockManagerMaster stopped
16/05/24 16:55:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/05/24 16:55:44 INFO SparkContext: Successfully stopped SparkContext
16/05/24 16:55:44 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/05/24 16:55:44 INFO ShutdownHookManager: Shutdown hook called
16/05/24 16:55:44 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-cf3acb3b-c056-4672-9396-596c7469b32a
16/05/24 16:55:44 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-cf3acb3b-c056-4672-9396-596c7469b32a/httpd-e3bb8815-8a12-46fd-baff-64d19edd1187
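
One detail worth flagging in this shutdown sequence: two Spark web UIs are stopped (http://localhost:4041 and http://192.168.1.35:4040) and the SparkContext stop() shutdown hook fires twice, so two SparkContexts were alive in the same JVM; Spark binds the UI to 4041 only when 4040 is already taken by another context. Multiple live contexts per JVM are unsupported in Spark 1.x, and a task resolving broadcast 0 of the wrong context is one plausible way to end up with a SerializableConfiguration where the task's byte[] binary should be. If the program does create a second context somewhere (the paste doesn't show where), a hedged sketch of the usual fix is to derive everything from one SparkContext:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Create exactly one SparkContext and build the streaming and SQL contexts
// from it, instead of constructing each from its own SparkConf (which would
// spawn a second SparkContext in the same JVM).
val conf = new SparkConf()
  .setAppName("AmazonKafkaConnectorWithMongo")
  .setMaster("local[2]")                         // master is an assumption

val sc  = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(2))   // reuses sc; batch interval assumed
val sqlContext = SQLContext.getOrCreate(sc)      // shares sc rather than creating a new context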