- 2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing broadcast 4
- 2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_4
- 2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_4 of size 2856 dropped from memory (free 257906076)
- 2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_4_piece0
- 2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_4_piece0 of size 1730 dropped from memory (free 257907806)
- 2015-01-21 19:00:22,212 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-49-145.ec2.internal:53512 in memory (size: 1730.0 B, free: 246.0 MB)
- 2015-01-21 19:00:22,212 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_4_piece0
- 2015-01-21 19:00:22,213 INFO [main] spark.SparkContext (Logging.scala:logInfo(59)) - Starting job: collect at <console>:12
- 2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Got job 5 (collect at <console>:12) with 2 output partitions (allowLocal=false)
- 2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Final stage: Stage 5(collect at <console>:12)
- 2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Parents of final stage: List()
- 2015-01-21 19:00:22,215 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Missing parents: List()
- 2015-01-21 19:00:22,216 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Submitting Stage 5 (UnionRDD[19] at union at <console>:12), which has no missing parents
- 2015-01-21 19:00:22,217 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - ensureFreeSpace(2856) called with curMem=10432, maxMem=257918238
- 2015-01-21 19:00:22,218 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_5 stored as values in memory (estimated size 2.8 KB, free 246.0 MB)
- 2015-01-21 19:00:22,218 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-60-199.ec2.internal:57517 in memory (size: 1730.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,219 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-60-198.ec2.internal:32912 in memory (size: 1730.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,219 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - ensureFreeSpace(1739) called with curMem=13288, maxMem=257918238
- 2015-01-21 19:00:22,220 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_5_piece0 stored as bytes in memory (estimated size 1739.0 B, free 246.0 MB)
- 2015-01-21 19:00:22,220 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-49-145.ec2.internal:53512 (size: 1739.0 B, free: 246.0 MB)
- 2015-01-21 19:00:22,221 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_5_piece0
- 2015-01-21 19:00:22,221 INFO [sparkDriver-akka.actor.default-dispatcher-3] spark.SparkContext (Logging.scala:logInfo(59)) - Created broadcast 5 from broadcast at DAGScheduler.scala:838
- 2015-01-21 19:00:22,222 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Submitting 2 missing tasks from Stage 5 (UnionRDD[19] at union at <console>:12)
- 2015-01-21 19:00:22,222 INFO [sparkDriver-akka.actor.default-dispatcher-3] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Adding task set 5.0 with 2 tasks
- 2015-01-21 19:00:22,223 INFO [Spark Context Cleaner] spark.ContextCleaner (Logging.scala:logInfo(59)) - Cleaned broadcast 4
- 2015-01-21 19:00:22,224 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.0 in stage 5.0 (TID 34, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,225 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.0 in stage 5.0 (TID 35, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,227 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-60-198.ec2.internal:32912 in memory (size: 1747.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,224 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing broadcast 3
- 2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_3_piece0
- 2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-60-199.ec2.internal:57517 in memory (size: 1747.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_3_piece0 of size 1747 dropped from memory (free 257904958)
- 2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-49-145.ec2.internal:53512 in memory (size: 1747.0 B, free: 246.0 MB)
- 2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_3_piece0
- 2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_3
- 2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_3 of size 2848 dropped from memory (free 257907806)
- 2015-01-21 19:00:22,233 INFO [Spark Context Cleaner] spark.ContextCleaner (Logging.scala:logInfo(59)) - Cleaned broadcast 3
- 2015-01-21 19:00:22,239 INFO [sparkDriver-akka.actor.default-dispatcher-17] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-60-198.ec2.internal:32912 (size: 1739.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,240 INFO [sparkDriver-akka.actor.default-dispatcher-17] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-60-199.ec2.internal:57517 (size: 1739.0 B, free: 535.0 MB)
- 2015-01-21 19:00:22,269 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(71)) - Lost task 1.0 in stage 5.0 (TID 35, ip-172-31-60-198.ec2.internal): java.lang.NullPointerException
- at $line26.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
- at $line26.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
- at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
- at scala.collection.Iterator$class.foreach(Iterator.scala:727)
- at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
- at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
- at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
- at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
- at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
- at scala.collection.AbstractIterator.to(Iterator.scala:1157)
- at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
- at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
- at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
- at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
- at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
- at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
- at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
- at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
- at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
- at org.apache.spark.scheduler.Task.run(Task.scala:56)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
- at java.lang.Thread.run(Thread.java:745)
- 2015-01-21 19:00:22,270 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.1 in stage 5.0 (TID 36, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,271 INFO [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.0 in stage 5.0 (TID 34) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 1]
- 2015-01-21 19:00:22,272 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.1 in stage 5.0 (TID 37, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,283 INFO [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.1 in stage 5.0 (TID 36) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 2]
- 2015-01-21 19:00:22,285 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.2 in stage 5.0 (TID 38, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,285 INFO [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.1 in stage 5.0 (TID 37) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 3]
- 2015-01-21 19:00:22,286 INFO [sparkDriver-akka.actor.default-dispatcher-14] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.2 in stage 5.0 (TID 39, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,298 INFO [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.2 in stage 5.0 (TID 39) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 4]
- 2015-01-21 19:00:22,298 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.3 in stage 5.0 (TID 40, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,299 INFO [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.2 in stage 5.0 (TID 38) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 5]
- 2015-01-21 19:00:22,300 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.3 in stage 5.0 (TID 41, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
- 2015-01-21 19:00:22,313 INFO [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.3 in stage 5.0 (TID 40) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 6]
- 2015-01-21 19:00:22,313 ERROR [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logError(75)) - Task 0 in stage 5.0 failed 4 times; aborting job
- 2015-01-21 19:00:22,314 INFO [task-result-getter-0] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Removed TaskSet 5.0, whose tasks have all completed, from pool
- 2015-01-21 19:00:22,314 INFO [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.3 in stage 5.0 (TID 41) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 7]
- 2015-01-21 19:00:22,314 INFO [task-result-getter-1] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Removed TaskSet 5.0, whose tasks have all completed, from pool
- 2015-01-21 19:00:22,315 INFO [sparkDriver-akka.actor.default-dispatcher-17] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Cancelling stage 5
- 2015-01-21 19:00:22,315 INFO [main] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Job 5 failed: collect at <console>:12, took 0.101138 s
- org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 5.0 failed 4 times, most recent failure: Lost task 0.3 in stage 5.0 (TID 40, ip-172-31-60-198.ec2.internal): java.lang.NullPointerException
- at $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
- at $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
- at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
- at scala.collection.Iterator$class.foreach(Iterator.scala:727)
- at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
- at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
- at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
- at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
- at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
- at scala.collection.AbstractIterator.to(Iterator.scala:1157)
- at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
- at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
- at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
- at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
- at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
- at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
- at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
- at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
- at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
- at org.apache.spark.scheduler.Task.run(Task.scala:56)
- at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
- at java.lang.Thread.run(Thread.java:745)
- Driver stacktrace:
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
- at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
- at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
- at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
- at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
- at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
- at akka.actor.ActorCell.invoke(ActorCell.scala:487)
- at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
- at akka.dispatch.Mailbox.run(Mailbox.scala:220)
- at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
- at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
- at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
- at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
- at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
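The frames `$anonfun$1.apply(<console>:12)` at the top of the trace point into the user closure compiled from the Spark shell, so the NullPointerException is thrown by user code inside a transformation, not by Spark itself; the scheduler retries each task (four attempts per task here, matching the default `spark.task.maxFailures`) and then aborts the job. A minimal sketch of the usual cause and a null-safe fix — the data and field names are hypothetical, not taken from this log, and plain Scala collections stand in for the RDD so the sketch runs without a cluster:

```scala
// Sketch of the failure pattern: a transformation closure that
// dereferences a possibly-null element. On an RDD the same closure
// would throw the NullPointerException seen in the log above.
object NpeSketch {
  def main(args: Array[String]): Unit = {
    val lines: Seq[String] = Seq("a,1", null, "b,2") // one null record

    // Unsafe: split is called on the null element and throws NPE.
    // val lengths = lines.map(line => line.split(",").length)

    // Null-safe: Option(x) turns null into None, so flatMap silently
    // drops null records and transforms only the defined ones.
    val lengths = lines.flatMap(line => Option(line).map(_.split(",").length))
    println(lengths) // prints List(2, 2)
  }
}
```

In a real job the same `Option(...)`-wrapping (or an explicit `filter(_ != null)` before the `map`) inside the RDD transformation avoids the repeated task failures and the eventual `SparkException: Job aborted due to stage failure`.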