2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing broadcast 4
2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_4
2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_4 of size 2856 dropped from memory (free 257906076)
2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_4_piece0
2015-01-21 19:00:22,211 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_4_piece0 of size 1730 dropped from memory (free 257907806)
2015-01-21 19:00:22,212 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-49-145.ec2.internal:53512 in memory (size: 1730.0 B, free: 246.0 MB)
2015-01-21 19:00:22,212 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_4_piece0
2015-01-21 19:00:22,213 INFO [main] spark.SparkContext (Logging.scala:logInfo(59)) - Starting job: collect at <console>:12
2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Got job 5 (collect at <console>:12) with 2 output partitions (allowLocal=false)
2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Final stage: Stage 5(collect at <console>:12)
2015-01-21 19:00:22,214 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Parents of final stage: List()
2015-01-21 19:00:22,215 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Missing parents: List()
2015-01-21 19:00:22,216 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Submitting Stage 5 (UnionRDD[19] at union at <console>:12), which has no missing parents
2015-01-21 19:00:22,217 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - ensureFreeSpace(2856) called with curMem=10432, maxMem=257918238
2015-01-21 19:00:22,218 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_5 stored as values in memory (estimated size 2.8 KB, free 246.0 MB)
2015-01-21 19:00:22,218 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-60-199.ec2.internal:57517 in memory (size: 1730.0 B, free: 535.0 MB)
2015-01-21 19:00:22,219 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_4_piece0 on ip-172-31-60-198.ec2.internal:32912 in memory (size: 1730.0 B, free: 535.0 MB)
2015-01-21 19:00:22,219 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - ensureFreeSpace(1739) called with curMem=13288, maxMem=257918238
2015-01-21 19:00:22,220 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_5_piece0 stored as bytes in memory (estimated size 1739.0 B, free 246.0 MB)
2015-01-21 19:00:22,220 INFO [sparkDriver-akka.actor.default-dispatcher-13] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-49-145.ec2.internal:53512 (size: 1739.0 B, free: 246.0 MB)
2015-01-21 19:00:22,221 INFO [sparkDriver-akka.actor.default-dispatcher-3] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_5_piece0
2015-01-21 19:00:22,221 INFO [sparkDriver-akka.actor.default-dispatcher-3] spark.SparkContext (Logging.scala:logInfo(59)) - Created broadcast 5 from broadcast at DAGScheduler.scala:838
2015-01-21 19:00:22,222 INFO [sparkDriver-akka.actor.default-dispatcher-3] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Submitting 2 missing tasks from Stage 5 (UnionRDD[19] at union at <console>:12)
2015-01-21 19:00:22,222 INFO [sparkDriver-akka.actor.default-dispatcher-3] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Adding task set 5.0 with 2 tasks
2015-01-21 19:00:22,223 INFO [Spark Context Cleaner] spark.ContextCleaner (Logging.scala:logInfo(59)) - Cleaned broadcast 4
2015-01-21 19:00:22,224 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.0 in stage 5.0 (TID 34, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,225 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.0 in stage 5.0 (TID 35, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,227 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-60-198.ec2.internal:32912 in memory (size: 1747.0 B, free: 535.0 MB)
2015-01-21 19:00:22,224 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing broadcast 3
2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_3_piece0
2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-60-199.ec2.internal:57517 in memory (size: 1747.0 B, free: 535.0 MB)
2015-01-21 19:00:22,228 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_3_piece0 of size 1747 dropped from memory (free 257904958)
2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-16] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Removed broadcast_3_piece0 on ip-172-31-49-145.ec2.internal:53512 in memory (size: 1747.0 B, free: 246.0 MB)
2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManagerMaster (Logging.scala:logInfo(59)) - Updated info of block broadcast_3_piece0
2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.BlockManager (Logging.scala:logInfo(59)) - Removing block broadcast_3
2015-01-21 19:00:22,229 INFO [sparkDriver-akka.actor.default-dispatcher-2] storage.MemoryStore (Logging.scala:logInfo(59)) - Block broadcast_3 of size 2848 dropped from memory (free 257907806)
2015-01-21 19:00:22,233 INFO [Spark Context Cleaner] spark.ContextCleaner (Logging.scala:logInfo(59)) - Cleaned broadcast 3
2015-01-21 19:00:22,239 INFO [sparkDriver-akka.actor.default-dispatcher-17] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-60-198.ec2.internal:32912 (size: 1739.0 B, free: 535.0 MB)
2015-01-21 19:00:22,240 INFO [sparkDriver-akka.actor.default-dispatcher-17] storage.BlockManagerInfo (Logging.scala:logInfo(59)) - Added broadcast_5_piece0 in memory on ip-172-31-60-199.ec2.internal:57517 (size: 1739.0 B, free: 535.0 MB)
2015-01-21 19:00:22,269 WARN [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logWarning(71)) - Lost task 1.0 in stage 5.0 (TID 35, ip-172-31-60-198.ec2.internal): java.lang.NullPointerException
        at $line26.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
        at $line26.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
        at scala.collection.AbstractIterator.to(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
        at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
        at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

2015-01-21 19:00:22,270 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.1 in stage 5.0 (TID 36, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,271 INFO [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.0 in stage 5.0 (TID 34) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 1]
2015-01-21 19:00:22,272 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.1 in stage 5.0 (TID 37, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,283 INFO [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.1 in stage 5.0 (TID 36) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 2]
2015-01-21 19:00:22,285 INFO [sparkDriver-akka.actor.default-dispatcher-17] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.2 in stage 5.0 (TID 38, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,285 INFO [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.1 in stage 5.0 (TID 37) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 3]
2015-01-21 19:00:22,286 INFO [sparkDriver-akka.actor.default-dispatcher-14] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.2 in stage 5.0 (TID 39, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,298 INFO [task-result-getter-3] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.2 in stage 5.0 (TID 39) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 4]
2015-01-21 19:00:22,298 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 0.3 in stage 5.0 (TID 40, ip-172-31-60-198.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,299 INFO [task-result-getter-2] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.2 in stage 5.0 (TID 38) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 5]
2015-01-21 19:00:22,300 INFO [sparkDriver-akka.actor.default-dispatcher-16] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Starting task 1.3 in stage 5.0 (TID 41, ip-172-31-60-199.ec2.internal, PROCESS_LOCAL, 1369 bytes)
2015-01-21 19:00:22,313 INFO [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 0.3 in stage 5.0 (TID 40) on executor ip-172-31-60-198.ec2.internal: java.lang.NullPointerException (null) [duplicate 6]
2015-01-21 19:00:22,313 ERROR [task-result-getter-0] scheduler.TaskSetManager (Logging.scala:logError(75)) - Task 0 in stage 5.0 failed 4 times; aborting job
2015-01-21 19:00:22,314 INFO [task-result-getter-0] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Removed TaskSet 5.0, whose tasks have all completed, from pool
2015-01-21 19:00:22,314 INFO [task-result-getter-1] scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Lost task 1.3 in stage 5.0 (TID 41) on executor ip-172-31-60-199.ec2.internal: java.lang.NullPointerException (null) [duplicate 7]
2015-01-21 19:00:22,314 INFO [task-result-getter-1] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Removed TaskSet 5.0, whose tasks have all completed, from pool
2015-01-21 19:00:22,315 INFO [sparkDriver-akka.actor.default-dispatcher-17] cluster.YarnClientClusterScheduler (Logging.scala:logInfo(59)) - Cancelling stage 5
2015-01-21 19:00:22,315 INFO [main] scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Job 5 failed: collect at <console>:12, took 0.101138 s
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 5.0 failed 4 times, most recent failure: Lost task 0.3 in stage 5.0 (TID 40, ip-172-31-60-198.ec2.internal): java.lang.NullPointerException
        at $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
        at $iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:12)
        at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        at scala.collection.Iterator$class.foreach(Iterator.scala:727)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
        at scala.collection.AbstractIterator.to(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
        at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
        at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
        at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
        at org.apache.spark.rdd.RDD$$anonfun$16.apply(RDD.scala:780)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
        at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1314)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
        at org.apache.spark.scheduler.Task.run(Task.scala:56)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1214)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1203)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1202)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1202)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:696)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:696)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1420)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessActor.aroundReceive(DAGScheduler.scala:1375)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
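
Note on the failure pattern: the anonymous function defined at <console>:12 throws a NullPointerException on every task attempt, so the TaskSetManager retries each task four times and then aborts stage 5. The original REPL code is not part of this log; the sketch below is a minimal hypothetical spark-shell reproduction under that assumption, where the variable names and the null `lookup` map are invented purely to trigger an NPE inside a closure during a one-line union + map + collect.

// Hypothetical reproduction, NOT the original code. Assumes a spark-shell
// session, so `sc` (SparkContext) is already in scope.
import java.util.{Map => JMap}

// A driver-side reference that was never initialized; the closure below captures it.
val lookup: JMap[String, String] = null

val left  = sc.parallelize(Seq("a", "b"))
val right = sc.parallelize(Seq("c", "d"))

// union + map + collect on a single REPL line, mirroring "union at <console>:12"
// and "collect at <console>:12" in the log. Each executor deserializes the
// closure, calls lookup.get(k) on the null reference, and the task dies with a
// NullPointerException; after 4 failed attempts the stage is aborted.
(left union right).map(k => lookup.get(k)).collect()

A closure that dereferences a null field of the input records produces the same executor-side trace; wrapping the nullable reference in Option(...) or filtering out null inputs before the map avoids the abort.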