Untitled — pasted by a guest, Sep 27th, 2016
16/09/20 09:23:49 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
16/09/20 09:23:49 INFO DAGScheduler: Submitting 4 missing tasks from ShuffleMapStage 6 (MapPartitionsRDD[6] at repartition at mainBugTest.scala:54)
16/09/20 09:23:49 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks
16/09/20 09:23:49 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 11, localhost, partition 0,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 12, localhost, partition 1,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 13, localhost, partition 2,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO Executor: Running task 2.0 in stage 6.0 (TID 13)
16/09/20 09:23:49 INFO Executor: Running task 1.0 in stage 6.0 (TID 12)
16/09/20 09:23:49 INFO Executor: Running task 0.0 in stage 6.0 (TID 11)
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
16/09/20 09:23:50 INFO BlockManagerInfo: Removed broadcast_3_piece0 on localhost:33326 in memory (size: 1789.0 B, free: 1105.9 MB)
16/09/20 09:23:50 INFO ContextCleaner: Cleaned accumulator 4
16/09/20 09:23:50 INFO BlockManagerInfo: Removed broadcast_2_piece0 on localhost:33326 in memory (size: 1840.0 B, free: 1105.9 MB)
16/09/20 09:23:54 INFO ContextCleaner: Cleaned accumulator 3
16/09/20 09:23:54 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:33326 in memory (size: 1580.0 B, free: 1105.9 MB)
16/09/20 09:23:54 INFO ContextCleaner: Cleaned accumulator 2
16/09/20 09:24:06 INFO ExternalSorter: Thread 66 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:09 INFO ExternalSorter: Thread 65 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:09 INFO ExternalSorter: Thread 64 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:17 ERROR Executor: Exception in task 2.0 in stage 6.0 (TID 13)
scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
16/09/20 09:24:17 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 14, localhost, partition 3,NODE_LOCAL, 2071 bytes)
16/09/20 09:24:17 WARN TaskSetManager: Lost task 2.0 in stage 6.0 (TID 13, localhost): scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

16/09/20 09:24:17 INFO Executor: Running task 3.0 in stage 6.0 (TID 14)
16/09/20 09:24:17 ERROR TaskSetManager: Task 2 in stage 6.0 failed 1 times; aborting job
16/09/20 09:24:17 INFO TaskSchedulerImpl: Cancelling stage 6
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 1.0 in stage 6.0 (TID 12)
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 3.0 in stage 6.0 (TID 14)
16/09/20 09:24:17 INFO TaskSchedulerImpl: Stage 6 was cancelled
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 0.0 in stage 6.0 (TID 11)
16/09/20 09:24:17 INFO DAGScheduler: ShuffleMapStage 6 (repartition at mainBugTest.scala:54) failed in 27.341 s
16/09/20 09:24:17 INFO DAGScheduler: Job 3 failed: take at mainBugTest.scala:65, took 27.360445 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 13, localhost): scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
	at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1328)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
	at mainBugTest$.main(mainBugTest.scala:65)
	at mainBugTest.main(mainBugTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
16/09/20 09:24:17 INFO Executor: Executor killed task 0.0 in stage 6.0 (TID 11)
16/09/20 09:24:17 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:24:17 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 5 ms
16/09/20 09:24:17 INFO SparkContext: Invoking stop() from shutdown hook
16/09/20 09:24:17 WARN TaskSetManager: Lost task 0.0 in stage 6.0 (TID 11, localhost): TaskKilled (killed intentionally)
16/09/20 09:24:17 INFO Executor: Executor killed task 1.0 in stage 6.0 (TID 12)
16/09/20 09:24:17 WARN TaskSetManager: Lost task 1.0 in stage 6.0 (TID 12, localhost): TaskKilled (killed intentionally)
16/09/20 09:24:17 INFO Executor: Executor killed task 3.0 in stage 6.0 (TID 14)
16/09/20 09:24:17 WARN TaskSetManager: Lost task 3.0 in stage 6.0 (TID 14, localhost): TaskKilled (killed intentionally)
16/09/20 09:24:17 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
16/09/20 09:24:17 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
16/09/20 09:24:17 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/09/20 09:24:17 INFO MemoryStore: MemoryStore cleared
16/09/20 09:24:17 INFO BlockManager: BlockManager stopped
16/09/20 09:24:17 INFO BlockManagerMaster: BlockManagerMaster stopped
16/09/20 09:24:17 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/09/20 09:24:17 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/20 09:24:17 INFO SparkContext: Successfully stopped SparkContext
16/09/20 09:24:17 INFO ShutdownHookManager: Shutdown hook called
16/09/20 09:24:17 INFO ShutdownHookManager: Deleting directory /tmp/spark-70165989-f68f-428b-9c74-28608e2dc630
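Reading the trace: the failure happens inside the user's mergeMaps helper (mainBugTest.scala:41), which evidently calls `scala.collection.immutable.HashMap.merged`. The source of mainBugTest is not included in the paste, but the MatchError value `((uuid,1.0),null)` shows that the merge function was invoked with `null` as its second key/value pair, which a pattern match written only for two non-null `(key, value)` tuples cannot handle. The timing (right after the three ExternalSorter spill messages) is consistent with the data having been serialized to disk and read back before merging. A hedged workaround, under the assumption that mergeMaps sums `Double` values per key, is to avoid `merged` entirely and fold one map into the other, which never exposes the merge function to `HashMap` internals:

```scala
import scala.collection.immutable.HashMap

// Hypothetical reconstruction -- the original mergeMaps at mainBugTest.scala:41
// is not shown in the paste; names and value semantics (summing Doubles per key)
// are assumptions for illustration.
object MergeMapsSketch {
  // Fold-based merge: iterates b's entries as plain (key, value) pairs, so the
  // combining logic is never handed the null tuple seen in the MatchError above.
  def mergeMaps(a: HashMap[String, Double],
                b: HashMap[String, Double]): HashMap[String, Double] =
    b.foldLeft(a) { case (acc, (k, v)) =>
      acc.updated(k, acc.getOrElse(k, 0.0) + v)
    }
}
```

If `merged` must be kept for performance, an alternative sketch is to make the merge function total (e.g. match `case (kv1, null) => kv1` before the two-tuple case) so an unexpected null degrades gracefully instead of aborting the stage.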