16/09/20 09:23:49 INFO SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
16/09/20 09:23:49 INFO DAGScheduler: Submitting 4 missing tasks from ShuffleMapStage 6 (MapPartitionsRDD[6] at repartition at mainBugTest.scala:54)
16/09/20 09:23:49 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks
16/09/20 09:23:49 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 11, localhost, partition 0,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 12, localhost, partition 1,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 13, localhost, partition 2,NODE_LOCAL, 2071 bytes)
16/09/20 09:23:49 INFO Executor: Running task 2.0 in stage 6.0 (TID 13)
16/09/20 09:23:49 INFO Executor: Running task 1.0 in stage 6.0 (TID 12)
16/09/20 09:23:49 INFO Executor: Running task 0.0 in stage 6.0 (TID 11)
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/09/20 09:23:49 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
16/09/20 09:23:50 INFO BlockManagerInfo: Removed broadcast_3_piece0 on localhost:33326 in memory (size: 1789.0 B, free: 1105.9 MB)
16/09/20 09:23:50 INFO ContextCleaner: Cleaned accumulator 4
16/09/20 09:23:50 INFO BlockManagerInfo: Removed broadcast_2_piece0 on localhost:33326 in memory (size: 1840.0 B, free: 1105.9 MB)
16/09/20 09:23:54 INFO ContextCleaner: Cleaned accumulator 3
16/09/20 09:23:54 INFO BlockManagerInfo: Removed broadcast_1_piece0 on localhost:33326 in memory (size: 1580.0 B, free: 1105.9 MB)
16/09/20 09:23:54 INFO ContextCleaner: Cleaned accumulator 2
16/09/20 09:24:06 INFO ExternalSorter: Thread 66 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:09 INFO ExternalSorter: Thread 65 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:09 INFO ExternalSorter: Thread 64 spilling in-memory map of 373.6 MB to disk (1 time so far)
16/09/20 09:24:17 ERROR Executor: Exception in task 2.0 in stage 6.0 (TID 13)
scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
16/09/20 09:24:17 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 14, localhost, partition 3,NODE_LOCAL, 2071 bytes)
16/09/20 09:24:17 WARN TaskSetManager: Lost task 2.0 in stage 6.0 (TID 13, localhost): scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
16/09/20 09:24:17 INFO Executor: Running task 3.0 in stage 6.0 (TID 14)
16/09/20 09:24:17 ERROR TaskSetManager: Task 2 in stage 6.0 failed 1 times; aborting job
16/09/20 09:24:17 INFO TaskSchedulerImpl: Cancelling stage 6
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 1.0 in stage 6.0 (TID 12)
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 3.0 in stage 6.0 (TID 14)
16/09/20 09:24:17 INFO TaskSchedulerImpl: Stage 6 was cancelled
16/09/20 09:24:17 INFO Executor: Executor is trying to kill task 0.0 in stage 6.0 (TID 11)
16/09/20 09:24:17 INFO DAGScheduler: ShuffleMapStage 6 (repartition at mainBugTest.scala:54) failed in 27.341 s
16/09/20 09:24:17 INFO DAGScheduler: Job 3 failed: take at mainBugTest.scala:65, took 27.360445 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 1 times, most recent failure: Lost task 2.0 in stage 6.0 (TID 13, localhost): scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
	at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1328)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
	at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
	at mainBugTest$.main(mainBugTest.scala:65)
	at mainBugTest.main(mainBugTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: scala.MatchError: ((4c5b5fc8-6eb9-40b1-8e6c-a81e4c9869ce07fe38fc-abf2-43b0-b618-ff898cffbad6,1.0),null) (of class scala.Tuple2)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at mainBugTest$$anonfun$mainBugTest$$mergeMaps$1$1.apply(mainBugTest.scala:41)
	at scala.collection.immutable.HashMap$$anon$2.apply(HashMap.scala:148)
	at scala.collection.immutable.HashMap$HashMap1.updated0(HashMap.scala:200)
	at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:322)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:463)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap$HashTrieMap.merge0(HashMap.scala:488)
	at scala.collection.immutable.HashMap.merged(HashMap.scala:117)
	at mainBugTest$.mainBugTest$$mergeMaps$1(mainBugTest.scala:41)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at mainBugTest$$anonfun$7.apply(mainBugTest.scala:62)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:416)
	at org.apache.spark.util.collection.ExternalSorter$$anon$3.next(ExternalSorter.scala:391)
	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:668)
	at org.apache.spark.util.collection.ExternalSorter$$anonfun$writePartitionedFile$2.apply(ExternalSorter.scala:667)
	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:667)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:72)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
- 16/09/20 09:24:17 INFO Executor: Executor killed task 0.0 in stage 6.0 (TID 11)
- 16/09/20 09:24:17 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 3 blocks
- 16/09/20 09:24:17 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 5 ms
- 16/09/20 09:24:17 INFO SparkContext: Invoking stop() from shutdown hook
- 16/09/20 09:24:17 WARN TaskSetManager: Lost task 0.0 in stage 6.0 (TID 11, localhost): TaskKilled (killed intentionally)
- 16/09/20 09:24:17 INFO Executor: Executor killed task 1.0 in stage 6.0 (TID 12)
- 16/09/20 09:24:17 WARN TaskSetManager: Lost task 1.0 in stage 6.0 (TID 12, localhost): TaskKilled (killed intentionally)
- 16/09/20 09:24:17 INFO Executor: Executor killed task 3.0 in stage 6.0 (TID 14)
- 16/09/20 09:24:17 WARN TaskSetManager: Lost task 3.0 in stage 6.0 (TID 14, localhost): TaskKilled (killed intentionally)
- 16/09/20 09:24:17 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
- 16/09/20 09:24:17 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
- 16/09/20 09:24:17 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
- 16/09/20 09:24:17 INFO MemoryStore: MemoryStore cleared
- 16/09/20 09:24:17 INFO BlockManager: BlockManager stopped
- 16/09/20 09:24:17 INFO BlockManagerMaster: BlockManagerMaster stopped
- 16/09/20 09:24:17 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
- 16/09/20 09:24:17 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
- 16/09/20 09:24:17 INFO SparkContext: Successfully stopped SparkContext
- 16/09/20 09:24:17 INFO ShutdownHookManager: Shutdown hook called
- 16/09/20 09:24:17 INFO ShutdownHookManager: Deleting directory /tmp/spark-70165989-f68f-428b-9c74-28608e2dc630
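Note on the failure: each stack trace shows scala.MatchError being thrown from inside HashMap.merged while Spark's ExternalSorter merges spilled combiners, and the offending value is ((key,1.0),null), i.e. the merge callback received null as one of its two arguments. The sketch below is a hypothetical reconstruction (the real mergeMaps at mainBugTest.scala:41 is not shown here; all names are assumed): a callback that destructures both arguments with a single case cannot handle a null partner, while a variant with explicit null cases can.

```scala
import scala.collection.immutable.HashMap

object MergeMapsSketch {
  // Assumed shape of the failing helper: the lone case throws
  // scala.MatchError(((k, v), null)) if merged ever invokes the callback
  // with null as one argument, which the stack trace shows happening
  // during ExternalSorter's spill merge.
  def mergeMapsUnsafe(a: HashMap[String, Double],
                      b: HashMap[String, Double]): HashMap[String, Double] =
    a.merged(b) { case ((k, v1), (_, v2)) => (k, v1 + v2) }

  // Defensive variant: treat a null partner as "no collision" and keep the
  // non-null pair; otherwise sum the colliding values as before.
  def mergeMapsSafe(a: HashMap[String, Double],
                    b: HashMap[String, Double]): HashMap[String, Double] =
    a.merged(b) {
      case (kv, null)         => kv
      case (null, kv)         => kv
      case ((k, v1), (_, v2)) => (k, v1 + v2)
    }

  def main(args: Array[String]): Unit = {
    val m = mergeMapsSafe(HashMap("a" -> 1.0), HashMap("a" -> 2.0, "b" -> 3.0))
    println(m("a")) // collision: 1.0 + 2.0
    println(m("b")) // only in b: callback not invoked
  }
}
```

An alternative that sidesteps merged entirely is to fold one map into the other with updatedWith-style logic, which never exposes the callback to library internals.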