log4j:WARN No appenders could be found for logger (com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
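[Note: the three log4j warnings above only mean that log4j found no configuration on the classpath; they are harmless, but they suppress the Dataflow SDK's own logging. A minimal log4j.properties on the classpath would silence them — this is the standard log4j 1.2 console-appender boilerplate, not something taken from this paste:

    # Route everything to stdout at INFO, roughly matching Spark's default pattern
    log4j.rootLogger=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.out
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
]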
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/03/14 13:13:59 INFO SparkContext: Running Spark version 1.6.1
16/03/14 13:14:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/03/14 13:14:00 INFO SecurityManager: Changing view acls to: snakanda
16/03/14 13:14:00 INFO SecurityManager: Changing modify acls to: snakanda
16/03/14 13:14:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(snakanda); users with modify permissions: Set(snakanda)
16/03/14 13:14:00 INFO Utils: Successfully started service 'sparkDriver' on port 50003.
16/03/14 13:14:00 INFO Slf4jLogger: Slf4jLogger started
16/03/14 13:14:00 INFO Remoting: Starting remoting
16/03/14 13:14:01 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@156.56.179.144:50004]
16/03/14 13:14:01 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 50004.
16/03/14 13:14:01 INFO SparkEnv: Registering MapOutputTracker
16/03/14 13:14:01 INFO SparkEnv: Registering BlockManagerMaster
16/03/14 13:14:01 INFO DiskBlockManager: Created local directory at /private/var/folders/r8/3x9692kj7679zdbxtwly35qr0000gn/T/blockmgr-018f8f99-82e1-4ad5-ade5-6145affb43ce
16/03/14 13:14:01 INFO MemoryStore: MemoryStore started with capacity 1140.4 MB
16/03/14 13:14:01 INFO SparkEnv: Registering OutputCommitCoordinator
16/03/14 13:14:01 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/03/14 13:14:01 INFO SparkUI: Started SparkUI at http://156.56.179.144:4040
16/03/14 13:14:01 INFO Executor: Starting executor ID driver on host localhost
16/03/14 13:14:01 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50005.
16/03/14 13:14:01 INFO NettyBlockTransferService: Server created on 50005
16/03/14 13:14:01 INFO BlockManagerMaster: Trying to register BlockManager
16/03/14 13:14:01 INFO BlockManagerMasterEndpoint: Registering block manager localhost:50005 with 1140.4 MB RAM, BlockManagerId(driver, localhost, 50005)
16/03/14 13:14:01 INFO BlockManagerMaster: Registered BlockManager
16/03/14 13:14:01 INFO SparkPipelineRunner: Evaluating TextIO.Read
16/03/14 13:14:01 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 119.9 KB, free 119.9 KB)
16/03/14 13:14:02 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 12.8 KB, free 132.7 KB)
16/03/14 13:14:02 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:50005 (size: 12.8 KB, free: 1140.4 MB)
16/03/14 13:14:02 INFO SparkContext: Created broadcast 0 from textFile at TransformTranslator.java:393
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating AnonymousParDo
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating Window.Into()
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating ParDo(ReifyTimestampAndWindows)
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating GroupByKey.GroupByKeyOnly
16/03/14 13:14:02 INFO FileInputFormat: Total input paths to process : 1
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating AnonymousParDo
16/03/14 13:14:02 INFO SparkPipelineRunner: Evaluating ParDo(GroupAlsoByWindowsViaIterators)
16/03/14 13:14:02 INFO SparkContext: Starting job: count at EvaluationContext.java:201
16/03/14 13:14:02 INFO DAGScheduler: Registering RDD 5 (mapToPair at TransformTranslator.java:132)
16/03/14 13:14:02 INFO DAGScheduler: Got job 0 (count at EvaluationContext.java:201) with 2 output partitions
16/03/14 13:14:02 INFO DAGScheduler: Final stage: ResultStage 1 (count at EvaluationContext.java:201)
16/03/14 13:14:02 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
16/03/14 13:14:02 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
16/03/14 13:14:02 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[5] at mapToPair at TransformTranslator.java:132), which has no missing parents
16/03/14 13:14:02 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 224.8 KB, free 357.5 KB)
16/03/14 13:14:02 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 118.5 KB, free 476.0 KB)
16/03/14 13:14:02 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:50005 (size: 118.5 KB, free: 1140.2 MB)
16/03/14 13:14:02 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
16/03/14 13:14:02 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[5] at mapToPair at TransformTranslator.java:132)
16/03/14 13:14:02 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
16/03/14 13:14:02 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0,PROCESS_LOCAL, 2148 bytes)
16/03/14 13:14:02 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/03/14 13:14:02 INFO HadoopRDD: Input split: file:/Users/supun/Work/stock-analysis/inputFiles/2004.csv:0+33554432
16/03/14 13:14:02 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
16/03/14 13:14:02 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
16/03/14 13:14:02 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
16/03/14 13:14:02 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
16/03/14 13:14:02 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
16/03/14 13:14:02 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
	at com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow$IntervalWindowCoder.encode(IntervalWindow.java:171)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:599)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:540)
	at com.cloudera.dataflow.spark.CoderHelpers.toByteArray(CoderHelpers.java:48)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:134)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:131)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/03/14 13:14:02 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, partition 1,PROCESS_LOCAL, 2148 bytes)
16/03/14 13:14:02 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
16/03/14 13:14:02 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
	at com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow$IntervalWindowCoder.encode(IntervalWindow.java:171)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:599)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:540)
	at com.cloudera.dataflow.spark.CoderHelpers.toByteArray(CoderHelpers.java:48)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:134)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:131)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/03/14 13:14:02 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
16/03/14 13:14:02 INFO HadoopRDD: Input split: file:/Users/supun/Work/stock-analysis/inputFiles/2004.csv:33554432+27825150
16/03/14 13:14:02 INFO TaskSchedulerImpl: Cancelling stage 0
16/03/14 13:14:02 INFO Executor: Executor is trying to kill task 1.0 in stage 0.0 (TID 1)
16/03/14 13:14:02 INFO TaskSchedulerImpl: Stage 0 was cancelled
16/03/14 13:14:02 INFO DAGScheduler: ShuffleMapStage 0 (mapToPair at TransformTranslator.java:132) failed in 0.179 s
16/03/14 13:14:02 INFO DAGScheduler: Job 0 failed: count at EvaluationContext.java:201, took 0.275541 s
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
	at com.cloudera.dataflow.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:121)
	at edu.indiana.soic.ts.streaming.dataflow.StockAnalysisPipeline2.main(StockAnalysisPipeline2.java:120)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
	at com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow$IntervalWindowCoder.encode(IntervalWindow.java:171)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:599)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:540)
	at com.cloudera.dataflow.spark.CoderHelpers.toByteArray(CoderHelpers.java:48)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:134)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:131)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/03/14 13:14:02 INFO SparkContext: Invoking stop() from shutdown hook
16/03/14 13:14:02 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.ClassCastException: com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow
	at com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow$IntervalWindowCoder.encode(IntervalWindow.java:171)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:113)
	at com.google.cloud.dataflow.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:59)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:599)
	at com.google.cloud.dataflow.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:540)
	at com.cloudera.dataflow.spark.CoderHelpers.toByteArray(CoderHelpers.java:48)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:134)
	at com.cloudera.dataflow.spark.CoderHelpers$3.call(CoderHelpers.java:131)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
16/03/14 13:14:02 INFO TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1) on executor localhost: java.lang.ClassCastException (com.google.cloud.dataflow.sdk.transforms.windowing.GlobalWindow cannot be cast to com.google.cloud.dataflow.sdk.transforms.windowing.IntervalWindow) [duplicate 1]
16/03/14 13:14:02 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/03/14 13:14:02 INFO SparkUI: Stopped Spark web UI at http://156.56.179.144:4040
16/03/14 13:14:02 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/14 13:14:02 INFO MemoryStore: MemoryStore cleared
16/03/14 13:14:02 INFO BlockManager: BlockManager stopped
16/03/14 13:14:02 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/14 13:14:02 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/03/14 13:14:02 INFO SparkContext: Successfully stopped SparkContext
16/03/14 13:14:02 INFO ShutdownHookManager: Shutdown hook called
16/03/14 13:14:02 INFO ShutdownHookManager: Deleting directory /private/var/folders/r8/3x9692kj7679zdbxtwly35qr0000gn/T/spark-208493ac-43af-40ac-ae37-978bf1450f8a
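[Note: every task dies the same way. FullWindowedValueCoder is encoding the GroupByKey shuffle output with IntervalWindowCoder (chosen because the pipeline applied Window.Into(), visible above), but the elements being encoded are still in the GlobalWindow. One plausible reading is that window assignment never actually took effect on the elements before the shuffle, while the windowing strategy (and hence the coder) already said "interval windows". The sketch below is a hypothetical reconstruction of the failing pipeline's shape, not the author's StockAnalysisPipeline2 source: the file path and class names come from the trace, while the CSV layout, the DoFn body, and the one-day window size are assumptions. It shows the two things that must both happen before the GroupByKey for IntervalWindowCoder to be valid: elements need real timestamps, and Window.into() must be applied to them.

    import com.google.cloud.dataflow.sdk.Pipeline;
    import com.google.cloud.dataflow.sdk.io.TextIO;
    import com.google.cloud.dataflow.sdk.options.PipelineOptions;
    import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
    import com.google.cloud.dataflow.sdk.transforms.DoFn;
    import com.google.cloud.dataflow.sdk.transforms.GroupByKey;
    import com.google.cloud.dataflow.sdk.transforms.ParDo;
    import com.google.cloud.dataflow.sdk.transforms.windowing.FixedWindows;
    import com.google.cloud.dataflow.sdk.transforms.windowing.Window;
    import com.google.cloud.dataflow.sdk.values.KV;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class StockAnalysisSketch {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline p = Pipeline.create(options);

        p.apply(TextIO.Read.from("/Users/supun/Work/stock-analysis/inputFiles/2004.csv"))
         // Records read from a file carry no meaningful event timestamps. Unless
         // this ParDo emits them via outputWithTimestamp(), FixedWindows has
         // nothing to window on and elements can stay in the GlobalWindow.
         .apply(ParDo.of(new DoFn<String, KV<String, Double>>() {
           @Override
           public void processElement(ProcessContext c) {
             String[] f = c.element().split(",");        // assumed CSV layout
             Instant ts = Instant.parse(f[0]);           // assumed ISO date column
             c.outputWithTimestamp(KV.of(f[1], Double.parseDouble(f[2])), ts);
           }
         }))
         // Sets the windowing strategy, and with it IntervalWindowCoder for the
         // GroupByKey shuffle; the timestamps above are what make it hold.
         .apply(Window.<KV<String, Double>>into(FixedWindows.of(Duration.standardDays(1))))
         .apply(GroupByKey.<String, Double>create());

        p.run();
      }
    }

Even with timestamps assigned correctly, the spark-dataflow runner in this log (com.cloudera.dataflow.spark.SparkPipelineRunner) had only partial windowing support around this time, so the same pipeline may still fail on Spark while passing under the SDK's DirectPipelineRunner. Running it under DirectPipelineRunner first is a cheap way to tell a pipeline bug from a runner limitation.]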