- sbt run
- [info] Loading global plugins from /Users/aironman/.sbt/1.0/plugins
- [info] Loading project definition from /Users/aironman/IdeaProjects/Chapter9/project
- [info] Loading settings for project chapter9 from build.sbt ...
- [info] Set current project to SparkJobs (in build file:/Users/aironman/IdeaProjects/Chapter9/)
- [info] Running chapter9.KafkaAndSparkStreaming
- WARNING: An illegal reflective access operation has occurred
- WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/sbt_c8ef36e1/target/94023850/spark-unsafe_2.11-2.4.1.jar) to method java.nio.Bits.unaligned()
- WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
- WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
- WARNING: All illegal access operations will be denied in a future release
- Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
- 19/04/03 17:55:56 INFO SparkContext: Running Spark version 2.4.1
- 19/04/03 17:55:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 19/04/03 17:55:56 INFO SparkContext: Submitted application: KafkaAndSparkStreaming
- 19/04/03 17:55:56 INFO SecurityManager: Changing view acls to: aironman
- 19/04/03 17:55:56 INFO SecurityManager: Changing modify acls to: aironman
- 19/04/03 17:55:56 INFO SecurityManager: Changing view acls groups to:
- 19/04/03 17:55:56 INFO SecurityManager: Changing modify acls groups to:
- 19/04/03 17:55:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aironman); groups with view permissions: Set(); users with modify permissions: Set(aironman); groups with modify permissions: Set()
- 19/04/03 17:55:57 INFO Utils: Successfully started service 'sparkDriver' on port 54536.
- 19/04/03 17:55:57 INFO SparkEnv: Registering MapOutputTracker
- 19/04/03 17:55:57 INFO SparkEnv: Registering BlockManagerMaster
- 19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
- 19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
- 19/04/03 17:55:57 INFO DiskBlockManager: Created local directory at /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/blockmgr-ad6d54f0-7cba-4d4a-b067-99ff9964c907
- 19/04/03 17:55:57 INFO MemoryStore: MemoryStore started with capacity 588.6 GB
- 19/04/03 17:55:57 INFO SparkEnv: Registering OutputCommitCoordinator
- 19/04/03 17:55:57 INFO Utils: Successfully started service 'SparkUI' on port 4040.
- 19/04/03 17:55:57 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.99.1:4040
- 19/04/03 17:55:57 INFO Executor: Starting executor ID driver on host localhost
- 19/04/03 17:55:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54537.
- 19/04/03 17:55:57 INFO NettyBlockTransferService: Server created on 192.168.99.1:54537
- 19/04/03 17:55:57 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
- 19/04/03 17:55:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.99.1, 54537, None)
- 19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.99.1:54537 with 588.6 GB RAM, BlockManagerId(driver, 192.168.99.1, 54537, None)
- 19/04/03 17:55:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.99.1, 54537, None)
- 19/04/03 17:55:57 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.99.1, 54537, None)
- [error] (run-main-0) java.lang.ExceptionInInitializerError
- [error] java.lang.ExceptionInInitializerError
- [error] at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
- [error] at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:57)
- [error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
- [error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
- [error] at chapter9.KafkaAndSparkStreaming$.main(KafkaAndSparkStreaming.scala:38)
- [error] at chapter9.KafkaAndSparkStreaming.main(KafkaAndSparkStreaming.scala)
- [error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- [error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- [error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- [error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
- [error] Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.8
- [error] at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
- [error] at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
- [error] at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
- [error] at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
- [error] at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
- [error] at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
- [error] at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:57)
- [error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
- [error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
- [error] at chapter9.KafkaAndSparkStreaming$.main(KafkaAndSparkStreaming.scala:38)
- [error] at chapter9.KafkaAndSparkStreaming.main(KafkaAndSparkStreaming.scala)
- [error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- [error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- [error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- [error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
- 19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-appStatus, stopping SparkContext
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-executorManagement, stopping SparkContext
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-shared, stopping SparkContext
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:88)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-appStatus
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 ERROR ContextCleaner: Error in cleaning thread
- java.lang.InterruptedException
- at java.base/java.lang.Object.wait(Native Method)
- at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
- at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
- at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)
- 19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-executorManagement
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-shared
- java.lang.InterruptedException
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
- at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
- at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:88)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
- at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
- at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
- at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
- 19/04/03 17:55:57 INFO SparkContext: SparkContext already stopped.
- 19/04/03 17:55:57 INFO SparkContext: SparkContext already stopped.
- 19/04/03 17:55:57 INFO SparkUI: Stopped Spark web UI at http://192.168.99.1:4040
- [error] Nonzero exit code: 1
- [error] (Compile / run) Nonzero exit code: 1
- [error] Total time: 5 s, completed 3 abr. 2019 17:55:57
- 19/04/03 17:55:57 INFO DiskBlockManager: Shutdown hook called
- 19/04/03 17:55:57 INFO ShutdownHookManager: Shutdown hook called
- 19/04/03 17:55:57 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-5e1730c4-aaa5-4a82-ae61-d6196ae61482/userFiles-1ba808cc-f461-47c4-ae58-eb38345bb0b6
- 19/04/03 17:55:57 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-5e1730c4-aaa5-4a82-ae61-d6196ae61482
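The root cause is the line `Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.8`: the Scala Jackson module refuses to register itself against a `jackson-databind` whose minor version it was not built for. Spark 2.4.1 ships Jackson 2.6.x, so a newer 2.9.8 `jackson-databind` (pulled in transitively, likely by the Kafka client or another dependency) wins dependency resolution and breaks `RDDOperationScope`'s static initializer, which is why the failure surfaces inside `KafkaUtils.createDirectStream`. A common way to resolve this in sbt is to pin the Jackson artifacts to one consistent version via `dependencyOverrides`. The sketch below is an assumption about this project's `build.sbt` (the actual file is not shown in the log); 2.6.7 / 2.6.7.1 are the versions Spark 2.4.x itself depends on:

```scala
// build.sbt fragment (sketch) — force every Jackson artifact to the
// version family Spark 2.4.x was compiled against, so the Scala module's
// setupModule version check passes.
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core"   % "jackson-core"         % "2.6.7",
  "com.fasterxml.jackson.core"   % "jackson-databind"     % "2.6.7.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7.1"
)
```

After adding the overrides, `sbt evicted` or `sbt dependencyTree` (the latter via the sbt-dependency-graph plugin on sbt 1.x) can confirm which dependency was dragging in 2.9.8 and that the override took effect; the alternative is to `exclude` the offending transitive Jackson artifacts on that dependency directly.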