aironman

sbt run

Apr 3rd, 2019
sbt run
[info] Loading global plugins from /Users/aironman/.sbt/1.0/plugins
[info] Loading project definition from /Users/aironman/IdeaProjects/Chapter9/project
[info] Loading settings for project chapter9 from build.sbt ...
[info] Set current project to SparkJobs (in build file:/Users/aironman/IdeaProjects/Chapter9/)
[info] Running chapter9.KafkaAndSparkStreaming
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/sbt_c8ef36e1/target/94023850/spark-unsafe_2.11-2.4.1.jar) to method java.nio.Bits.unaligned()
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
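These reflective-access warnings come from the JVM, not from the application: they appear when Spark 2.4.x (which targets Java 8) runs on JDK 9 or later. They are harmless here, but to surface or silence them you need the flag to reach a forked run JVM. A minimal build.sbt sketch, assuming sbt 1.x:

```scala
// build.sbt — fork the run so JVM options actually apply to the app,
// then opt in to the warning mode the JVM message above suggests.
// (Running on JDK 8 instead avoids the warnings entirely.)
fork in run := true
javaOptions in run += "--illegal-access=warn"
```

Note that `--illegal-access=warn` is only recognized by JDK 9+; on JDK 8 the flag does not exist and the warnings never occur.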
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/04/03 17:55:56 INFO SparkContext: Running Spark version 2.4.1
19/04/03 17:55:56 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/04/03 17:55:56 INFO SparkContext: Submitted application: KafkaAndSparkStreaming
19/04/03 17:55:56 INFO SecurityManager: Changing view acls to: aironman
19/04/03 17:55:56 INFO SecurityManager: Changing modify acls to: aironman
19/04/03 17:55:56 INFO SecurityManager: Changing view acls groups to:
19/04/03 17:55:56 INFO SecurityManager: Changing modify acls groups to:
19/04/03 17:55:56 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aironman); groups with view permissions: Set(); users with modify permissions: Set(aironman); groups with modify permissions: Set()
19/04/03 17:55:57 INFO Utils: Successfully started service 'sparkDriver' on port 54536.
19/04/03 17:55:57 INFO SparkEnv: Registering MapOutputTracker
19/04/03 17:55:57 INFO SparkEnv: Registering BlockManagerMaster
19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/04/03 17:55:57 INFO DiskBlockManager: Created local directory at /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/blockmgr-ad6d54f0-7cba-4d4a-b067-99ff9964c907
19/04/03 17:55:57 INFO MemoryStore: MemoryStore started with capacity 588.6 GB
19/04/03 17:55:57 INFO SparkEnv: Registering OutputCommitCoordinator
19/04/03 17:55:57 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/04/03 17:55:57 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.99.1:4040
19/04/03 17:55:57 INFO Executor: Starting executor ID driver on host localhost
19/04/03 17:55:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 54537.
19/04/03 17:55:57 INFO NettyBlockTransferService: Server created on 192.168.99.1:54537
19/04/03 17:55:57 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/04/03 17:55:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.99.1, 54537, None)
19/04/03 17:55:57 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.99.1:54537 with 588.6 GB RAM, BlockManagerId(driver, 192.168.99.1, 54537, None)
19/04/03 17:55:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.99.1, 54537, None)
19/04/03 17:55:57 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.99.1, 54537, None)
[error] (run-main-0) java.lang.ExceptionInInitializerError
[error] java.lang.ExceptionInInitializerError
[error] at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
[error] at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:57)
[error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
[error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
[error] at chapter9.KafkaAndSparkStreaming$.main(KafkaAndSparkStreaming.scala:38)
[error] at chapter9.KafkaAndSparkStreaming.main(KafkaAndSparkStreaming.scala)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[error] Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.8
[error] at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
[error] at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
[error] at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
[error] at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
[error] at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
[error] at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
[error] at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:57)
[error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
[error] at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
[error] at chapter9.KafkaAndSparkStreaming$.main(KafkaAndSparkStreaming.scala:38)
[error] at chapter9.KafkaAndSparkStreaming.main(KafkaAndSparkStreaming.scala)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
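The root cause is the "Caused by" line above: `jackson-module-scala` refuses to register against a mismatched `jackson-databind`. Some dependency on the classpath is pulling in Jackson 2.9.8, while Spark 2.4.1 was built against the Jackson 2.6.7 line. One way to fix it is to force sbt to resolve the Jackson modules to a single consistent version. A sketch for build.sbt, assuming the 2.6.7.x versions Spark 2.4.x ships with (verify the exact versions in your own dependency tree, e.g. with the sbt-dependency-graph plugin):

```scala
// build.sbt — pin all Jackson artifacts to the version family Spark 2.4.1
// expects, so jackson-module-scala and jackson-databind agree.
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core"   %  "jackson-core"          % "2.6.7",
  "com.fasterxml.jackson.core"   %  "jackson-databind"      % "2.6.7.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala"  % "2.6.7.1"
)
```

The alternative direction also works: upgrade everything (including `jackson-module-scala`) to a matching 2.9.x set. The essential invariant is that core, databind, and the Scala module share one version family.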
19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-appStatus, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-executorManagement, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 ERROR Utils: uncaught error in thread spark-listener-group-shared, stopping SparkContext
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:88)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-appStatus
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
at java.base/java.lang.Object.wait(Native Method)
at java.base/java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:155)
at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:181)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:178)
at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:73)
19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-executorManagement
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:97)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 ERROR Utils: throw uncaught fatal error in thread spark-listener-group-shared
java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056)
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2090)
at java.base/java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:433)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply$mcJ$sp(AsyncEventQueue.scala:88)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anonfun$org$apache$spark$scheduler$AsyncEventQueue$$dispatch$1.apply(AsyncEventQueue.scala:87)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:87)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1$$anonfun$run$1.apply$mcV$sp(AsyncEventQueue.scala:83)
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1302)
at org.apache.spark.scheduler.AsyncEventQueue$$anon$1.run(AsyncEventQueue.scala:82)
19/04/03 17:55:57 INFO SparkContext: SparkContext already stopped.
19/04/03 17:55:57 INFO SparkContext: SparkContext already stopped.
19/04/03 17:55:57 INFO SparkUI: Stopped Spark web UI at http://192.168.99.1:4040
[error] Nonzero exit code: 1
[error] (Compile / run) Nonzero exit code: 1
[error] Total time: 5 s, completed 3 abr. 2019 17:55:57
19/04/03 17:55:57 INFO DiskBlockManager: Shutdown hook called
19/04/03 17:55:57 INFO ShutdownHookManager: Shutdown hook called
19/04/03 17:55:57 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-5e1730c4-aaa5-4a82-ae61-d6196ae61482/userFiles-1ba808cc-f461-47c4-ae58-eb38345bb0b6
19/04/03 17:55:57 INFO ShutdownHookManager: Deleting directory /private/var/folders/gn/pzkybyfd2g5bpyh47q0pp5nc0000gn/T/spark-5e1730c4-aaa5-4a82-ae61-d6196ae61482