spark logs
a guest, Apr 4th, 2016
  1. bash# spark-shell -i test.scala --master yarn-client
  2. Ivy Default Cache set to: /root/.ivy2/cache
  3. The jars for the packages stored in: /root/.ivy2/jars
  4. :: loading settings :: url = jar:file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar!/org/apache/ivy/core/settings/ivysettings.xml
  5. org.apache.spark#spark-streaming-kafka_2.10 added as a dependency
  6. org.elasticsearch#elasticsearch-spark_2.10 added as a dependency
  7. org.json4s#json4s-native_2.10 added as a dependency
  8. :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
  9. confs: [default]
  10. found org.apache.spark#spark-streaming-kafka_2.10;1.6.0 in central
  11. found org.apache.kafka#kafka_2.10;0.8.2.1 in central
  12. found com.yammer.metrics#metrics-core;2.2.0 in central
  13. found org.slf4j#slf4j-api;1.7.10 in central
  14. found org.apache.kafka#kafka-clients;0.8.2.1 in central
  15. found net.jpountz.lz4#lz4;1.3.0 in central
  16. found org.xerial.snappy#snappy-java;1.1.2 in central
  17. found com.101tec#zkclient;0.3 in central
  18. found log4j#log4j;1.2.17 in central
  19. found org.spark-project.spark#unused;1.0.0 in central
  20. found org.elasticsearch#elasticsearch-spark_2.10;2.2.0 in central
  21. found org.json4s#json4s-native_2.10;3.2.11 in central
  22. found org.json4s#json4s-core_2.10;3.2.11 in central
  23. found org.json4s#json4s-ast_2.10;3.2.11 in central
  24. found com.thoughtworks.paranamer#paranamer;2.6 in list
  25. found org.scala-lang#scalap;2.10.0 in central
  26. found org.scala-lang#scala-compiler;2.10.0 in central
  27. found org.scala-lang#scala-reflect;2.10.0 in central
  28. :: resolution report :: resolve 514ms :: artifacts dl 14ms
  29. :: modules in use:
  30. com.101tec#zkclient;0.3 from central in [default]
  31. com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
  32. com.yammer.metrics#metrics-core;2.2.0 from central in [default]
  33. log4j#log4j;1.2.17 from central in [default]
  34. net.jpountz.lz4#lz4;1.3.0 from central in [default]
  35. org.apache.kafka#kafka-clients;0.8.2.1 from central in [default]
  36. org.apache.kafka#kafka_2.10;0.8.2.1 from central in [default]
  37. org.apache.spark#spark-streaming-kafka_2.10;1.6.0 from central in [default]
  38. org.elasticsearch#elasticsearch-spark_2.10;2.2.0 from central in [default]
  39. org.json4s#json4s-ast_2.10;3.2.11 from central in [default]
  40. org.json4s#json4s-core_2.10;3.2.11 from central in [default]
  41. org.json4s#json4s-native_2.10;3.2.11 from central in [default]
  42. org.scala-lang#scala-compiler;2.10.0 from central in [default]
  43. org.scala-lang#scala-reflect;2.10.0 from central in [default]
  44. org.scala-lang#scalap;2.10.0 from central in [default]
  45. org.slf4j#slf4j-api;1.7.10 from central in [default]
  46. org.spark-project.spark#unused;1.0.0 from central in [default]
  47. org.xerial.snappy#snappy-java;1.1.2 from central in [default]
  48. ---------------------------------------------------------------------
  49. |                  |            modules            ||   artifacts   |
  50. |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
  51. ---------------------------------------------------------------------
  52. |      default     |   18  |   0   |   0   |   0   ||   18  |   0   |
  53. ---------------------------------------------------------------------
  54. :: retrieving :: org.apache.spark#spark-submit-parent
  55. confs: [default]
  56. 0 artifacts copied, 18 already retrieved (0kB/13ms)
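
Note that the launch command at the top of the paste shows no --packages flag, yet the Ivy output above resolves three top-level dependencies, so the shell was evidently started with one. A plausible reconstruction, using the coordinates and versions from the resolution report (the flags actually used are not shown in the capture):

  spark-shell -i test.scala --master yarn-client \
    --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.0,org.elasticsearch:elasticsearch-spark_2.10:2.2.0,org.json4s:json4s-native_2.10:3.2.11
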
  57. 16/04/04 23:21:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  58. 16/04/04 23:21:55 INFO SecurityManager: Changing view acls to: root
  59. 16/04/04 23:21:55 INFO SecurityManager: Changing modify acls to: root
  60. 16/04/04 23:21:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
  61. 16/04/04 23:21:55 INFO HttpServer: Starting HTTP Server
  62. 16/04/04 23:21:55 INFO Server: jetty-8.y.z-SNAPSHOT
  63. 16/04/04 23:21:55 INFO AbstractConnector: Started SocketConnector@0.0.0.0:45070
  64. 16/04/04 23:21:55 INFO Utils: Successfully started service 'HTTP class server' on port 45070.
  65. Welcome to
  66.       ____              __
  67.      / __/__  ___ _____/ /__
  68.     _\ \/ _ \/ _ `/ __/ '_/
  69.    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
  70.       /_/
  71.  
  72. Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
  73. Type in expressions to have them evaluated.
  74. Type :help for more information.
  75. 16/04/04 23:21:58 INFO SparkContext: Running Spark version 1.6.0
  76. 16/04/04 23:21:58 INFO SecurityManager: Changing view acls to: root
  77. 16/04/04 23:21:58 INFO SecurityManager: Changing modify acls to: root
  78. 16/04/04 23:21:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
  79. 16/04/04 23:21:58 INFO Utils: Successfully started service 'sparkDriver' on port 40352.
  80. 16/04/04 23:21:58 INFO Slf4jLogger: Slf4jLogger started
  81. 16/04/04 23:21:58 INFO Remoting: Starting remoting
  82. 16/04/04 23:21:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.62.232:58038]
  83. 16/04/04 23:21:58 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 58038.
  84. 16/04/04 23:21:58 INFO SparkEnv: Registering MapOutputTracker
  85. 16/04/04 23:21:58 INFO SparkEnv: Registering BlockManagerMaster
  86. 16/04/04 23:21:58 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d434d784-9295-44b1-a875-a53269c383fc
  87. 16/04/04 23:21:58 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
  88. 16/04/04 23:21:58 INFO SparkEnv: Registering OutputCommitCoordinator
  89. 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
  90. 16/04/04 23:21:58 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
  91. java.net.BindException: Address already in use
  92. at sun.nio.ch.Net.bind0(Native Method)
  93. at sun.nio.ch.Net.bind(Net.java:433)
  94. at sun.nio.ch.Net.bind(Net.java:425)
  95. at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  96. at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  97. at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  98. at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  99. at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  100. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  101. at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  102. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  103. at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
  104. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  105. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  106. at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
  107. at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  108. at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
  109. at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
  110. at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
  111. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  112. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  113. at scala.Option.foreach(Option.scala:236)
  114. at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
  115. at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
  116. at $line3.$read$$iwC$$iwC.<init>(<console>:15)
  117. at $line3.$read$$iwC.<init>(<console>:24)
  118. at $line3.$read.<init>(<console>:26)
  119. at $line3.$read$.<init>(<console>:30)
  120. at $line3.$read$.<clinit>(<console>)
  121. at $line3.$eval$.<init>(<console>:7)
  122. at $line3.$eval$.<clinit>(<console>)
  123. at $line3.$eval.$print(<console>)
  124. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  125. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  126. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  127. at java.lang.reflect.Method.invoke(Method.java:497)
  128. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  129. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  130. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  131. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  132. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  133. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  134. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  135. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  136. at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
  137. at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
  138. at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
  139. at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
  140. at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
  141. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
  142. at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
  143. at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
  144. at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
  145. at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
  146. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
  147. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  148. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  149. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  150. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  151. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  152. at org.apache.spark.repl.Main$.main(Main.scala:31)
  153. at org.apache.spark.repl.Main.main(Main.scala)
  154. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  155. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  156. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  157. at java.lang.reflect.Method.invoke(Method.java:497)
  158. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  159. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  160. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  161. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  162. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  163. 16/04/04 23:21:58 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@83e635f: java.net.BindException: Address already in use
  164. java.net.BindException: Address already in use
  165. at sun.nio.ch.Net.bind0(Native Method)
  166. at sun.nio.ch.Net.bind(Net.java:433)
  167. at sun.nio.ch.Net.bind(Net.java:425)
  168. at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  169. at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  170. at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  171. at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  172. at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  173. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  174. at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  175. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  176. at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
  177. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  178. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  179. at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
  180. at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  181. at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
  182. at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
  183. at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
  184. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  185. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  186. at scala.Option.foreach(Option.scala:236)
  187. at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
  188. at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
  189. at $line3.$read$$iwC$$iwC.<init>(<console>:15)
  190. at $line3.$read$$iwC.<init>(<console>:24)
  191. at $line3.$read.<init>(<console>:26)
  192. at $line3.$read$.<init>(<console>:30)
  193. at $line3.$read$.<clinit>(<console>)
  194. at $line3.$eval$.<init>(<console>:7)
  195. at $line3.$eval$.<clinit>(<console>)
  196. at $line3.$eval.$print(<console>)
  197. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  198. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  199. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  200. at java.lang.reflect.Method.invoke(Method.java:497)
  201. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  202. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  203. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  204. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  205. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  206. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  207. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  208. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  209. at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
  210. at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
  211. at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
  212. at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
  213. at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
  214. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
  215. at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
  216. at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
  217. at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
  218. at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
  219. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
  220. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  221. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  222. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  223. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  224. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  225. at org.apache.spark.repl.Main$.main(Main.scala:31)
  226. at org.apache.spark.repl.Main.main(Main.scala)
  227. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  228. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  229. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  230. at java.lang.reflect.Method.invoke(Method.java:497)
  231. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  232. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  233. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  234. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  235. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  236. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
  237. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
  238. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
  239. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
  240. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
  241. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
  242. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
  243. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
  244. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
  245. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
  246. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
  247. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
  248. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
  249. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
  250. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
  251. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
  252. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
  253. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
  254. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
  255. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
  256. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
  257. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
  258. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
  259. 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
  260. 16/04/04 23:21:58 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
  261. 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
  262. 16/04/04 23:21:58 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
  263. 16/04/04 23:21:58 INFO Utils: Successfully started service 'SparkUI' on port 4041.
  264. 16/04/04 23:21:58 INFO SparkUI: Started SparkUI at http://192.168.62.232:4041
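
The two BindException stack traces above are recoverable noise: port 4040 was already held by another driver on this host, so Spark retried the next port and bound the UI on 4041, as the WARN/INFO pair above records. If the retries are unwanted, the UI port can be pinned to a known-free port at launch; a sketch, where 4050 is an arbitrary example value:

  spark-shell -i test.scala --master yarn-client --conf spark.ui.port=4050

spark.port.maxRetries (default 16) bounds how many successive ports are tried before startup fails.
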
  265. 16/04/04 23:21:58 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-cdb4637f-e6f8-4201-99c7-6e3794c3d88d
  266. 16/04/04 23:21:58 INFO HttpServer: Starting HTTP Server
  267. 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
  268. 16/04/04 23:21:58 INFO AbstractConnector: Started SocketConnector@0.0.0.0:54109
  269. 16/04/04 23:21:58 INFO Utils: Successfully started service 'HTTP file server' on port 54109.
  270. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar at http://192.168.62.232:54109/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar with timestamp 1459804918951
  271. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar at http://192.168.62.232:54109/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar with timestamp 1459804918952
  272. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-native_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-native_2.10-3.2.11.jar with timestamp 1459804918953
  273. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar at http://192.168.62.232:54109/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar with timestamp 1459804918962
  274. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at http://192.168.62.232:54109/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1459804918962
  275. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at http://192.168.62.232:54109/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1459804918962
  276. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar at http://192.168.62.232:54109/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar with timestamp 1459804918963
  277. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.101tec_zkclient-0.3.jar at http://192.168.62.232:54109/jars/com.101tec_zkclient-0.3.jar with timestamp 1459804918964
  278. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.slf4j_slf4j-api-1.7.10.jar at http://192.168.62.232:54109/jars/org.slf4j_slf4j-api-1.7.10.jar with timestamp 1459804918964
  279. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at http://192.168.62.232:54109/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1459804918965
  280. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.jar at http://192.168.62.232:54109/jars/org.xerial.snappy_snappy-java-1.1.2.jar with timestamp 1459804918966
  281. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/log4j_log4j-1.2.17.jar at http://192.168.62.232:54109/jars/log4j_log4j-1.2.17.jar with timestamp 1459804918968
  282. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-core_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-core_2.10-3.2.11.jar with timestamp 1459804918969
  283. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-ast_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-ast_2.10-3.2.11.jar with timestamp 1459804918970
  284. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.thoughtworks.paranamer_paranamer-2.6.jar at http://192.168.62.232:54109/jars/com.thoughtworks.paranamer_paranamer-2.6.jar with timestamp 1459804918970
  285. 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scalap-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scalap-2.10.0.jar with timestamp 1459804918972
  286. 16/04/04 23:21:59 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-compiler-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scala-compiler-2.10.0.jar with timestamp 1459804919003
  287. 16/04/04 23:21:59 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-reflect-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scala-reflect-2.10.0.jar with timestamp 1459804919010
  288. spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
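
In yarn-client mode the driver runs inside the local spark-shell JVM, so spark.yarn.driver.memoryOverhead has no effect, which is what the warning above flags. The analogous setting for the YARN application master in client mode is spark.yarn.am.memoryOverhead; a sketch with an illustrative value matching the 384 MB overhead reported for the AM allocation a few lines below:

  spark-shell -i test.scala --master yarn-client --conf spark.yarn.am.memoryOverhead=384
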
  289. 16/04/04 23:21:59 INFO TimelineClientImpl: Timeline service address: http://compute01.mydomain:8188/ws/v1/timeline/
  290. 16/04/04 23:21:59 INFO RMProxy: Connecting to ResourceManager at compute01.mydomain/192.168.62.232:8050
  291. 16/04/04 23:22:00 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
  292. 16/04/04 23:22:00 INFO Client: Requesting a new application from cluster with 5 NodeManagers
  293. 16/04/04 23:22:00 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10240 MB per container)
  294. 16/04/04 23:22:00 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
  295. 16/04/04 23:22:00 INFO Client: Setting up container launch context for our AM
  296. 16/04/04 23:22:00 INFO Client: Setting up the launch environment for our AM container
  297. 16/04/04 23:22:00 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
  298. 16/04/04 23:22:00 INFO Client: Preparing resources for our AM container
  299. 16/04/04 23:22:00 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
  300. 16/04/04 23:22:00 INFO Client: Uploading resource file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0101/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar
  301. 16/04/04 23:22:02 INFO Client: Uploading resource file:/tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/__spark_conf__8928795395569657698.zip -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0101/__spark_conf__8928795395569657698.zip
  302. 16/04/04 23:22:02 INFO SecurityManager: Changing view acls to: root
  303. 16/04/04 23:22:02 INFO SecurityManager: Changing modify acls to: root
  304. 16/04/04 23:22:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
  305. 16/04/04 23:22:02 INFO Client: Submitting application 101 to ResourceManager
  306. 16/04/04 23:22:02 INFO YarnClientImpl: Submitted application application_1459404819578_0101
  307. 16/04/04 23:22:02 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1459404819578_0101 and attemptId None
  308. 16/04/04 23:22:03 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
  309. 16/04/04 23:22:03 INFO Client:
  310. client token: N/A
  311. diagnostics: N/A
  312. ApplicationMaster host: N/A
  313. ApplicationMaster RPC port: -1
  314. queue: default
  315. start time: 1459804922514
  316. final status: UNDEFINED
  317. tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0101/
  318. user: root
  319. 16/04/04 23:22:04 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
  320. 16/04/04 23:22:05 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
  321. 16/04/04 23:22:06 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
  322. 16/04/04 23:22:06 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> compute01.mydomain, PROXY_URI_BASES -> http://compute01.mydomain:8088/proxy/application_1459404819578_0101), /proxy/application_1459404819578_0101
  323. 16/04/04 23:22:06 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
  324. 16/04/04 23:22:06 INFO Client: Application report for application_1459404819578_0101 (state: RUNNING)
  325. 16/04/04 23:22:06 INFO Client:
  326. client token: N/A
  327. diagnostics: N/A
  328. ApplicationMaster host: 192.168.62.216
  329. ApplicationMaster RPC port: 0
  330. queue: default
  331. start time: 1459804922514
  332. final status: UNDEFINED
  333. tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0101/
  334. user: root
  335. 16/04/04 23:22:06 INFO YarnClientSchedulerBackend: Application application_1459404819578_0101 has started running.
  336. 16/04/04 23:22:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35299.
  337. 16/04/04 23:22:06 INFO NettyBlockTransferService: Server created on 35299
  338. 16/04/04 23:22:06 INFO BlockManagerMaster: Trying to register BlockManager
  339. 16/04/04 23:22:06 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.232:35299 with 511.1 MB RAM, BlockManagerId(driver, 192.168.62.232, 35299)
  340. 16/04/04 23:22:06 INFO BlockManagerMaster: Registered BlockManager
  341. 16/04/04 23:22:06 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1459404819578_0101
  342. 16/04/04 23:22:09 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute02.mydomain:38367) with ID 1
  343. 16/04/04 23:22:09 INFO BlockManagerMasterEndpoint: Registering block manager compute02.mydomain:54410 with 511.1 MB RAM, BlockManagerId(1, compute02.mydomain, 54410)
  344. 16/04/04 23:22:10 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute04.mydomain:47907) with ID 2
  345. 16/04/04 23:22:10 INFO BlockManagerMasterEndpoint: Registering block manager compute04.mydomain:49073 with 511.1 MB RAM, BlockManagerId(2, compute04.mydomain, 49073)
  346. 16/04/04 23:22:10 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
  347. 16/04/04 23:22:10 INFO SparkILoop: Created spark context..
  348. Spark context available as sc.
  349. 16/04/04 23:22:11 INFO HiveContext: Initializing execution hive, version 1.2.1
  350. 16/04/04 23:22:11 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.0.0-169
  351. 16/04/04 23:22:11 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.0.0-169
  352. 16/04/04 23:22:11 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
  353. 16/04/04 23:22:11 INFO ObjectStore: ObjectStore, initialize called
  354. 16/04/04 23:22:11 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
  355. 16/04/04 23:22:11 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
  356. 16/04/04 23:22:11 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
  357. 16/04/04 23:22:12 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
  358. 16/04/04 23:22:12 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
  359. 16/04/04 23:22:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
  360. 16/04/04 23:22:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
  361. 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
  362. 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
  363. 16/04/04 23:22:14 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
  364. 16/04/04 23:22:14 INFO ObjectStore: Initialized ObjectStore
  365. 16/04/04 23:22:14 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
  366. 16/04/04 23:22:14 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
  367. 16/04/04 23:22:14 INFO HiveMetaStore: Added admin role in metastore
  368. 16/04/04 23:22:14 INFO HiveMetaStore: Added public role in metastore
  369. 16/04/04 23:22:14 INFO HiveMetaStore: No user is added in admin role, since config is empty
  370. 16/04/04 23:22:14 INFO HiveMetaStore: 0: get_all_databases
  371. 16/04/04 23:22:14 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_all_databases
  372. 16/04/04 23:22:14 INFO HiveMetaStore: 0: get_functions: db=default pat=*
  373. 16/04/04 23:22:14 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_functions: db=default pat=*
  374. 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
  375. 16/04/04 23:22:14 INFO SessionState: Created local directory: /tmp/8a1c78d8-7182-4b8d-a17b-a1193517d5ff_resources
  376. 16/04/04 23:22:14 INFO SessionState: Created HDFS directory: /tmp/hive/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff
  377. 16/04/04 23:22:14 INFO SessionState: Created local directory: /tmp/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff
  378. 16/04/04 23:22:14 INFO SessionState: Created HDFS directory: /tmp/hive/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff/_tmp_space.db
  379. 16/04/04 23:22:15 INFO HiveContext: default warehouse location is /user/hive/warehouse
  380. 16/04/04 23:22:15 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
  381. 16/04/04 23:22:15 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.0.0-169
  382. 16/04/04 23:22:15 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.0.0-169
  383. 16/04/04 23:22:15 INFO metastore: Trying to connect to metastore with URI thrift://compute01.mydomain:9083
  384. 16/04/04 23:22:15 INFO metastore: Connected to metastore.
  385. 16/04/04 23:22:15 INFO SessionState: Created local directory: /tmp/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04_resources
  386. 16/04/04 23:22:15 INFO SessionState: Created HDFS directory: /tmp/hive/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04
  387. 16/04/04 23:22:15 INFO SessionState: Created local directory: /tmp/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04
  388. 16/04/04 23:22:15 INFO SessionState: Created HDFS directory: /tmp/hive/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04/_tmp_space.db
  389. 16/04/04 23:22:15 INFO SparkILoop: Created sql context (with Hive support)..
  390. SQL context available as sqlContext.
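
At this point the shell exposes both contexts; per the lines above, sqlContext is a HiveContext wired to the metastore at thrift://compute01.mydomain:9083. A minimal usage sketch from the same session:

  sqlContext.sql("SHOW TABLES").show()
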
  391. Loading test.scala...
  392. import org.apache.spark._
  393. import org.apache.spark.streaming._
  394. app: String = test-scala
  395. conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@59fe8d94
  396. 16/04/04 23:22:17 INFO SparkContext: Running Spark version 1.6.0
  397. 16/04/04 23:22:17 INFO SecurityManager: Changing view acls to: root
  398. 16/04/04 23:22:17 INFO SecurityManager: Changing modify acls to: root
  399. 16/04/04 23:22:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
  400. 16/04/04 23:22:17 INFO Utils: Successfully started service 'sparkDriver' on port 60455.
  401. 16/04/04 23:22:17 INFO Slf4jLogger: Slf4jLogger started
  402. 16/04/04 23:22:17 INFO Remoting: Starting remoting
  403. 16/04/04 23:22:17 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.62.232:38419]
  404. 16/04/04 23:22:17 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 38419.
  405. 16/04/04 23:22:17 INFO SparkEnv: Registering MapOutputTracker
  406. 16/04/04 23:22:17 INFO SparkEnv: Registering BlockManagerMaster
  407. 16/04/04 23:22:17 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-2a145cf9-d3f7-4aa7-a1b4-1c9544ae89b6
  408. 16/04/04 23:22:17 INFO MemoryStore: MemoryStore started with capacity 458.6 MB
  409. 16/04/04 23:22:17 INFO SparkEnv: Registering OutputCommitCoordinator
  410. 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
  411. 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
  412. java.net.BindException: Address already in use
  413. at sun.nio.ch.Net.bind0(Native Method)
  414. at sun.nio.ch.Net.bind(Net.java:433)
  415. at sun.nio.ch.Net.bind(Net.java:425)
  416. at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  417. at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  418. at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  419. at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  420. at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  421. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  422. at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  423. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  424. at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
  425. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  426. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  427. at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
  428. at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  429. at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
  430. at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
  431. at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
  432. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  433. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  434. at scala.Option.foreach(Option.scala:236)
  435. at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
  436. at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
  437. at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
  438. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
  439. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
  440. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
  441. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
  442. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
  443. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
  444. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
  445. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
  446. at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
  447. at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
  448. at $line34.$read$$iwC$$iwC.<init>(<console>:58)
  449. at $line34.$read$$iwC.<init>(<console>:60)
  450. at $line34.$read.<init>(<console>:62)
  451. at $line34.$read$.<init>(<console>:66)
  452. at $line34.$read$.<clinit>(<console>)
  453. at $line34.$eval$.<init>(<console>:7)
  454. at $line34.$eval$.<clinit>(<console>)
  455. at $line34.$eval.$print(<console>)
  456. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  457. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  458. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  459. at java.lang.reflect.Method.invoke(Method.java:497)
  460. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  461. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  462. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  463. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  464. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  465. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  466. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  467. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  468. at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
  469. at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
  470. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
  471. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
  472. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
  473. at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
  474. at scala.reflect.io.File.applyReader(File.scala:82)
  475. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
  476. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  477. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  478. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
  479. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
  480. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  481. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  482. at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
  483. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
  484. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
  485. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
  486. at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
  487. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
  488. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  489. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  490. at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
  491. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
  492. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
  493. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
  494. at scala.collection.immutable.List.foreach(List.scala:318)
  495. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
  496. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
  497. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  498. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  499. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  500. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  501. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  502. at org.apache.spark.repl.Main$.main(Main.scala:31)
  503. at org.apache.spark.repl.Main.main(Main.scala)
  504. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  505. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  506. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  507. at java.lang.reflect.Method.invoke(Method.java:497)
  508. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  509. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  510. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  511. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  512. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  513. 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@196e494d: java.net.BindException: Address already in use
  514. java.net.BindException: Address already in use
  515. at sun.nio.ch.Net.bind0(Native Method)
  516. at sun.nio.ch.Net.bind(Net.java:433)
  517. at sun.nio.ch.Net.bind(Net.java:425)
  518. at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  519. at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  520. at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  521. at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  522. at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  523. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  524. at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  525. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  526. at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
  527. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  528. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  529. at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
  530. at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  531. at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
  532. at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
  533. at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
  534. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  535. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  536. at scala.Option.foreach(Option.scala:236)
  537. at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
  538. at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
  539. at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
  540. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
  541. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
  542. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
  543. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
  544. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
  545. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
  546. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
  547. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
  548. at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
  549. at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
  550. at $line34.$read$$iwC$$iwC.<init>(<console>:58)
  551. at $line34.$read$$iwC.<init>(<console>:60)
  552. at $line34.$read.<init>(<console>:62)
  553. at $line34.$read$.<init>(<console>:66)
  554. at $line34.$read$.<clinit>(<console>)
  555. at $line34.$eval$.<init>(<console>:7)
  556. at $line34.$eval$.<clinit>(<console>)
  557. at $line34.$eval.$print(<console>)
  558. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  559. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  560. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  561. at java.lang.reflect.Method.invoke(Method.java:497)
  562. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  563. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  564. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  565. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  566. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  567. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  568. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  569. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  570. at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
  571. at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
  572. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
  573. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
  574. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
  575. at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
  576. at scala.reflect.io.File.applyReader(File.scala:82)
  577. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
  578. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  579. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  580. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
  581. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
  582. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  583. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  584. at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
  585. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
  586. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
  587. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
  588. at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
  589. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
  590. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  591. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  592. at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
  593. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
  594. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
  595. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
  596. at scala.collection.immutable.List.foreach(List.scala:318)
  597. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
  598. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
  599. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  600. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  601. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  602. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  603. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  604. at org.apache.spark.repl.Main$.main(Main.scala:31)
  605. at org.apache.spark.repl.Main.main(Main.scala)
  606. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  607. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  608. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  609. at java.lang.reflect.Method.invoke(Method.java:497)
  610. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  611. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  612. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  613. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  614. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  615. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
  616. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
  617. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
  618. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
  619. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
  620. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
  621. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
  622. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
  623. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
  624. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
  625. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
  626. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
  627. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
  628. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
  629. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
  630. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
  631. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
  632. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
  633. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
  634. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
  635. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
  636. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
  637. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
  638. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
  639. 16/04/04 23:22:17 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
  640. 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
641. 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4041: java.net.BindException: Address already in use
  642. java.net.BindException: Address already in use
  643. at sun.nio.ch.Net.bind0(Native Method)
  644. at sun.nio.ch.Net.bind(Net.java:433)
  645. at sun.nio.ch.Net.bind(Net.java:425)
  646. at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  647. at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  648. at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  649. at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  650. at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  651. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  652. at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  653. at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  654. at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
  655. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  656. at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
  657. at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
  658. at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  659. at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
  660. at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
  661. at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
  662. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  663. at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
  664. at scala.Option.foreach(Option.scala:236)
  665. at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
  666. at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
  667. at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
  668. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
  669. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
  670. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
  671. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
  672. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
  673. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
  674. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
  675. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
  676. at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
  677. at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
  678. at $line34.$read$$iwC$$iwC.<init>(<console>:58)
  679. at $line34.$read$$iwC.<init>(<console>:60)
  680. at $line34.$read.<init>(<console>:62)
  681. at $line34.$read$.<init>(<console>:66)
  682. at $line34.$read$.<clinit>(<console>)
  683. at $line34.$eval$.<init>(<console>:7)
  684. at $line34.$eval$.<clinit>(<console>)
  685. at $line34.$eval.$print(<console>)
  686. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  687. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  688. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  689. at java.lang.reflect.Method.invoke(Method.java:497)
  690. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  691. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  692. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  693. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  694. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  695. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  696. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  697. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  698. at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
  699. at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
  700. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
  701. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
  702. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
  703. at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
  704. at scala.reflect.io.File.applyReader(File.scala:82)
  705. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
  706. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  707. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  708. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
  709. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
  710. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  711. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  712. at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
  713. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
  714. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
  715. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
  716. at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
  717. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
  718. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  719. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  720. at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
  721. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
  722. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
  723. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
  724. at scala.collection.immutable.List.foreach(List.scala:318)
  725. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
  726. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
  727. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  728. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  729. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  730. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  731. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  732. at org.apache.spark.repl.Main$.main(Main.scala:31)
  733. at org.apache.spark.repl.Main.main(Main.scala)
  734. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  735. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  736. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  737. at java.lang.reflect.Method.invoke(Method.java:497)
  738. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  739. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  740. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  741. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  742. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  743. 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@58c7bb01: java.net.BindException: Address already in use
  744. java.net.BindException: Address already in use
[... stack trace identical to the BindException above; duplicate frames elided ...]
  845. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
  846. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
  847. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
  848. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
  849. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
  850. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
  851. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
  852. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
  853. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
  854. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
  855. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
  856. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
  857. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
  858. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
  859. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
  860. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
  861. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
  862. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
  863. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
  864. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
  865. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
  866. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
  867. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
  868. 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
  869. 16/04/04 23:22:17 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
  870. 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
871. 16/04/04 23:22:17 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4042
  872. 16/04/04 23:22:17 INFO Utils: Successfully started service 'SparkUI' on port 4042.
  873. 16/04/04 23:22:17 INFO SparkUI: Started SparkUI at http://192.168.62.232:4042
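Port 4040 was apparently already held on this host (the log opens mid-way through that first bind failure) and 4041 by the REPL's own context, so this second UI landed on 4042: Spark walks upward from spark.ui.port, trying up to spark.port.maxRetries ports (default 16) before giving up. A minimal sketch for pinning the UI away from contended ports, assuming the rest of the invocation stays as above:

    bash# spark-shell -i test.scala --master yarn-client \
           --conf spark.ui.port=4050 \
           --conf spark.port.maxRetries=32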
  874. 16/04/04 23:22:17 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-431dcfa2-e53c-4e9c-8177-e4d6c11d4e9d
  875. 16/04/04 23:22:17 INFO HttpServer: Starting HTTP Server
  876. 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
877. 16/04/04 23:22:17 INFO AbstractConnector: Started SocketConnector@0.0.0.0:53366
  878. 16/04/04 23:22:17 INFO Utils: Successfully started service 'HTTP file server' on port 53366.
  879. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar at http://192.168.62.232:53366/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar with timestamp 1459804937765
  880. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar at http://192.168.62.232:53366/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar with timestamp 1459804937766
  881. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-native_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-native_2.10-3.2.11.jar with timestamp 1459804937766
  882. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar at http://192.168.62.232:53366/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar with timestamp 1459804937774
  883. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at http://192.168.62.232:53366/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1459804937775
  884. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at http://192.168.62.232:53366/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1459804937775
  885. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar at http://192.168.62.232:53366/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar with timestamp 1459804937776
  886. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.101tec_zkclient-0.3.jar at http://192.168.62.232:53366/jars/com.101tec_zkclient-0.3.jar with timestamp 1459804937776
  887. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.slf4j_slf4j-api-1.7.10.jar at http://192.168.62.232:53366/jars/org.slf4j_slf4j-api-1.7.10.jar with timestamp 1459804937776
  888. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at http://192.168.62.232:53366/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1459804937777
  889. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.jar at http://192.168.62.232:53366/jars/org.xerial.snappy_snappy-java-1.1.2.jar with timestamp 1459804937779
  890. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/log4j_log4j-1.2.17.jar at http://192.168.62.232:53366/jars/log4j_log4j-1.2.17.jar with timestamp 1459804937780
  891. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-core_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-core_2.10-3.2.11.jar with timestamp 1459804937781
  892. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-ast_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-ast_2.10-3.2.11.jar with timestamp 1459804937781
  893. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.thoughtworks.paranamer_paranamer-2.6.jar at http://192.168.62.232:53366/jars/com.thoughtworks.paranamer_paranamer-2.6.jar with timestamp 1459804937782
  894. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scalap-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scalap-2.10.0.jar with timestamp 1459804937784
  895. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-compiler-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scala-compiler-2.10.0.jar with timestamp 1459804937813
  896. 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-reflect-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scala-reflect-2.10.0.jar with timestamp 1459804937819
  897. spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
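In yarn-client mode the driver runs in the local spark-shell JVM, outside any YARN container, so spark.yarn.driver.memoryOverhead is ignored; the knob that applies to the client-mode ApplicationMaster is spark.yarn.am.memoryOverhead. A minimal sketch, with illustrative sizes only:

    bash# spark-shell -i test.scala --master yarn-client \
           --conf spark.yarn.am.memory=1g \
           --conf spark.yarn.am.memoryOverhead=512    # MB; value picked for illustration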
  898. 16/04/04 23:22:17 INFO TimelineClientImpl: Timeline service address: http://compute01.mydomain:8188/ws/v1/timeline/
  899. 16/04/04 23:22:17 INFO RMProxy: Connecting to ResourceManager at compute01.mydomain/192.168.62.232:8050
  900. 16/04/04 23:22:17 INFO Client: Requesting a new application from cluster with 5 NodeManagers
  901. 16/04/04 23:22:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10240 MB per container)
  902. 16/04/04 23:22:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
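The 896 MB figure is consistent with the 1.6 defaults: spark.yarn.am.memory defaults to 512 MB and the overhead to max(384 MB, 0.10 × AM memory):

    overhead     = max(384, 0.10 × 512) = 384 MB
    AM container = 512 + 384            = 896 MB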
  903. 16/04/04 23:22:17 INFO Client: Setting up container launch context for our AM
  904. 16/04/04 23:22:17 INFO Client: Setting up the launch environment for our AM container
  905. 16/04/04 23:22:17 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
  906. 16/04/04 23:22:17 INFO Client: Preparing resources for our AM container
  907. 16/04/04 23:22:17 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
  908. 16/04/04 23:22:17 INFO Client: Uploading resource file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0102/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar
  909. 16/04/04 23:22:19 INFO Client: Uploading resource file:/tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/__spark_conf__7928248898142720439.zip -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0102/__spark_conf__7928248898142720439.zip
  910. 16/04/04 23:22:20 INFO SecurityManager: Changing view acls to: root
  911. 16/04/04 23:22:20 INFO SecurityManager: Changing modify acls to: root
  912. 16/04/04 23:22:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
  913. 16/04/04 23:22:20 INFO Client: Submitting application 102 to ResourceManager
  914. 16/04/04 23:22:20 INFO YarnClientImpl: Submitted application application_1459404819578_0102
  915. 16/04/04 23:22:20 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1459404819578_0102 and attemptId None
  916. 16/04/04 23:22:21 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
  917. 16/04/04 23:22:21 INFO Client:
  918. client token: N/A
  919. diagnostics: N/A
  920. ApplicationMaster host: N/A
  921. ApplicationMaster RPC port: -1
  922. queue: default
  923. start time: 1459804940026
  924. final status: UNDEFINED
  925. tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0102/
  926. user: root
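While the report loops in ACCEPTED, the same information is available from the YARN CLI, which is handy once the shell session is gone; a usage sketch against the application id above:

    bash# yarn application -status application_1459404819578_0102
    bash# yarn logs -applicationId application_1459404819578_0102    # once the app finishes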
  927. 16/04/04 23:22:22 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
  928. 16/04/04 23:22:23 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
  929. 16/04/04 23:22:23 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
  930. 16/04/04 23:22:23 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> compute01.mydomain, PROXY_URI_BASES -> http://compute01.mydomain:8088/proxy/application_1459404819578_0102), /proxy/application_1459404819578_0102
  931. 16/04/04 23:22:23 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
  932. 16/04/04 23:22:24 INFO Client: Application report for application_1459404819578_0102 (state: RUNNING)
  933. 16/04/04 23:22:24 INFO Client:
  934. client token: N/A
  935. diagnostics: N/A
  936. ApplicationMaster host: 192.168.62.235
  937. ApplicationMaster RPC port: 0
  938. queue: default
  939. start time: 1459804940026
  940. final status: UNDEFINED
  941. tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0102/
  942. user: root
  943. 16/04/04 23:22:24 INFO YarnClientSchedulerBackend: Application application_1459404819578_0102 has started running.
  944. 16/04/04 23:22:24 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56680.
  945. 16/04/04 23:22:24 INFO NettyBlockTransferService: Server created on 56680
  946. 16/04/04 23:22:24 INFO BlockManagerMaster: Trying to register BlockManager
  947. 16/04/04 23:22:24 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.232:56680 with 458.6 MB RAM, BlockManagerId(driver, 192.168.62.232, 56680)
  948. 16/04/04 23:22:24 INFO BlockManagerMaster: Registered BlockManager
  949. 16/04/04 23:22:24 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1459404819578_0102
  950. 16/04/04 23:22:26 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute02.mydomain:47398) with ID 1
  951. 16/04/04 23:22:26 INFO BlockManagerMasterEndpoint: Registering block manager compute02.mydomain:40262 with 511.1 MB RAM, BlockManagerId(1, compute02.mydomain, 40262)
  952. 16/04/04 23:22:27 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (192.168.62.201:36520) with ID 2
  953. 16/04/04 23:22:28 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.201:38977 with 511.1 MB RAM, BlockManagerId(2, 192.168.62.201, 38977)
  954. 16/04/04 23:22:28 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
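Scheduling begins once 80% of the requested executors have registered; both the ratio and the wait cap are tunable, sketched here with the standard config names (values are illustrative):

    bash# spark-shell -i test.scala --master yarn-client \
           --conf spark.scheduler.minRegisteredResourcesRatio=1.0 \
           --conf spark.scheduler.maxRegisteredResourcesWaitingTime=60s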
  955. 16/04/04 23:22:28 WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
  956. org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
  957. org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
  958. org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
  959. $line3.$read$$iwC$$iwC.<init>(<console>:15)
  960. $line3.$read$$iwC.<init>(<console>:24)
  961. $line3.$read.<init>(<console>:26)
  962. $line3.$read$.<init>(<console>:30)
  963. $line3.$read$.<clinit>(<console>)
  964. $line3.$eval$.<init>(<console>:7)
  965. $line3.$eval$.<clinit>(<console>)
  966. $line3.$eval.$print(<console>)
  967. sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  968. sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  969. sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  970. java.lang.reflect.Method.invoke(Method.java:497)
  971. org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  972. org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  973. org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  974. org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  975. org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  976. org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  977. at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
  978. at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
  979. at scala.Option.foreach(Option.scala:236)
  980. at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
  981. at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
  982. at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
  983. at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
  984. at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
  985. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
  986. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
  987. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
  988. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
  989. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
  990. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
  991. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
  992. at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
  993. at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
  994. at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
  995. at $line34.$read$$iwC$$iwC.<init>(<console>:58)
  996. at $line34.$read$$iwC.<init>(<console>:60)
  997. at $line34.$read.<init>(<console>:62)
  998. at $line34.$read$.<init>(<console>:66)
  999. at $line34.$read$.<clinit>(<console>)
  1000. at $line34.$eval$.<init>(<console>:7)
  1001. at $line34.$eval$.<clinit>(<console>)
  1002. at $line34.$eval.$print(<console>)
  1003. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  1004. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  1005. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  1006. at java.lang.reflect.Method.invoke(Method.java:497)
  1007. at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
  1008. at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
  1009. at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
  1010. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
  1011. at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
  1012. at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
  1013. at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
  1014. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
  1015. at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
  1016. at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
  1017. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
  1018. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
  1019. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
  1020. at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
  1021. at scala.reflect.io.File.applyReader(File.scala:82)
  1022. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
  1023. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  1024. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
  1025. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
  1026. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
  1027. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  1028. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
  1029. at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
  1030. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
  1031. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
  1032. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
  1033. at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
  1034. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
  1035. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  1036. at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
  1037. at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
  1038. at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
  1039. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
  1040. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
  1041. at scala.collection.immutable.List.foreach(List.scala:318)
  1042. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
  1043. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
  1044. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  1045. at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
  1046. at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
  1047. at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
  1048. at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
  1049. at org.apache.spark.repl.Main$.main(Main.scala:31)
  1050. at org.apache.spark.repl.Main.main(Main.scala)
  1051. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  1052. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  1053. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  1054. at java.lang.reflect.Method.invoke(Method.java:497)
  1055. at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  1056. at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  1057. at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  1058. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  1059. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  1060. ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@6f6a3391
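The WARN/SparkException above is the real problem in this run: spark-shell had already created sc, and test.scala then built a StreamingContext from a fresh SparkConf, which tries to start a second SparkContext in the same JVM (SPARK-2243). Inside the REPL, the fix is to build the StreamingContext on top of the existing context rather than setting spark.driver.allowMultipleContexts; a minimal sketch, assuming a 5-second batch interval since test.scala is not shown:

    import org.apache.spark.streaming.{Seconds, StreamingContext}
    // reuse the REPL's existing SparkContext `sc` instead of a new SparkConf,
    // so no second context is created in this JVM
    val ssc = new StreamingContext(sc, Seconds(5))   // 5s batch interval: an assumption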
  1061.  
  1062.  
  1063. scala> Stopping spark context.
  1064. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
  1065. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
  1066. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
  1067. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
  1068. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
  1069. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
  1070. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
  1071. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
  1072. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
  1073. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
  1074. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
  1075. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
  1076. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
  1077. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
  1078. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
  1079. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
  1080. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
  1081. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
  1082. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
  1083. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
  1084. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
  1085. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
  1086. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
  1087. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
  1088. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
  1089. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
  1090. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
  1091. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
  1092. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
  1093. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
  1094. 16/04/04 23:22:37 INFO SparkUI: Stopped Spark web UI at http://192.168.62.232:4041
  1095. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Shutting down all executors
  1096. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Interrupting monitor thread
  1097. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Asking each executor to shut down
  1098. 16/04/04 23:22:37 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
  1099. (serviceOption=None,
  1100. services=List(),
  1101. started=false)
  1102. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Stopped
  1103. 16/04/04 23:22:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
  1104. 16/04/04 23:22:37 INFO MemoryStore: MemoryStore cleared
  1105. 16/04/04 23:22:37 INFO BlockManager: BlockManager stopped
  1106. 16/04/04 23:22:37 INFO BlockManagerMaster: BlockManagerMaster stopped
  1107. 16/04/04 23:22:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
  1108. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
  1109. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
  1110. 16/04/04 23:22:37 INFO SparkContext: Successfully stopped SparkContext
  1111. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
  1112. 16/04/04 23:22:37 INFO SparkContext: Invoking stop() from shutdown hook
  1113. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
  1114. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
  1115. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
  1116. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
  1117. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
  1118. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
  1119. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
  1120. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
  1121. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
  1122. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
  1123. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
  1124. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
  1125. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
  1126. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
  1127. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
  1128. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
  1129. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
  1130. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
  1131. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
  1132. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
  1133. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
  1134. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
  1135. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
  1136. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
  1137. 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
  1138. 16/04/04 23:22:37 INFO SparkUI: Stopped Spark web UI at http://192.168.62.232:4042
  1139. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Shutting down all executors
  1140. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Interrupting monitor thread
  1141. 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Asking each executor to shut down
  1142. 16/04/04 23:22:37 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
  1143. (serviceOption=None,
  1144. services=List(),
  1145. started=false)
  1146. 16/04/04 23:22:37 ERROR Utils: Uncaught exception in thread Thread-0
  1147. org.apache.spark.SparkException: YarnSparkHadoopUtil is not available in non-YARN mode!
  1148. at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.get(YarnSparkHadoopUtil.scala:241)
  1149. at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:189)
  1150. at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:446)
  1151. at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1582)
  1152. at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1731)
  1153. at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
  1154. at org.apache.spark.SparkContext.stop(SparkContext.scala:1730)
  1155. at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:596)
  1156. at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
  1157. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
  1158. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
  1159. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
  1160. at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1741)
  1161. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
  1162. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
  1163. at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
  1164. at scala.util.Try$.apply(Try.scala:161)
  1165. at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
  1166. at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
  1167. at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
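This ERROR is fallout from the duplicate contexts rather than a separate failure: the shutdown hook stops the second context after the first stop has already torn down the YARN-side state, hence the "non-YARN mode" complaint. With a single reused context (as sketched above) the clean exit is one explicit stop at the end of the script:

    // at the end of test.scala: stop streaming and the underlying context once,
    // instead of leaving two contexts to competing shutdown hooks
    ssc.stop(stopSparkContext = true)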
  1168. 16/04/04 23:22:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
  1169. 16/04/04 23:22:37 INFO MemoryStore: MemoryStore cleared
  1170. 16/04/04 23:22:37 INFO BlockManager: BlockManager stopped
  1171. 16/04/04 23:22:37 INFO BlockManagerMaster: BlockManagerMaster stopped
  1172. 16/04/04 23:22:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
  1173. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
  1174. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
  1175. 16/04/04 23:22:37 INFO SparkContext: Successfully stopped SparkContext
  1176. 16/04/04 23:22:37 INFO ShutdownHookManager: Shutdown hook called
  1177. 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-431dcfa2-e53c-4e9c-8177-e4d6c11d4e9d
  1178. 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
  1179. 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-c20ba538-639e-49f2-a8ce-2257fd1ec2c9
  1180. 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-cdb4637f-e6f8-4201-99c7-6e3794c3d88d
  1181. 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf
  1182. 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-688f90c6-23e4-4676-a369-4cb1e70725a0