- bash# spark-shell -i test.scala --master yarn-client
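The command as captured shows only -i and --master, yet the Ivy log that follows resolves three extra packages, so the shell was presumably also launched with --packages. A plausible reconstruction of the full invocation, with the coordinates taken from the resolution output below:

    spark-shell -i test.scala --master yarn-client \
        --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.0,org.elasticsearch:elasticsearch-spark_2.10:2.2.0,org.json4s:json4s-native_2.10:3.2.11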
- Ivy Default Cache set to: /root/.ivy2/cache
- The jars for the packages stored in: /root/.ivy2/jars
- :: loading settings :: url = jar:file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar!/org/apache/ivy/core/settings/ivysettings.xml
- org.apache.spark#spark-streaming-kafka_2.10 added as a dependency
- org.elasticsearch#elasticsearch-spark_2.10 added as a dependency
- org.json4s#json4s-native_2.10 added as a dependency
- :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
- confs: [default]
- found org.apache.spark#spark-streaming-kafka_2.10;1.6.0 in central
- found org.apache.kafka#kafka_2.10;0.8.2.1 in central
- found com.yammer.metrics#metrics-core;2.2.0 in central
- found org.slf4j#slf4j-api;1.7.10 in central
- found org.apache.kafka#kafka-clients;0.8.2.1 in central
- found net.jpountz.lz4#lz4;1.3.0 in central
- found org.xerial.snappy#snappy-java;1.1.2 in central
- found com.101tec#zkclient;0.3 in central
- found log4j#log4j;1.2.17 in central
- found org.spark-project.spark#unused;1.0.0 in central
- found org.elasticsearch#elasticsearch-spark_2.10;2.2.0 in central
- found org.json4s#json4s-native_2.10;3.2.11 in central
- found org.json4s#json4s-core_2.10;3.2.11 in central
- found org.json4s#json4s-ast_2.10;3.2.11 in central
- found com.thoughtworks.paranamer#paranamer;2.6 in list
- found org.scala-lang#scalap;2.10.0 in central
- found org.scala-lang#scala-compiler;2.10.0 in central
- found org.scala-lang#scala-reflect;2.10.0 in central
- :: resolution report :: resolve 514ms :: artifacts dl 14ms
- :: modules in use:
- com.101tec#zkclient;0.3 from central in [default]
- com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
- com.yammer.metrics#metrics-core;2.2.0 from central in [default]
- log4j#log4j;1.2.17 from central in [default]
- net.jpountz.lz4#lz4;1.3.0 from central in [default]
- org.apache.kafka#kafka-clients;0.8.2.1 from central in [default]
- org.apache.kafka#kafka_2.10;0.8.2.1 from central in [default]
- org.apache.spark#spark-streaming-kafka_2.10;1.6.0 from central in [default]
- org.elasticsearch#elasticsearch-spark_2.10;2.2.0 from central in [default]
- org.json4s#json4s-ast_2.10;3.2.11 from central in [default]
- org.json4s#json4s-core_2.10;3.2.11 from central in [default]
- org.json4s#json4s-native_2.10;3.2.11 from central in [default]
- org.scala-lang#scala-compiler;2.10.0 from central in [default]
- org.scala-lang#scala-reflect;2.10.0 from central in [default]
- org.scala-lang#scalap;2.10.0 from central in [default]
- org.slf4j#slf4j-api;1.7.10 from central in [default]
- org.spark-project.spark#unused;1.0.0 from central in [default]
- org.xerial.snappy#snappy-java;1.1.2 from central in [default]
- ---------------------------------------------------------------------
- |                  |            modules            ||   artifacts   |
- |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
- ---------------------------------------------------------------------
- |      default     |   18  |   0   |   0   |   0   ||   18  |   0   |
- ---------------------------------------------------------------------
- :: retrieving :: org.apache.spark#spark-submit-parent
- confs: [default]
- 0 artifacts copied, 18 already retrieved (0kB/13ms)
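For comparison, declaring the same three dependencies in an sbt build instead of resolving them at shell launch would look roughly like the sketch below; this is an equivalent, not something the log shows was done:

    libraryDependencies ++= Seq(
      "org.apache.spark"  % "spark-streaming-kafka_2.10" % "1.6.0",
      "org.elasticsearch" % "elasticsearch-spark_2.10"   % "2.2.0",
      "org.json4s"        % "json4s-native_2.10"         % "3.2.11"
    )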
- 16/04/04 23:21:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 16/04/04 23:21:55 INFO SecurityManager: Changing view acls to: root
- 16/04/04 23:21:55 INFO SecurityManager: Changing modify acls to: root
- 16/04/04 23:21:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
- 16/04/04 23:21:55 INFO HttpServer: Starting HTTP Server
- 16/04/04 23:21:55 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:21:55 INFO AbstractConnector: Started SocketConnector@0.0.0.0:45070
- 16/04/04 23:21:55 INFO Utils: Successfully started service 'HTTP class server' on port 45070.
- Welcome to
-       ____              __
-      / __/__  ___ _____/ /__
-     _\ \/ _ \/ _ `/ __/ '_/
-    /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
-       /_/
- Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
- Type in expressions to have them evaluated.
- Type :help for more information.
- 16/04/04 23:21:58 INFO SparkContext: Running Spark version 1.6.0
- 16/04/04 23:21:58 INFO SecurityManager: Changing view acls to: root
- 16/04/04 23:21:58 INFO SecurityManager: Changing modify acls to: root
- 16/04/04 23:21:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
- 16/04/04 23:21:58 INFO Utils: Successfully started service 'sparkDriver' on port 40352.
- 16/04/04 23:21:58 INFO Slf4jLogger: Slf4jLogger started
- 16/04/04 23:21:58 INFO Remoting: Starting remoting
- 16/04/04 23:21:58 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.62.232:58038]
- 16/04/04 23:21:58 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 58038.
- 16/04/04 23:21:58 INFO SparkEnv: Registering MapOutputTracker
- 16/04/04 23:21:58 INFO SparkEnv: Registering BlockManagerMaster
- 16/04/04 23:21:58 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d434d784-9295-44b1-a875-a53269c383fc
- 16/04/04 23:21:58 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
- 16/04/04 23:21:58 INFO SparkEnv: Registering OutputCommitCoordinator
- 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:21:58 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
- at $line3.$read$$iwC$$iwC.<init>(<console>:15)
- at $line3.$read$$iwC.<init>(<console>:24)
- at $line3.$read.<init>(<console>:26)
- at $line3.$read$.<init>(<console>:30)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.<init>(<console>:7)
- at $line3.$eval$.<clinit>(<console>)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
- at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
- at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
- at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
- at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
- at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:21:58 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@83e635f: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
- at $line3.$read$$iwC$$iwC.<init>(<console>:15)
- at $line3.$read$$iwC.<init>(<console>:24)
- at $line3.$read.<init>(<console>:26)
- at $line3.$read$.<init>(<console>:30)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.<init>(<console>:7)
- at $line3.$eval$.<clinit>(<console>)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
- at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
- at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
- at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
- at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
- at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
- at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
- 16/04/04 23:21:58 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
- 16/04/04 23:21:58 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
- 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:21:58 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
- 16/04/04 23:21:58 INFO Utils: Successfully started service 'SparkUI' on port 4041.
- 16/04/04 23:21:58 INFO SparkUI: Started SparkUI at http://192.168.62.232:4041
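The BindException above is benign on its own: another process on this host already owns the default UI port, so Spark logs the failure and probes the next port, landing on 4041. To steer this instead of relying on the retry loop, two SparkConf settings control it; a minimal sketch, assuming it runs before the SparkContext is created, with the port value chosen arbitrarily:

    val conf = new org.apache.spark.SparkConf()
      .set("spark.ui.port", "4050")        // first port to try for the web UI
      .set("spark.port.maxRetries", "32")  // how many successive ports to probe (default 16)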
- 16/04/04 23:21:58 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-cdb4637f-e6f8-4201-99c7-6e3794c3d88d
- 16/04/04 23:21:58 INFO HttpServer: Starting HTTP Server
- 16/04/04 23:21:58 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:21:58 INFO AbstractConnector: Started SocketConnector@0.0.0.0:54109
- 16/04/04 23:21:58 INFO Utils: Successfully started service 'HTTP file server' on port 54109.
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar at http://192.168.62.232:54109/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar with timestamp 1459804918951
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar at http://192.168.62.232:54109/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar with timestamp 1459804918952
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-native_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-native_2.10-3.2.11.jar with timestamp 1459804918953
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar at http://192.168.62.232:54109/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar with timestamp 1459804918962
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at http://192.168.62.232:54109/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1459804918962
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at http://192.168.62.232:54109/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1459804918962
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar at http://192.168.62.232:54109/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar with timestamp 1459804918963
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.101tec_zkclient-0.3.jar at http://192.168.62.232:54109/jars/com.101tec_zkclient-0.3.jar with timestamp 1459804918964
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.slf4j_slf4j-api-1.7.10.jar at http://192.168.62.232:54109/jars/org.slf4j_slf4j-api-1.7.10.jar with timestamp 1459804918964
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at http://192.168.62.232:54109/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1459804918965
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.jar at http://192.168.62.232:54109/jars/org.xerial.snappy_snappy-java-1.1.2.jar with timestamp 1459804918966
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/log4j_log4j-1.2.17.jar at http://192.168.62.232:54109/jars/log4j_log4j-1.2.17.jar with timestamp 1459804918968
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-core_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-core_2.10-3.2.11.jar with timestamp 1459804918969
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-ast_2.10-3.2.11.jar at http://192.168.62.232:54109/jars/org.json4s_json4s-ast_2.10-3.2.11.jar with timestamp 1459804918970
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.thoughtworks.paranamer_paranamer-2.6.jar at http://192.168.62.232:54109/jars/com.thoughtworks.paranamer_paranamer-2.6.jar with timestamp 1459804918970
- 16/04/04 23:21:58 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scalap-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scalap-2.10.0.jar with timestamp 1459804918972
- 16/04/04 23:21:59 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-compiler-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scala-compiler-2.10.0.jar with timestamp 1459804919003
- 16/04/04 23:21:59 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-reflect-2.10.0.jar at http://192.168.62.232:54109/jars/org.scala-lang_scala-reflect-2.10.0.jar with timestamp 1459804919010
- spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
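This warning means spark.yarn.driver.memoryOverhead only takes effect in yarn-cluster mode, where the driver lives inside a YARN container. In yarn-client mode the driver is the local JVM, and the container whose overhead you can size is the application master, via spark.yarn.am.memoryOverhead (its 384 MB default is visible in the AM allocation a few lines below). A one-line sketch, value chosen for illustration only:

    val conf = new org.apache.spark.SparkConf().set("spark.yarn.am.memoryOverhead", "384")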
- 16/04/04 23:21:59 INFO TimelineClientImpl: Timeline service address: http://compute01.mydomain:8188/ws/v1/timeline/
- 16/04/04 23:21:59 INFO RMProxy: Connecting to ResourceManager at compute01.mydomain/192.168.62.232:8050
- 16/04/04 23:22:00 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
- 16/04/04 23:22:00 INFO Client: Requesting a new application from cluster with 5 NodeManagers
- 16/04/04 23:22:00 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10240 MB per container)
- 16/04/04 23:22:00 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
- 16/04/04 23:22:00 INFO Client: Setting up container launch context for our AM
- 16/04/04 23:22:00 INFO Client: Setting up the launch environment for our AM container
- 16/04/04 23:22:00 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
- 16/04/04 23:22:00 INFO Client: Preparing resources for our AM container
- 16/04/04 23:22:00 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
- 16/04/04 23:22:00 INFO Client: Uploading resource file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0101/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar
- 16/04/04 23:22:02 INFO Client: Uploading resource file:/tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/__spark_conf__8928795395569657698.zip -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0101/__spark_conf__8928795395569657698.zip
- 16/04/04 23:22:02 INFO SecurityManager: Changing view acls to: root
- 16/04/04 23:22:02 INFO SecurityManager: Changing modify acls to: root
- 16/04/04 23:22:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
- 16/04/04 23:22:02 INFO Client: Submitting application 101 to ResourceManager
- 16/04/04 23:22:02 INFO YarnClientImpl: Submitted application application_1459404819578_0101
- 16/04/04 23:22:02 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1459404819578_0101 and attemptId None
- 16/04/04 23:22:03 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
- 16/04/04 23:22:03 INFO Client:
- client token: N/A
- diagnostics: N/A
- ApplicationMaster host: N/A
- ApplicationMaster RPC port: -1
- queue: default
- start time: 1459804922514
- final status: UNDEFINED
- tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0101/
- user: root
- 16/04/04 23:22:04 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
- 16/04/04 23:22:05 INFO Client: Application report for application_1459404819578_0101 (state: ACCEPTED)
- 16/04/04 23:22:06 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
- 16/04/04 23:22:06 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> compute01.mydomain, PROXY_URI_BASES -> http://compute01.mydomain:8088/proxy/application_1459404819578_0101), /proxy/application_1459404819578_0101
- 16/04/04 23:22:06 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
- 16/04/04 23:22:06 INFO Client: Application report for application_1459404819578_0101 (state: RUNNING)
- 16/04/04 23:22:06 INFO Client:
- client token: N/A
- diagnostics: N/A
- ApplicationMaster host: 192.168.62.216
- ApplicationMaster RPC port: 0
- queue: default
- start time: 1459804922514
- final status: UNDEFINED
- tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0101/
- user: root
- 16/04/04 23:22:06 INFO YarnClientSchedulerBackend: Application application_1459404819578_0101 has started running.
- 16/04/04 23:22:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35299.
- 16/04/04 23:22:06 INFO NettyBlockTransferService: Server created on 35299
- 16/04/04 23:22:06 INFO BlockManagerMaster: Trying to register BlockManager
- 16/04/04 23:22:06 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.232:35299 with 511.1 MB RAM, BlockManagerId(driver, 192.168.62.232, 35299)
- 16/04/04 23:22:06 INFO BlockManagerMaster: Registered BlockManager
- 16/04/04 23:22:06 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1459404819578_0101
- 16/04/04 23:22:09 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute02.mydomain:38367) with ID 1
- 16/04/04 23:22:09 INFO BlockManagerMasterEndpoint: Registering block manager compute02.mydomain:54410 with 511.1 MB RAM, BlockManagerId(1, compute02.mydomain, 54410)
- 16/04/04 23:22:10 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute04.mydomain:47907) with ID 2
- 16/04/04 23:22:10 INFO BlockManagerMasterEndpoint: Registering block manager compute04.mydomain:49073 with 511.1 MB RAM, BlockManagerId(2, compute04.mydomain, 49073)
- 16/04/04 23:22:10 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
- 16/04/04 23:22:10 INFO SparkILoop: Created spark context..
- Spark context available as sc.
- 16/04/04 23:22:11 INFO HiveContext: Initializing execution hive, version 1.2.1
- 16/04/04 23:22:11 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.0.0-169
- 16/04/04 23:22:11 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.0.0-169
- 16/04/04 23:22:11 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
- 16/04/04 23:22:11 INFO ObjectStore: ObjectStore, initialize called
- 16/04/04 23:22:11 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
- 16/04/04 23:22:11 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
- 16/04/04 23:22:11 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
- 16/04/04 23:22:12 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
- 16/04/04 23:22:12 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
- 16/04/04 23:22:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
- 16/04/04 23:22:13 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
- 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
- 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
- 16/04/04 23:22:14 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
- 16/04/04 23:22:14 INFO ObjectStore: Initialized ObjectStore
- 16/04/04 23:22:14 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
- 16/04/04 23:22:14 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
- 16/04/04 23:22:14 INFO HiveMetaStore: Added admin role in metastore
- 16/04/04 23:22:14 INFO HiveMetaStore: Added public role in metastore
- 16/04/04 23:22:14 INFO HiveMetaStore: No user is added in admin role, since config is empty
- 16/04/04 23:22:14 INFO HiveMetaStore: 0: get_all_databases
- 16/04/04 23:22:14 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_all_databases
- 16/04/04 23:22:14 INFO HiveMetaStore: 0: get_functions: db=default pat=*
- 16/04/04 23:22:14 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_functions: db=default pat=*
- 16/04/04 23:22:14 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
- 16/04/04 23:22:14 INFO SessionState: Created local directory: /tmp/8a1c78d8-7182-4b8d-a17b-a1193517d5ff_resources
- 16/04/04 23:22:14 INFO SessionState: Created HDFS directory: /tmp/hive/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff
- 16/04/04 23:22:14 INFO SessionState: Created local directory: /tmp/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff
- 16/04/04 23:22:14 INFO SessionState: Created HDFS directory: /tmp/hive/root/8a1c78d8-7182-4b8d-a17b-a1193517d5ff/_tmp_space.db
- 16/04/04 23:22:15 INFO HiveContext: default warehouse location is /user/hive/warehouse
- 16/04/04 23:22:15 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
- 16/04/04 23:22:15 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.4.0.0-169
- 16/04/04 23:22:15 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.4.0.0-169
- 16/04/04 23:22:15 INFO metastore: Trying to connect to metastore with URI thrift://compute01.mydomain:9083
- 16/04/04 23:22:15 INFO metastore: Connected to metastore.
- 16/04/04 23:22:15 INFO SessionState: Created local directory: /tmp/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04_resources
- 16/04/04 23:22:15 INFO SessionState: Created HDFS directory: /tmp/hive/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04
- 16/04/04 23:22:15 INFO SessionState: Created local directory: /tmp/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04
- 16/04/04 23:22:15 INFO SessionState: Created HDFS directory: /tmp/hive/root/b9e8d651-f8f5-4fc6-a3e3-b63c54be5f04/_tmp_space.db
- 16/04/04 23:22:15 INFO SparkILoop: Created sql context (with Hive support)..
- SQL context available as sqlContext.
- Loading test.scala...
- import org.apache.spark._
- import org.apache.spark.streaming._
- app: String = test-scala
- conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@59fe8d94
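From the REPL echoes above (the two imports, app, conf) and the stack frames further down (StreamingContext.<init> calling StreamingContext$.createNewSparkContext), test.scala evidently builds its own SparkConf and hands it to a StreamingContext, which constructs a second SparkContext alongside the sc the shell already created. A sketch of what the script likely contains; the batch interval is a placeholder, since the log never shows it:

    import org.apache.spark._
    import org.apache.spark.streaming._

    val app = "test-scala"
    val conf = new SparkConf().setAppName(app)
    val ssc = new StreamingContext(conf, Seconds(10))  // allocates a *new* SparkContext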
- 16/04/04 23:22:17 INFO SparkContext: Running Spark version 1.6.0
- 16/04/04 23:22:17 INFO SecurityManager: Changing view acls to: root
- 16/04/04 23:22:17 INFO SecurityManager: Changing modify acls to: root
- 16/04/04 23:22:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
- 16/04/04 23:22:17 INFO Utils: Successfully started service 'sparkDriver' on port 60455.
- 16/04/04 23:22:17 INFO Slf4jLogger: Slf4jLogger started
- 16/04/04 23:22:17 INFO Remoting: Starting remoting
- 16/04/04 23:22:17 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.62.232:38419]
- 16/04/04 23:22:17 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 38419.
- 16/04/04 23:22:17 INFO SparkEnv: Registering MapOutputTracker
- 16/04/04 23:22:17 INFO SparkEnv: Registering BlockManagerMaster
- 16/04/04 23:22:17 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-2a145cf9-d3f7-4aa7-a1b4-1c9544ae89b6
- 16/04/04 23:22:17 INFO MemoryStore: MemoryStore started with capacity 458.6 MB
- 16/04/04 23:22:17 INFO SparkEnv: Registering OutputCommitCoordinator
- 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
- at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
- at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
- at $line34.$read$$iwC$$iwC.<init>(<console>:58)
- at $line34.$read$$iwC.<init>(<console>:60)
- at $line34.$read.<init>(<console>:62)
- at $line34.$read$.<init>(<console>:66)
- at $line34.$read$.<clinit>(<console>)
- at $line34.$eval$.<init>(<console>:7)
- at $line34.$eval$.<clinit>(<console>)
- at $line34.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
- at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
- at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
- at scala.reflect.io.File.applyReader(File.scala:82)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
- at scala.collection.immutable.List.foreach(List.scala:318)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@196e494d: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
- at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
- at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
- at $line34.$read$$iwC$$iwC.<init>(<console>:58)
- at $line34.$read$$iwC.<init>(<console>:60)
- at $line34.$read.<init>(<console>:62)
- at $line34.$read$.<init>(<console>:66)
- at $line34.$read$.<clinit>(<console>)
- at $line34.$eval$.<init>(<console>:7)
- at $line34.$eval$.<clinit>(<console>)
- at $line34.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
- at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
- at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
- at scala.reflect.io.File.applyReader(File.scala:82)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
- at scala.collection.immutable.List.foreach(List.scala:318)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
- 16/04/04 23:22:17 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
- 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4041: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
- at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
- at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
- at $line34.$read$$iwC$$iwC.<init>(<console>:58)
- at $line34.$read$$iwC.<init>(<console>:60)
- at $line34.$read.<init>(<console>:62)
- at $line34.$read$.<init>(<console>:66)
- at $line34.$read$.<clinit>(<console>)
- at $line34.$eval$.<init>(<console>:7)
- at $line34.$eval$.<clinit>(<console>)
- at $line34.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
- at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
- at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
- at scala.reflect.io.File.applyReader(File.scala:82)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
- at scala.collection.immutable.List.foreach(List.scala:318)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:22:17 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@58c7bb01: java.net.BindException: Address already in use
- java.net.BindException: Address already in use
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
- at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
- at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
- at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.spark-project.jetty.server.Server.doStart(Server.java:293)
- at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
- at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
- at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1964)
- at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
- at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1955)
- at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
- at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
- at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
- at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
- at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
- at $line34.$read$$iwC$$iwC.<init>(<console>:58)
- at $line34.$read$$iwC.<init>(<console>:60)
- at $line34.$read.<init>(<console>:62)
- at $line34.$read$.<init>(<console>:66)
- at $line34.$read$.<clinit>(<console>)
- at $line34.$eval$.<init>(<console>:7)
- at $line34.$eval$.<clinit>(<console>)
- at $line34.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
- at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
- at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
- at scala.reflect.io.File.applyReader(File.scala:82)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
- at scala.collection.immutable.List.foreach(List.scala:318)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
- 16/04/04 23:22:17 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
- 16/04/04 23:22:17 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
- 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:22:17 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4042
- 16/04/04 23:22:17 INFO Utils: Successfully started service 'SparkUI' on port 4042.
- 16/04/04 23:22:17 INFO SparkUI: Started SparkUI at http://192.168.62.232:4042
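The probing above (4040, then 4041, then 4042) is Spark's normal fallback when the UI port is taken: it walks upward from spark.ui.port, one port per attempt, up to spark.port.maxRetries attempts. Here 4040 was already held by another process on the host and 4041 by the shell's first SparkContext (its UI is stopped at 4041 in the shutdown lines further down). A minimal sketch of pinning the UI to a known-free port instead, assuming 4050 is actually available on this host:

    import org.apache.spark.SparkConf
    // Pin the Spark UI to an explicit port instead of letting it probe
    // upward from the default 4040 (4050 is an assumed-free port, not
    // something taken from this log).
    val conf = new SparkConf().set("spark.ui.port", "4050")

The same setting can be passed at launch as --conf spark.ui.port=4050 on the spark-shell command line.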
- 16/04/04 23:22:17 INFO HttpFileServer: HTTP File server directory is /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-431dcfa2-e53c-4e9c-8177-e4d6c11d4e9d
- 16/04/04 23:22:17 INFO HttpServer: Starting HTTP Server
- 16/04/04 23:22:17 INFO Server: jetty-8.y.z-SNAPSHOT
- 16/04/04 23:22:17 INFO AbstractConnector: Started SocketConnector@0.0.0.0:53366
- 16/04/04 23:22:17 INFO Utils: Successfully started service 'HTTP file server' on port 53366.
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar at http://192.168.62.232:53366/jars/org.apache.spark_spark-streaming-kafka_2.10-1.6.0.jar with timestamp 1459804937765
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar at http://192.168.62.232:53366/jars/org.elasticsearch_elasticsearch-spark_2.10-2.2.0.jar with timestamp 1459804937766
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-native_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-native_2.10-3.2.11.jar with timestamp 1459804937766
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar at http://192.168.62.232:53366/jars/org.apache.kafka_kafka_2.10-0.8.2.1.jar with timestamp 1459804937774
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at http://192.168.62.232:53366/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1459804937775
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at http://192.168.62.232:53366/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1459804937775
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar at http://192.168.62.232:53366/jars/org.apache.kafka_kafka-clients-0.8.2.1.jar with timestamp 1459804937776
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.101tec_zkclient-0.3.jar at http://192.168.62.232:53366/jars/com.101tec_zkclient-0.3.jar with timestamp 1459804937776
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.slf4j_slf4j-api-1.7.10.jar at http://192.168.62.232:53366/jars/org.slf4j_slf4j-api-1.7.10.jar with timestamp 1459804937776
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at http://192.168.62.232:53366/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1459804937777
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.jar at http://192.168.62.232:53366/jars/org.xerial.snappy_snappy-java-1.1.2.jar with timestamp 1459804937779
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/log4j_log4j-1.2.17.jar at http://192.168.62.232:53366/jars/log4j_log4j-1.2.17.jar with timestamp 1459804937780
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-core_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-core_2.10-3.2.11.jar with timestamp 1459804937781
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.json4s_json4s-ast_2.10-3.2.11.jar at http://192.168.62.232:53366/jars/org.json4s_json4s-ast_2.10-3.2.11.jar with timestamp 1459804937781
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/com.thoughtworks.paranamer_paranamer-2.6.jar at http://192.168.62.232:53366/jars/com.thoughtworks.paranamer_paranamer-2.6.jar with timestamp 1459804937782
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scalap-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scalap-2.10.0.jar with timestamp 1459804937784
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-compiler-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scala-compiler-2.10.0.jar with timestamp 1459804937813
- 16/04/04 23:22:17 INFO SparkContext: Added JAR file:/root/.ivy2/jars/org.scala-lang_scala-reflect-2.10.0.jar at http://192.168.62.232:53366/jars/org.scala-lang_scala-reflect-2.10.0.jar with timestamp 1459804937819
- spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
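This line means the configured overhead is being ignored: in yarn-client mode the driver lives inside the spark-shell JVM, not in a YARN container, so spark.yarn.driver.memoryOverhead only applies to cluster-mode drivers. The 384 MB overhead reported for the AM container below comes from the client-mode default instead. A minimal sketch of setting the overhead key that does apply here, assuming 512 MB is the intended value:

    import org.apache.spark.SparkConf
    // In yarn-client mode the ApplicationMaster runs in its own container;
    // its overhead is governed by the AM-specific key, not the driver key.
    val conf = new SparkConf().set("spark.yarn.am.memoryOverhead", "512")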
- 16/04/04 23:22:17 INFO TimelineClientImpl: Timeline service address: http://compute01.mydomain:8188/ws/v1/timeline/
- 16/04/04 23:22:17 INFO RMProxy: Connecting to ResourceManager at compute01.mydomain/192.168.62.232:8050
- 16/04/04 23:22:17 INFO Client: Requesting a new application from cluster with 5 NodeManagers
- 16/04/04 23:22:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (10240 MB per container)
- 16/04/04 23:22:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
- 16/04/04 23:22:17 INFO Client: Setting up container launch context for our AM
- 16/04/04 23:22:17 INFO Client: Setting up the launch environment for our AM container
- 16/04/04 23:22:17 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
- 16/04/04 23:22:17 INFO Client: Preparing resources for our AM container
- 16/04/04 23:22:17 WARN Client: No spark assembly jar for HDP on HDFS, defaultSparkAssembly:hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
- 16/04/04 23:22:17 INFO Client: Uploading resource file:/opt/hosting/run/hdp/2.4.0.0-169/spark/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0102/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar
- 16/04/04 23:22:19 INFO Client: Uploading resource file:/tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/__spark_conf__7928248898142720439.zip -> hdfs://compute01.mydomain:8020/user/root/.sparkStaging/application_1459404819578_0102/__spark_conf__7928248898142720439.zip
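The two "No spark assembly jar for HDP on HDFS" warnings above mean the client could not find the assembly at the default HDFS location, so it falls back to uploading the full local assembly jar into .sparkStaging for this one application (the transfer logged between 23:22:17 and 23:22:19). On Spark 1.6 this re-upload can be avoided by staging the assembly once and pointing spark.yarn.jar at it; a minimal sketch, assuming the jar has been copied to the path the warning prints:

    import org.apache.spark.SparkConf
    // Reference an assembly already on HDFS so it is not shipped from the
    // local filesystem on every submission (path taken from the warning).
    val conf = new SparkConf().set("spark.yarn.jar",
      "hdfs://compute01.mydomain:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar")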
- 16/04/04 23:22:20 INFO SecurityManager: Changing view acls to: root
- 16/04/04 23:22:20 INFO SecurityManager: Changing modify acls to: root
- 16/04/04 23:22:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
- 16/04/04 23:22:20 INFO Client: Submitting application 102 to ResourceManager
- 16/04/04 23:22:20 INFO YarnClientImpl: Submitted application application_1459404819578_0102
- 16/04/04 23:22:20 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1459404819578_0102 and attemptId None
- 16/04/04 23:22:21 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
- 16/04/04 23:22:21 INFO Client:
- client token: N/A
- diagnostics: N/A
- ApplicationMaster host: N/A
- ApplicationMaster RPC port: -1
- queue: default
- start time: 1459804940026
- final status: UNDEFINED
- tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0102/
- user: root
- 16/04/04 23:22:22 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
- 16/04/04 23:22:23 INFO Client: Application report for application_1459404819578_0102 (state: ACCEPTED)
- 16/04/04 23:22:23 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
- 16/04/04 23:22:23 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> compute01.mydomain, PROXY_URI_BASES -> http://compute01.mydomain:8088/proxy/application_1459404819578_0102), /proxy/application_1459404819578_0102
- 16/04/04 23:22:23 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
- 16/04/04 23:22:24 INFO Client: Application report for application_1459404819578_0102 (state: RUNNING)
- 16/04/04 23:22:24 INFO Client:
- client token: N/A
- diagnostics: N/A
- ApplicationMaster host: 192.168.62.235
- ApplicationMaster RPC port: 0
- queue: default
- start time: 1459804940026
- final status: UNDEFINED
- tracking URL: http://compute01.mydomain:8088/proxy/application_1459404819578_0102/
- user: root
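Beyond these periodic reports, the same application can be checked from any cluster node with the stock YARN CLI, for example yarn application -status application_1459404819578_0102, or followed through the tracking URL printed above.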
- 16/04/04 23:22:24 INFO YarnClientSchedulerBackend: Application application_1459404819578_0102 has started running.
- 16/04/04 23:22:24 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56680.
- 16/04/04 23:22:24 INFO NettyBlockTransferService: Server created on 56680
- 16/04/04 23:22:24 INFO BlockManagerMaster: Trying to register BlockManager
- 16/04/04 23:22:24 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.232:56680 with 458.6 MB RAM, BlockManagerId(driver, 192.168.62.232, 56680)
- 16/04/04 23:22:24 INFO BlockManagerMaster: Registered BlockManager
- 16/04/04 23:22:24 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1459404819578_0102
- 16/04/04 23:22:26 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (compute02.mydomain:47398) with ID 1
- 16/04/04 23:22:26 INFO BlockManagerMasterEndpoint: Registering block manager compute02.mydomain:40262 with 511.1 MB RAM, BlockManagerId(1, compute02.mydomain, 40262)
- 16/04/04 23:22:27 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (192.168.62.201:36520) with ID 2
- 16/04/04 23:22:28 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.62.201:38977 with 511.1 MB RAM, BlockManagerId(2, 192.168.62.201, 38977)
- 16/04/04 23:22:28 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
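The 0.8 here is spark.scheduler.minRegisteredResourcesRatio: the fraction of requested executor resources that must register before the backend starts scheduling, which the two executor registrations just above satisfied. A minimal sketch of lowering the threshold for faster startup on a busy queue, assuming half of the executors are acceptable to begin with:

    import org.apache.spark.SparkConf
    // Begin scheduling once 50% of requested executors have registered;
    // the log above shows the YARN-mode default of 0.8.
    val conf = new SparkConf().set("spark.scheduler.minRegisteredResourcesRatio", "0.5")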
- 16/04/04 23:22:28 WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
- org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
- org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
- org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
- $line3.$read$$iwC$$iwC.<init>(<console>:15)
- $line3.$read$$iwC.<init>(<console>:24)
- $line3.$read.<init>(<console>:26)
- $line3.$read$.<init>(<console>:30)
- $line3.$read$.<clinit>(<console>)
- $line3.$eval$.<init>(<console>:7)
- $line3.$eval$.<clinit>(<console>)
- $line3.$eval.$print(<console>)
- sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- java.lang.reflect.Method.invoke(Method.java:497)
- org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
- at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
- at scala.Option.foreach(Option.scala:236)
- at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
- at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
- at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
- at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:50)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:52)
- at $line34.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:54)
- at $line34.$read$$iwC$$iwC$$iwC.<init>(<console>:56)
- at $line34.$read$$iwC$$iwC.<init>(<console>:58)
- at $line34.$read$$iwC.<init>(<console>:60)
- at $line34.$read.<init>(<console>:62)
- at $line34.$read$.<init>(<console>:66)
- at $line34.$read$.<clinit>(<console>)
- at $line34.$eval$.<init>(<console>:7)
- at $line34.$eval$.<clinit>(<console>)
- at $line34.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
- at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
- at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
- at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
- at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
- at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
- at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
- at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:680)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1$$anonfun$apply$mcV$sp$2.apply(SparkILoop.scala:677)
- at scala.reflect.io.Streamable$Chars$class.applyReader(Streamable.scala:104)
- at scala.reflect.io.File.applyReader(File.scala:82)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:677)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$savingReplayStack(SparkILoop.scala:162)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply$mcV$sp(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$interpretAllFrom$1.apply(SparkILoop.scala:676)
- at org.apache.spark.repl.SparkILoop.savingReader(SparkILoop.scala:167)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$interpretAllFrom(SparkILoop.scala:675)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:740)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadCommand$1.apply(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop.withFile(SparkILoop.scala:733)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadCommand(SparkILoop.scala:739)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at org.apache.spark.repl.SparkILoop$$anonfun$standardCommands$7.apply(SparkILoop.scala:344)
- at scala.tools.nsc.interpreter.LoopCommands$LineCmd.apply(LoopCommands.scala:81)
- at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:809)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:910)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$loadFiles$1.apply(SparkILoop.scala:908)
- at scala.collection.immutable.List.foreach(List.scala:318)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loadFiles(SparkILoop.scala:908)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:995)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
- at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
- at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
- at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
- at org.apache.spark.repl.Main$.main(Main.scala:31)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:497)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@6f6a3391
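The SPARK-2243 warning above fires because test.scala builds its StreamingContext from a fresh SparkConf, which constructs a second SparkContext in a JVM where spark-shell has already created one (the running context listed in the warning was created by SparkILoop.createSparkContext). The usual fix is to hand the REPL's existing context to the StreamingContext constructor; a minimal sketch, assuming a 10-second batch interval (the script's actual interval is not visible in this log):

    import org.apache.spark.streaming.{Seconds, StreamingContext}
    // Reuse the SparkContext the shell already provides as `sc` instead of
    // building a second one from a new SparkConf.
    val ssc = new StreamingContext(sc, Seconds(10))

Setting spark.driver.allowMultipleContexts=true, as the message suggests, only silences the check; it leaves two contexts competing in one JVM, which is also why a second SparkUI had to be bound (port 4042) and why the teardown below runs twice.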
- scala> Stopping spark context.
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static/sql,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/SQL,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
- 16/04/04 23:22:37 INFO SparkUI: Stopped Spark web UI at http://192.168.62.232:4041
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Shutting down all executors
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Interrupting monitor thread
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Asking each executor to shut down
- 16/04/04 23:22:37 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
- (serviceOption=None,
- services=List(),
- started=false)
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Stopped
- 16/04/04 23:22:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
- 16/04/04 23:22:37 INFO MemoryStore: MemoryStore cleared
- 16/04/04 23:22:37 INFO BlockManager: BlockManager stopped
- 16/04/04 23:22:37 INFO BlockManagerMaster: BlockManagerMaster stopped
- 16/04/04 23:22:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
- 16/04/04 23:22:37 INFO SparkContext: Successfully stopped SparkContext
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
- 16/04/04 23:22:37 INFO SparkContext: Invoking stop() from shutdown hook
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
- 16/04/04 23:22:37 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
- 16/04/04 23:22:37 INFO SparkUI: Stopped Spark web UI at http://192.168.62.232:4042
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Shutting down all executors
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Interrupting monitor thread
- 16/04/04 23:22:37 INFO YarnClientSchedulerBackend: Asking each executor to shut down
- 16/04/04 23:22:37 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
- (serviceOption=None,
- services=List(),
- started=false)
- 16/04/04 23:22:37 ERROR Utils: Uncaught exception in thread Thread-0
- org.apache.spark.SparkException: YarnSparkHadoopUtil is not available in non-YARN mode!
- at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.get(YarnSparkHadoopUtil.scala:241)
- at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:189)
- at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:446)
- at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1582)
- at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1731)
- at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1229)
- at org.apache.spark.SparkContext.stop(SparkContext.scala:1730)
- at org.apache.spark.SparkContext$$anonfun$3.apply$mcV$sp(SparkContext.scala:596)
- at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:267)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:239)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:239)
- at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1741)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:239)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
- at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:239)
- at scala.util.Try$.apply(Try.scala:161)
- at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:239)
- at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:218)
- at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
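This teardown error appears to be another consequence of the doubled contexts: the first SparkContext was stopped explicitly (the "Successfully stopped SparkContext" line above), clearing the YARN client-mode state, so when the shutdown hook then stops the second context its YarnClientSchedulerBackend can no longer obtain YarnSparkHadoopUtil. With a single context, stopping the streaming job once before exit avoids the race; a minimal sketch using the Spark 1.6 API, assuming ssc is the StreamingContext created above:

    // Stop the streaming computation and its underlying SparkContext
    // together, exactly once, instead of leaving it to the shutdown hook.
    ssc.stop(stopSparkContext = true, stopGracefully = true)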
- 16/04/04 23:22:37 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
- 16/04/04 23:22:37 INFO MemoryStore: MemoryStore cleared
- 16/04/04 23:22:37 INFO BlockManager: BlockManager stopped
- 16/04/04 23:22:37 INFO BlockManagerMaster: BlockManagerMaster stopped
- 16/04/04 23:22:37 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
- 16/04/04 23:22:37 INFO SparkContext: Successfully stopped SparkContext
- 16/04/04 23:22:37 INFO ShutdownHookManager: Shutdown hook called
- 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-431dcfa2-e53c-4e9c-8177-e4d6c11d4e9d
- 16/04/04 23:22:37 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
- 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-c20ba538-639e-49f2-a8ce-2257fd1ec2c9
- 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf/httpd-cdb4637f-e6f8-4201-99c7-6e3794c3d88d
- 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-0150e277-9e4d-4428-8edf-31a26ec138cf
- 16/04/04 23:22:37 INFO ShutdownHookManager: Deleting directory /tmp/spark-688f90c6-23e4-4676-a369-4cb1e70725a0