- Launch Command: "/usr/lib/jvm/java-8-openjdk-amd64/bin/java" "-cp" "/home/unique/spark-2.1.0-bin-hadoop2.7/conf/:/home/unique/spark-2.1.0-bin-hadoop2.7/jars/*" "-Xmx1024M" "-Dspark.jars=file:/home/unique/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar" "-Dspark.driver.supervise=false" "-Dspark.app.name=org.apache.spark.examples.ml.JavaLDAExample" "-Dspark.submit.deployMode=cluster" "-Dspark.master=spark://master:7077" "org.apache.spark.deploy.worker.DriverWrapper" "spark://[email protected]:33951" "/home/unique/spark-2.1.0-bin-hadoop2.7/work/driver-20170130015037-0017/spark-examples_2.11-2.1.0.jar" "org.apache.spark.examples.ml.JavaLDAExample" "hdfs://master:9000/input/data/test.txt"
- ========================================
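This is the driver log for a Spark 2.1.0 standalone cluster-mode submission: the master schedules the driver onto a worker, which launches it through org.apache.spark.deploy.worker.DriverWrapper. Reading the flags back, the job was presumably submitted with something like the following (a reconstruction from -Dspark.master, -Dspark.submit.deployMode, and the trailing application argument; the exact client-side invocation is an assumption):

    ./bin/spark-submit \
      --class org.apache.spark.examples.ml.JavaLDAExample \
      --master spark://master:7077 \
      --deploy-mode cluster \
      /home/unique/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar \
      hdfs://master:9000/input/data/test.txt

Note that hdfs://master:9000/input/data/test.txt is passed as a program argument here; as the exception at the end of the log shows, the stock example never reads it.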
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
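The log4j warnings are harmless and unrelated to the failure below: the Hadoop metrics classes initialize their logger before Spark installs its default profile. They can be silenced by giving Spark an explicit configuration, typically by copying conf/log4j.properties.template to conf/log4j.properties under the Spark install. A minimal sketch of such a file (plain log4j 1.2 syntax, mirroring Spark's template rather than anything in this log):

    # Send everything at INFO and above to stderr, like Spark's default profile
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n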
17/01/30 01:50:38 INFO SecurityManager: Changing view acls to: unique
17/01/30 01:50:38 INFO SecurityManager: Changing modify acls to: unique
17/01/30 01:50:38 INFO SecurityManager: Changing view acls groups to:
17/01/30 01:50:38 INFO SecurityManager: Changing modify acls groups to:
17/01/30 01:50:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(unique); groups with view permissions: Set(); users with modify permissions: Set(unique); groups with modify permissions: Set()
17/01/30 01:50:39 INFO Utils: Successfully started service 'Driver' on port 33554.
17/01/30 01:50:39 INFO WorkerWatcher: Connecting to worker spark://Worker@10.221.55.8:33951
17/01/30 01:50:39 INFO TransportClientFactory: Successfully created connection to /10.221.55.8:33951 after 21 ms (0 ms spent in bootstraps)
17/01/30 01:50:39 INFO WorkerWatcher: Successfully connected to spark://Worker@10.221.55.8:33951
17/01/30 01:50:39 INFO SparkContext: Running Spark version 2.1.0
17/01/30 01:50:39 INFO SecurityManager: Changing view acls to: unique
17/01/30 01:50:39 INFO SecurityManager: Changing modify acls to: unique
17/01/30 01:50:39 INFO SecurityManager: Changing view acls groups to:
17/01/30 01:50:39 INFO SecurityManager: Changing modify acls groups to:
17/01/30 01:50:39 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(unique); groups with view permissions: Set(); users with modify permissions: Set(unique); groups with modify permissions: Set()
17/01/30 01:50:39 INFO Utils: Successfully started service 'sparkDriver' on port 40937.
17/01/30 01:50:39 INFO SparkEnv: Registering MapOutputTracker
17/01/30 01:50:39 INFO SparkEnv: Registering BlockManagerMaster
17/01/30 01:50:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/01/30 01:50:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/01/30 01:50:39 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-bcee56e1-2aa7-476c-bc2a-6a373cee3382
17/01/30 01:50:39 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/01/30 01:50:39 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/30 01:50:39 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/30 01:50:39 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.221.55.8:4040
17/01/30 01:50:39 INFO SparkContext: Added JAR file:/home/unique/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar at spark://10.221.55.8:40937/jars/spark-examples_2.11-2.1.0.jar with timestamp 1485728439575
17/01/30 01:50:39 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master:7077...
17/01/30 01:50:39 INFO TransportClientFactory: Successfully created connection to master/10.221.55.8:7077 after 1 ms (0 ms spent in bootstraps)
17/01/30 01:50:39 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20170130015039-0013
17/01/30 01:50:39 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20170130015039-0013/0 on worker-20170129221449-10.221.55.10-45483 (10.221.55.10:45483) with 8 cores
17/01/30 01:50:39 INFO StandaloneSchedulerBackend: Granted executor ID app-20170130015039-0013/0 on hostPort 10.221.55.10:45483 with 8 cores, 1024.0 MB RAM
17/01/30 01:50:39 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20170130015039-0013/1 on worker-20170129221444-10.221.55.8-33951 (10.221.55.8:33951) with 3 cores
17/01/30 01:50:39 INFO StandaloneSchedulerBackend: Granted executor ID app-20170130015039-0013/1 on hostPort 10.221.55.8:33951 with 3 cores, 1024.0 MB RAM
17/01/30 01:50:39 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43522.
17/01/30 01:50:39 INFO NettyBlockTransferService: Server created on 10.221.55.8:43522
17/01/30 01:50:39 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/01/30 01:50:39 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.221.55.8, 43522, None)
17/01/30 01:50:39 INFO BlockManagerMasterEndpoint: Registering block manager 10.221.55.8:43522 with 366.3 MB RAM, BlockManagerId(driver, 10.221.55.8, 43522, None)
17/01/30 01:50:39 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.221.55.8, 43522, None)
17/01/30 01:50:39 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.221.55.8, 43522, None)
17/01/30 01:50:39 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20170130015039-0013/0 is now RUNNING
17/01/30 01:50:39 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20170130015039-0013/1 is now RUNNING
17/01/30 01:50:40 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
17/01/30 01:50:40 INFO SharedState: Warehouse path is 'file:/home/unique/spark-2.1.0-bin-hadoop2.7/work/driver-20170130015037-0017/spark-warehouse'.
Exception in thread "main" java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
Caused by: org.apache.spark.sql.AnalysisException: Path does not exist: file:/home/unique/spark-2.1.0-bin-hadoop2.7/work/driver-20170130015037-0017/data/mllib/sample_lda_libsvm_data.txt;
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:382)
    at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:370)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.immutable.List.flatMap(List.scala:344)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:370)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:135)
    at org.apache.spark.examples.ml.JavaLDAExample.main(JavaLDAExample.java:45)
    ... 6 more
17/01/30 01:50:40 INFO SparkContext: Invoking stop() from shutdown hook
17/01/30 01:50:40 INFO SparkUI: Stopped Spark web UI at http://10.221.55.8:4040
17/01/30 01:50:40 INFO StandaloneSchedulerBackend: Shutting down all executors
17/01/30 01:50:40 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
17/01/30 01:50:40 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/30 01:50:40 INFO MemoryStore: MemoryStore cleared
17/01/30 01:50:40 INFO BlockManager: BlockManager stopped
17/01/30 01:50:40 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/30 01:50:40 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/30 01:50:40 INFO SparkContext: Successfully stopped SparkContext
17/01/30 01:50:40 INFO ShutdownHookManager: Shutdown hook called
17/01/30 01:50:40 INFO ShutdownHookManager: Deleting directory /tmp/spark-579cf554-ddc9-46d2-8703-27c43f4970f2
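Root cause: the run fails at JavaLDAExample.java:45, where the Spark 2.1.0 example hard-codes the relative path data/mllib/sample_lda_libsvm_data.txt and ignores its command-line arguments, so the hdfs://master:9000/input/data/test.txt argument is never used. In cluster deploy mode the driver runs inside a fresh work directory on a worker (here /home/unique/spark-2.1.0-bin-hadoop2.7/work/driver-20170130015037-0017), which is where the relative path resolves and where no data/mllib directory exists. Either copy Spark's data/ directory to the same location on every worker, or build a small variant of the example that takes the input path as an argument. A minimal sketch of the latter, assuming the input file is in LIBSVM format as the libsvm reader requires (the class name and fallback path are illustrative, not from this log):

    import org.apache.spark.ml.clustering.LDA;
    import org.apache.spark.ml.clustering.LDAModel;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class JavaLDAPathExample {
      public static void main(String[] args) {
        // Take the dataset location from the first program argument,
        // falling back to the stock example's relative path.
        String inputPath = args.length > 0
            ? args[0]
            : "data/mllib/sample_lda_libsvm_data.txt";

        SparkSession spark = SparkSession
            .builder()
            .appName("JavaLDAPathExample")
            .getOrCreate();

        // The "libsvm" source expects lines of "label index:value index:value ...".
        Dataset<Row> dataset = spark.read().format("libsvm").load(inputPath);

        // Same model settings as the stock Spark 2.1.0 example.
        LDA lda = new LDA().setK(10).setMaxIter(10);
        LDAModel model = lda.fit(dataset);

        System.out.println("Lower bound on the log likelihood: "
            + model.logLikelihood(dataset));
        System.out.println("Upper bound on perplexity: "
            + model.logPerplexity(dataset));

        spark.stop();
      }
    }

Submitted the same way as above but with this class, the first argument (e.g. hdfs://master:9000/input/data/test.txt) becomes the dataset path, reachable from whichever worker hosts the driver. One caveat: a plain-text test.txt is not LIBSVM data, so the file would need converting first (or use hdfs dfs -put to upload Spark's bundled data/mllib/sample_lda_libsvm_data.txt and point the argument at that).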