import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object SimpleScalaSpark {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("AppName").setMaster("spark://comp:7077")
    val sc = new SparkContext(conf)
    val data = Array(1, 2, 3, 4, 5)
    val distData = sc.parallelize(data) // distributes the local array as an RDD
    println("application closed")
    sc.stop() // release cluster resources instead of relying on the shutdown hook
  }
}
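The snippet above builds an RDD with parallelize but never runs an action on it, so no job is ever submitted to the cluster. A minimal sketch of actually triggering computation (RddActionSketch is a hypothetical name, and local[*] is assumed here so it runs in-process without the standalone master):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddActionSketch {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process; swap in spark://comp:7077 for the cluster.
    val conf = new SparkConf().setAppName("RddActionSketch").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val distData = sc.parallelize(Array(1, 2, 3, 4, 5))
    // reduce is an action, so it forces the tasks to execute.
    val sum = distData.reduce(_ + _)
    println(s"sum = $sum") // 15
    sc.stop()
  }
}
```

parallelize and map are lazy transformations; only an action such as reduce, count, or collect makes Spark schedule work, which is why the original run connects to the master and then exits without running a single task.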
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/01/11 23:06:06 INFO SparkContext: Running Spark version 2.1.0
17/01/11 23:06:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/11 23:06:06 WARN Utils: Your hostname, comp resolves to a loopback address: 127.0.1.1; using 192.168.1.9 instead (on interface wlp13s0)
17/01/11 23:06:06 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/01/11 23:06:07 INFO SecurityManager: Changing view acls to: vovanrew
17/01/11 23:06:07 INFO SecurityManager: Changing modify acls to: vovanrew
17/01/11 23:06:07 INFO SecurityManager: Changing view acls groups to:
17/01/11 23:06:07 INFO SecurityManager: Changing modify acls groups to:
17/01/11 23:06:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vovanrew); groups with view permissions: Set(); users with modify permissions: Set(vovanrew); groups with modify permissions: Set()
17/01/11 23:06:07 INFO Utils: Successfully started service 'sparkDriver' on port 40842.
17/01/11 23:06:07 INFO SparkEnv: Registering MapOutputTracker
17/01/11 23:06:07 INFO SparkEnv: Registering BlockManagerMaster
17/01/11 23:06:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/01/11 23:06:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/01/11 23:06:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-02547588-1e34-4fc6-9d3f-87665dd573c0
17/01/11 23:06:07 INFO MemoryStore: MemoryStore started with capacity 323.7 MB
17/01/11 23:06:07 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/11 23:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/11 23:06:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.9:4040
17/01/11 23:06:08 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://comp:7077...
17/01/11 23:06:08 INFO TransportClientFactory: Successfully created connection to comp/127.0.1.1:7077 after 49 ms (0 ms spent in bootstraps)
17/01/11 23:06:08 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20170111230608-0002
17/01/11 23:06:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34475.
17/01/11 23:06:08 INFO NettyBlockTransferService: Server created on 192.168.1.9:34475
17/01/11 23:06:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/01/11 23:06:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.9:34475 with 323.7 MB RAM, BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
application closed
17/01/11 23:06:09 INFO SparkContext: Invoking stop() from shutdown hook
17/01/11 23:06:09 INFO SparkUI: Stopped Spark web UI at http://192.168.1.9:4040
17/01/11 23:06:09 INFO StandaloneSchedulerBackend: Shutting down all executors
17/01/11 23:06:09 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
17/01/11 23:06:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/11 23:06:09 INFO MemoryStore: MemoryStore cleared
17/01/11 23:06:09 INFO BlockManager: BlockManager stopped
17/01/11 23:06:09 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/11 23:06:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/11 23:06:09 INFO SparkContext: Successfully stopped SparkContext
17/01/11 23:06:09 INFO ShutdownHookManager: Shutdown hook called
17/01/11 23:06:09 INFO ShutdownHookManager: Deleting directory /tmp/spark-694473a2-e111-4917-af48-2f3c22787fdf
Process finished with exit code 0
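The two Utils warnings in the log come from the hostname comp resolving to the loopback address 127.0.1.1, after which Spark falls back to 192.168.1.9. If the driver should bind to a specific interface, the log itself suggests the fix; a sketch (the address below is just the one the log fell back to, not a required value):

```shell
# Pin the driver's bind address before launching, instead of letting
# Spark pick an interface after the loopback fallback warning.
export SPARK_LOCAL_IP=192.168.1.9
```

The NativeCodeLoader warning about the native-hadoop library is harmless here; Spark transparently uses the builtin-java classes instead.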