Untitled
a guest, Jan 11th, 2017
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object SimpleScalaSpark {

  def main(args: Array[String]): Unit = {
    // Connect to the standalone master running on host "comp".
    val conf = new SparkConf().setAppName("AppName").setMaster("spark://comp:7077")
    val sc = new SparkContext(conf)

    // Distribute a local collection as an RDD across the cluster.
    val data = Array(1, 2, 3, 4, 5)
    val distData = sc.parallelize(data)

    println("application closed")

    // Stop the context explicitly rather than relying on the shutdown hook.
    sc.stop()
  }
}

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/01/11 23:06:06 INFO SparkContext: Running Spark version 2.1.0
17/01/11 23:06:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/11 23:06:06 WARN Utils: Your hostname, comp resolves to a loopback address: 127.0.1.1; using 192.168.1.9 instead (on interface wlp13s0)
17/01/11 23:06:06 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/01/11 23:06:07 INFO SecurityManager: Changing view acls to: vovanrew
17/01/11 23:06:07 INFO SecurityManager: Changing modify acls to: vovanrew
17/01/11 23:06:07 INFO SecurityManager: Changing view acls groups to:
17/01/11 23:06:07 INFO SecurityManager: Changing modify acls groups to:
17/01/11 23:06:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(vovanrew); groups with view permissions: Set(); users  with modify permissions: Set(vovanrew); groups with modify permissions: Set()
17/01/11 23:06:07 INFO Utils: Successfully started service 'sparkDriver' on port 40842.
17/01/11 23:06:07 INFO SparkEnv: Registering MapOutputTracker
17/01/11 23:06:07 INFO SparkEnv: Registering BlockManagerMaster
17/01/11 23:06:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/01/11 23:06:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/01/11 23:06:07 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-02547588-1e34-4fc6-9d3f-87665dd573c0
17/01/11 23:06:07 INFO MemoryStore: MemoryStore started with capacity 323.7 MB
17/01/11 23:06:07 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/11 23:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/11 23:06:08 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.9:4040
17/01/11 23:06:08 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://comp:7077...
17/01/11 23:06:08 INFO TransportClientFactory: Successfully created connection to comp/127.0.1.1:7077 after 49 ms (0 ms spent in bootstraps)
17/01/11 23:06:08 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20170111230608-0002
17/01/11 23:06:08 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34475.
17/01/11 23:06:08 INFO NettyBlockTransferService: Server created on 192.168.1.9:34475
17/01/11 23:06:08 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/01/11 23:06:08 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.9:34475 with 323.7 MB RAM, BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.9, 34475, None)
17/01/11 23:06:08 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
application closed
17/01/11 23:06:09 INFO SparkContext: Invoking stop() from shutdown hook
17/01/11 23:06:09 INFO SparkUI: Stopped Spark web UI at http://192.168.1.9:4040
17/01/11 23:06:09 INFO StandaloneSchedulerBackend: Shutting down all executors
17/01/11 23:06:09 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
17/01/11 23:06:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/11 23:06:09 INFO MemoryStore: MemoryStore cleared
17/01/11 23:06:09 INFO BlockManager: BlockManager stopped
17/01/11 23:06:09 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/11 23:06:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/01/11 23:06:09 INFO SparkContext: Successfully stopped SparkContext
17/01/11 23:06:09 INFO ShutdownHookManager: Shutdown hook called
17/01/11 23:06:09 INFO ShutdownHookManager: Deleting directory /tmp/spark-694473a2-e111-4917-af48-2f3c22787fdf

Process finished with exit code 0