Mar 21st, 2019
[sfox@anant.saama.com@SU-VM-177 DataAquisition]$ spark-submit /home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py
2019-03-21 16:01:38 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-03-21 16:01:39 INFO SparkContext:54 - Running Spark version 2.4.0
2019-03-21 16:01:39 INFO SparkContext:54 - Submitted application: WeatherData
2019-03-21 16:01:39 INFO SecurityManager:54 - Changing view acls to: sfox@anant.saama.com,sfox
2019-03-21 16:01:39 INFO SecurityManager:54 - Changing modify acls to: sfox@anant.saama.com,sfox
2019-03-21 16:01:39 INFO SecurityManager:54 - Changing view acls groups to:
2019-03-21 16:01:39 INFO SecurityManager:54 - Changing modify acls groups to:
2019-03-21 16:01:39 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfox@anant.saama.com, sfox); groups with view permissions: Set(); users with modify permissions: Set(sfox@anant.saama.com, sfox); groups with modify permissions: Set()
2019-03-21 16:01:39 INFO Utils:54 - Successfully started service 'sparkDriver' on port 44849.
2019-03-21 16:01:39 INFO SparkEnv:54 - Registering MapOutputTracker
2019-03-21 16:01:39 INFO SparkEnv:54 - Registering BlockManagerMaster
2019-03-21 16:01:39 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-03-21 16:01:39 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-03-21 16:01:39 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-cb4db84c-7612-4c18-bd75-184fd020b554
2019-03-21 16:01:39 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-03-21 16:01:39 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2019-03-21 16:01:39 INFO log:192 - Logging initialized @2164ms
2019-03-21 16:01:39 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2019-03-21 16:01:39 INFO Server:419 - Started @2245ms
2019-03-21 16:01:39 INFO AbstractConnector:278 - Started ServerConnector@6b074365{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-03-21 16:01:39 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5ca8b985{/jobs,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@16beb9d6{/jobs/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@67a95118{/jobs/job,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@14369976{/jobs/job/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77a5e22f{/stages,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a294c52{/stages/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@11e13d1c{/stages/stage,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@798b5949{/stages/stage/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@611a92b9{/stages/pool,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37abfae6{/stages/pool/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@588b98d5{/storage,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7b69d39c{/storage/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4e530912{/storage/rdd,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26c7a6f4{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@220edfc9{/environment,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@17ce0d27{/environment/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@41839d55{/executors,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a687397{/executors/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e4c0f84{/executors/threadDump,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@690c9c5f{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@34b8dd1e{/static,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@52b9ae1d{/,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@de67f0{/api,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@57daedff{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1d906b3f{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://SU-VM-177.anant.saama.com:4040
2019-03-21 16:01:40 INFO Executor:54 - Starting executor ID driver on host localhost
2019-03-21 16:01:40 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41182.
2019-03-21 16:01:40 INFO NettyBlockTransferService:54 - Server created on SU-VM-177.anant.saama.com:41182
2019-03-21 16:01:40 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-03-21 16:01:40 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO BlockManagerMasterEndpoint:54 - Registering block manager SU-VM-177.anant.saama.com:41182 with 366.3 MB RAM, BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2b61293c{/metrics/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/sfox@anant.saama.com/Python/DataAquisition/spark-warehouse').
2019-03-21 16:01:40 INFO SharedState:54 - Warehouse path is 'file:/home/sfox@anant.saama.com/Python/DataAquisition/spark-warehouse'.
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@480ec1f{/SQL,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@64e3ae49{/SQL/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@76c900d6{/SQL/execution,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4df5725d{/SQL/execution/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@629a4242{/static/sql,null,AVAILABLE,@Spark}
2019-03-21 16:01:41 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/session.py:346: UserWarning: inferring schema from dict is deprecated,please use pyspark.sql.Row instead
Traceback (most recent call last):
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 158, in <module>
    main()
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 136, in main
    loadData(testData)
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 45, in loadData
    sqlContext.createDataFrame(data).write.saveAsTable("landing.weather", format="orc", mode="append")
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 775, in saveAsTable
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 71, in deco
pyspark.sql.utils.AnalysisException: u"Database 'landing' not found;"