[sfox@anant.saama.com@SU-VM-177 DataAquisition]$ spark-submit /home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py
2019-03-21 16:01:38 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-03-21 16:01:39 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-03-21 16:01:39 INFO  SparkContext:54 - Submitted application: WeatherData
2019-03-21 16:01:39 INFO  SecurityManager:54 - Changing view acls to: sfox@anant.saama.com,sfox
2019-03-21 16:01:39 INFO  SecurityManager:54 - Changing modify acls to: sfox@anant.saama.com,sfox
2019-03-21 16:01:39 INFO  SecurityManager:54 - Changing view acls groups to:
2019-03-21 16:01:39 INFO  SecurityManager:54 - Changing modify acls groups to:
2019-03-21 16:01:39 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfox@anant.saama.com, sfox); groups with view permissions: Set(); users with modify permissions: Set(sfox@anant.saama.com, sfox); groups with modify permissions: Set()
2019-03-21 16:01:39 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 44849.
2019-03-21 16:01:39 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-03-21 16:01:39 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-03-21 16:01:39 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-03-21 16:01:39 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-03-21 16:01:39 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-cb4db84c-7612-4c18-bd75-184fd020b554
2019-03-21 16:01:39 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-03-21 16:01:39 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-03-21 16:01:39 INFO  log:192 - Logging initialized @2164ms
2019-03-21 16:01:39 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2019-03-21 16:01:39 INFO  Server:419 - Started @2245ms
2019-03-21 16:01:39 INFO  AbstractConnector:278 - Started ServerConnector@6b074365{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-03-21 16:01:39 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5ca8b985{/jobs,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@16beb9d6{/jobs/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@67a95118{/jobs/job,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@14369976{/jobs/job/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77a5e22f{/stages,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a294c52{/stages/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@11e13d1c{/stages/stage,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@798b5949{/stages/stage/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@611a92b9{/stages/pool,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37abfae6{/stages/pool/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@588b98d5{/storage,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7b69d39c{/storage/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4e530912{/storage/rdd,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26c7a6f4{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@220edfc9{/environment,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@17ce0d27{/environment/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@41839d55{/executors,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a687397{/executors/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e4c0f84{/executors/threadDump,null,AVAILABLE,@Spark}
2019-03-21 16:01:39 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@690c9c5f{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@34b8dd1e{/static,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@52b9ae1d{/,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@de67f0{/api,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@57daedff{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1d906b3f{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://SU-VM-177.anant.saama.com:4040
2019-03-21 16:01:40 INFO  Executor:54 - Starting executor ID driver on host localhost
2019-03-21 16:01:40 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41182.
2019-03-21 16:01:40 INFO  NettyBlockTransferService:54 - Server created on SU-VM-177.anant.saama.com:41182
2019-03-21 16:01:40 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-03-21 16:01:40 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO  BlockManagerMasterEndpoint:54 - Registering block manager SU-VM-177.anant.saama.com:41182 with 366.3 MB RAM, BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, SU-VM-177.anant.saama.com, 41182, None)
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2b61293c{/metrics/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/sfox@anant.saama.com/Python/DataAquisition/spark-warehouse').
2019-03-21 16:01:40 INFO  SharedState:54 - Warehouse path is 'file:/home/sfox@anant.saama.com/Python/DataAquisition/spark-warehouse'.
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@480ec1f{/SQL,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@64e3ae49{/SQL/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@76c900d6{/SQL/execution,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4df5725d{/SQL/execution/json,null,AVAILABLE,@Spark}
2019-03-21 16:01:40 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@629a4242{/static/sql,null,AVAILABLE,@Spark}
2019-03-21 16:01:41 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/session.py:346: UserWarning: inferring schema from dict is deprecated,please use pyspark.sql.Row instead
Traceback (most recent call last):
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 158, in <module>
    main()
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 136, in main
    loadData(testData)
  File "/home/sfox@anant.saama.com/Python/DataAquisition/DataAquisition.py", line 45, in loadData
    sqlContext.createDataFrame(data).write.saveAsTable("landing.weather", format="orc", mode="append")
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 775, in saveAsTable
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 71, in deco
pyspark.sql.utils.AnalysisException: u"Database 'landing' not found;"