- Script started on Thu Mar 30 08:43:31 2017
- command: ./bin/spark-shell --master local[4]
- Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
- Setting default log level to "WARN".
- To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
- 17/03/30 08:43:39 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 17/03/30 08:43:39 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-core-3.2.10.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-rdbms-3.2.9.jar."
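The three WARN lines above arise because two Spark installs (`/Users/james/spark` and `/Users/james/spark-2.1.0-bin-hadoop2.7`) are both contributing the same DataNucleus jars to the classpath. A minimal sketch of how to spot such an overlap — throwaway temp directories stand in for the two real install paths, and the jar names are taken from the warnings:

```shell
# Fake two Spark installs that both ship the same DataNucleus jar,
# mimicking /Users/james/spark and /Users/james/spark-2.1.0-bin-hadoop2.7.
old_install=$(mktemp -d); new_install=$(mktemp -d)
mkdir -p "$old_install/jars" "$new_install/jars"
touch "$old_install/jars/datanucleus-api-jdo-3.2.6.jar" \
      "$new_install/jars/datanucleus-api-jdo-3.2.6.jar" \
      "$new_install/jars/spark-sql_2.11-2.1.0.jar"

# Jar names present in both installs are the ones DataNucleus
# complains about registering twice.
ls "$old_install/jars" | sort > "$old_install/list.txt"
ls "$new_install/jars" | sort > "$new_install/list.txt"
dupes=$(comm -12 "$old_install/list.txt" "$new_install/list.txt")
echo "$dupes"
```

Against the real directories, the fix is to make sure only one install is on the classpath (e.g. a consistent `SPARK_HOME` and no stale symlink like `~/spark` pointing into the old tree).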
- 17/03/30 08:43:41 ERROR Schema: Failed initialising database.
- Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- ... 138 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- Nested Throwables StackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- 17/03/30 08:43:41 WARN HiveMetaStore: Retrying creating default database after error: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
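
[Editor's note on the root cause above: Derby error XSDB6 means the embedded metastore database at /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db is already locked by another JVM — typically a second spark-shell (or a crashed one) started from the same working directory; the earlier "SparkUI could not bind on port 4040" warning suggests exactly that. As a hedged sketch (paths taken from the error message above; the lock-file names db.lck and dbex.lck are standard Derby behavior), one way to check for and clear a stale lock:]

```shell
# First confirm no other Spark process is still using the metastore.
# jps ships with the JDK; guard in case it is not on PATH.
if command -v jps >/dev/null 2>&1; then
    jps -l | grep -i spark || true
fi

# If nothing live owns it, remove Derby's leftover lock files.
# Adjust METASTORE to your own install directory.
METASTORE=/Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db
rm -f "$METASTORE/db.lck" "$METASTORE/dbex.lck"
```

[Alternatively, starting the second shell from a different working directory avoids the conflict, since the embedded Derby metastore is created relative to the current directory.]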
- at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:436)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- NestedThrowablesStackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-core-3.2.10.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-rdbms-3.2.9.jar."
- 17/03/30 08:43:41 ERROR Schema: Failed initialising database.
- Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- ... 138 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- Nested Throwables StackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
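- The XSDB6 error above means the embedded Derby metastore at /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db is already locked — usually because another spark-shell was started from the same directory (note the earlier "SparkUI could not bind on port 4040" warning, which also suggests a second session), or because a previous session exited without releasing its lock. A minimal cleanup sketch, assuming no other Spark session is actually running; the path comes from the log above, and `db.lck`/`dbex.lck` are Derby's standard lock files:

```shell
# Check whether another spark-shell is legitimately holding the lock.
# ('|| true' because grep exits non-zero when there is no match.)
ps aux | grep -i '[s]park-shell' || true

# If nothing else is running, the lock files are stale and safe to delete.
SPARK_DIR=/Users/james/spark-2.1.0-bin-hadoop2.7   # directory seen in the log
rm -f "$SPARK_DIR/metastore_db/db.lck" "$SPARK_DIR/metastore_db/dbex.lck"
```

- If a second concurrent shell is actually wanted, starting it from a different working directory avoids the contention, since the embedded Derby resolves `metastore_db` relative to the current directory (or to `derby.system.home` when that JVM property is set, e.g. via `--conf "spark.driver.extraJavaOptions=-Dderby.system.home=/tmp/derby2"` — a standard Derby property, though not exercised in this log).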
- 17/03/30 08:43:41 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
- org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- ... 88 more
- Caused by: java.lang.reflect.InvocationTargetException
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- ... 94 more
- Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
- at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
- at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- ------
- NestedThrowables:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 156 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
    ... 153 more
------
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:436)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    ... 99 more
Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
    at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
    at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
    at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
    at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
    at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
    at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
    at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
    at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
    at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
    at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
    at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 156 more
Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
    at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
    at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
    at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
    at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
    at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
    ... 153 more
------
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- ... 128 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 140 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 156 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 153 more
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-core-3.2.10.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-rdbms-3.2.9.jar."
- 17/03/30 08:43:41 ERROR Schema: Failed initialising database.
- Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
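Editor's note, a minimal remediation sketch for the ERROR XSDB6 above ("Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db"): Spark's embedded Derby metastore allows only one process at a time, so this typically means a second spark-shell/spark-sql is still running (consistent with the earlier "SparkUI could not bind on port 4040" warning), or a previous session crashed and left Derby's lock files behind. The directory path is taken from the log; the `db.lck`/`dbex.lck` file names are standard Derby lock files, but verify nothing live is using the metastore before deleting them.

```shell
# Assumed metastore location, copied from the XSDB6 message in the log above.
METASTORE_DB=/Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db

# 1) Check for another running Spark REPL/driver that may hold the Derby lock.
ps aux | grep -i '[s]park' || echo "no running spark processes found"

# 2) Only if no live process needs the metastore, remove stale Derby lock files.
#    (-f makes this a no-op when the files are already gone.)
rm -f "$METASTORE_DB/db.lck" "$METASTORE_DB/dbex.lck"
```

If the lock keeps reappearing, the usual cause is accidentally launching two shells from the same working directory, since embedded Derby creates `metastore_db` relative to wherever spark-shell is started.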
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- ... 135 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
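[Editor's note: the root cause is the `ERROR XSDB6: Another instance of Derby may have already booted the database .../metastore_db` line above. Spark's default embedded Derby metastore allows only one JVM at a time, and the earlier `Service 'SparkUI' could not bind on port 4040` warning suggests a second spark-shell was already running against the same directory. A minimal sketch of the usual remedy, assuming no other Spark process is still using the metastore and that the path matches the one in the log; `db.lck`/`dbex.lck` are Derby's lock files:]

```shell
# First, quit the other spark-shell session: embedded Derby permits
# only one process to hold the metastore at a time.
# If the error persists after all Spark JVMs have exited, a stale
# lock may remain; remove Derby's lock files from the metastore dir.
METASTORE_DIR="$HOME/spark-2.1.0-bin-hadoop2.7/metastore_db"  # path taken from the log above
rm -f "$METASTORE_DIR/db.lck" "$METASTORE_DIR/dbex.lck"
```

[Alternatively, each shell can be started from its own working directory, since the embedded metastore is created relative to the current directory.]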
- Nested Throwables StackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- 17/03/30 08:43:41 WARN HiveMetaStore: Retrying creating default database after error: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
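The root cause in the dump above is Derby's `ERROR XSDB6`: Spark's embedded Hive metastore uses Derby, which allows only one process to open `metastore_db` at a time, so a second spark-shell launched from the same directory (or a stale lock left by a crashed one) fails exactly like this. A minimal remediation sketch, assuming no other Spark or Hive JVM from this directory is still running (the directory path and the `jps` check are illustrative, not from the log):

```shell
# Run this from the directory you launched spark-shell in -- the log shows
# metastore_db was created under /Users/james/spark-2.1.0-bin-hadoop2.7.

# 1. Confirm no other JVM still holds the embedded metastore open.
#    (jps may not be on PATH; the check is best-effort.)
command -v jps >/dev/null && jps -l | grep -i spark || true

# 2. Remove Derby's stale lock files. This is safe ONLY once step 1
#    shows no live spark-shell / Hive process using this metastore_db.
for lock in metastore_db/db.lck metastore_db/dbex.lck; do
  if [ -f "$lock" ]; then
    rm -v "$lock"
  fi
done
```

If you genuinely need two shells at once, an alternative is to give each its own Derby home, e.g. `./bin/spark-shell --conf "spark.driver.extraJavaOptions=-Dderby.system.home=/tmp/derby-$$"`, so the sessions do not contend for the same `metastore_db` lock.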
- javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- ------
- at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:436)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- NestedThrowablesStackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
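- [editor's note] The root cause above is Derby error XSDB6: the embedded metastore database (`metastore_db` in the directory spark-shell was launched from) is already locked, typically by another running spark-shell/spark-submit in the same directory, or by stale lock files left after a crash. A minimal, non-destructive sketch for checking that condition (the helper name and directory default are assumptions, not part of the log):

```shell
# check_derby_locks: report whether a Derby database directory holds lock
# files (db.lck / dbex.lck), the condition behind error XSDB6
# ("Another instance of Derby may have already booted the database").
# Spark's default embedded metastore lives in ./metastore_db under the
# directory the shell was launched from.
check_derby_locks() {
  dir="${1:-metastore_db}"
  if ls "$dir"/*.lck >/dev/null 2>&1; then
    echo "locked"    # another session owns it, or stale locks remain
  else
    echo "unlocked"
  fi
}
```

If no other Spark session is actually running, deleting the stale `*.lck` files (or launching spark-shell from a different working directory so it creates a fresh `metastore_db`) is the usual way to clear this; a shared external metastore is the durable fix for concurrent sessions.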
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-core-3.2.10.jar."
- 17/03/30 08:43:41 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/james/spark-2.1.0-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/james/spark/jars/datanucleus-rdbms-3.2.9.jar."
- 17/03/30 08:43:41 ERROR Schema: Failed initialising database.
- Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
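
[Editor's note] The root cause in the chain above is `ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db`. The embedded Derby metastore allows only one booting JVM per database directory, and it enforces this with lock files inside `metastore_db/`. A crashed or still-running spark-shell holds (or leaves behind) those locks. A minimal recovery sketch, assuming no other Spark/Hive session is actually live (the `METASTORE` path and the simulated lock files here are illustrative, not from the log):

```shell
# Sketch of clearing a stale Derby metastore lock (assumption: the session
# that created the lock has exited; removing a *live* lock corrupts the db).
METASTORE=$(mktemp -d)/metastore_db      # in practice: $SPARK_HOME/metastore_db
mkdir -p "$METASTORE"

# Simulate what a crashed session leaves behind: Derby's lock files.
touch "$METASTORE/db.lck" "$METASTORE/dbex.lck"

# Remove the stale locks so the next spark-shell can boot the metastore.
rm -f "$METASTORE/db.lck" "$METASTORE/dbex.lck"

test ! -e "$METASTORE/db.lck" && echo "lock cleared"
```

If another spark-shell really is running, the correct fix is to close it (or start the new shell from a different working directory, since Derby creates `metastore_db` in the current directory) rather than deleting the lock.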
- ------
- org.datanucleus.exceptions.NucleusDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:516)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- ... 135 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
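- The root cause above (`ERROR XSDB6: Another instance of Derby may have already booted the database`) means the embedded Derby metastore under `metastore_db/` is still locked — either a second `spark-shell` is running from the same working directory (note the earlier `SparkUI could not bind on port 4040` warning, which suggests exactly that), or a previous session died without releasing its lock. A minimal cleanup sketch, assuming no other Spark/Hive process is still using the directory (the `clear_derby_locks` helper name and the default path are illustrative, not part of Spark or Derby):

```shell
# Derby guards an embedded database with db.lck / dbex.lck files inside
# the database directory (here, metastore_db/). If the owning JVM is gone,
# the stale locks block the next session with XSDB6.
#
# WARNING: only remove the locks after confirming no other Spark/Hive
# process is live, e.g. via `jps` or `ps aux | grep spark`.

clear_derby_locks() {
    # $1: path to the Derby database directory (defaults to ./metastore_db)
    local dir="${1:-metastore_db}"
    rm -f "$dir/db.lck" "$dir/dbex.lck"
}

# Example (path taken from the error message above):
# clear_derby_locks /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db
```

- If two concurrent shells are actually wanted, the simpler fix is to launch each from its own working directory so each gets its own `metastore_db`, or to point Hive at a shared metastore service instead of embedded Derby.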
- Nested Throwables StackTrace:
- java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at $line3.$read$$iw$$iw.<init>(<console>:15)
- at $line3.$read$$iw.<init>(<console>:42)
- at $line3.$read.<init>(<console>:44)
- at $line3.$read$.<init>(<console>:48)
- at $line3.$read$.<clinit>(<console>)
- at $line3.$eval$.$print$lzycompute(<console>:7)
- at $line3.$eval$.$print(<console>:6)
- at $line3.$eval.$print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
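[Editor's note, not part of the captured session: the root cause above is `ERROR XSDB6` -- Spark's default embedded Derby metastore allows only one JVM at a time, so a second spark-shell (or a stale lock left by a killed one) in the same working directory fails like this. A minimal diagnostic sketch, assuming the default `metastore_db` directory next to wherever spark-shell was launched; the function name and paths are illustrative, and the `rm` is deliberately commented out because it is only safe when no other spark-shell/Derby JVM is running:]

```shell
#!/bin/sh
# Check whether a Derby metastore directory still holds the lock files
# that trigger XSDB6. Derby creates db.lck/dbex.lck while a database is
# booted; if the owning JVM is gone, the locks are stale.
check_derby_locks() {
    dir="$1"
    if [ -e "$dir/db.lck" ] || [ -e "$dir/dbex.lck" ]; then
        echo "locked"
        # Clear ONLY if no other spark-shell / Derby instance is running:
        #   rm "$dir/db.lck" "$dir/dbex.lck"
    else
        echo "unlocked"
    fi
}

# Default to the metastore created in the current working directory.
check_derby_locks "${1:-./metastore_db}"
```

[The longer-term fix is to point Hive at a real multi-client metastore (e.g. via `hive-site.xml`) instead of embedded Derby, or to run each spark-shell from its own working directory.]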
- java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- ... 47 elided
- Caused by: java.lang.reflect.InvocationTargetException: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- ... 58 more
- Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- ... 63 more
- Caused by: java.lang.reflect.InvocationTargetException: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- ... 71 more
- Caused by: java.lang.reflect.InvocationTargetException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- ... 76 more
- Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- ... 84 more
- Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- ... 85 more
- Caused by: java.lang.reflect.InvocationTargetException: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at <init>(<console>:15)
- at <init>(<console>:42)
- at <init>(<console>:44)
- at .<init>(<console>:48)
- at .<clinit>(<console>)
- at .$print$lzycompute(<console>:7)
- at .$print(<console>:6)
- at $print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- ... 91 more
- Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at <init>(<console>:15)
- at <init>(<console>:42)
- at <init>(<console>:44)
- at .<init>(<console>:48)
- at .<clinit>(<console>)
- at .$print$lzycompute(<console>:7)
- at .$print(<console>:6)
- at $print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:436)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- ... 96 more
- Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:derby:;databaseName=metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
- java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
- at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
- at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
- at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
- at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
- at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
- at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
- at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
- at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
- at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
- at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
- at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
- at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
- at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
- at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
- at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
- at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
- at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:192)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:366)
- at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:270)
- at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:65)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
- at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
- at scala.Option.getOrElse(Option.scala:121)
- at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
- at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
- at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
- at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
- at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
- at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
- at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
- at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
- at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
- at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
- at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
- at <init>(<console>:15)
- at <init>(<console>:42)
- at <init>(<console>:44)
- at .<init>(<console>:48)
- at .<clinit>(<console>)
- at .$print$lzycompute(<console>:7)
- at .$print(<console>:6)
- at $print(<console>)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
- at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
- at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
- at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
- at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
- at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
- at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
- at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
- at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
- at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
- at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
- at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
- at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
- at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
- at org.apache.spark.repl.Main$.doMain(Main.scala:68)
- at org.apache.spark.repl.Main$.main(Main.scala:51)
- at org.apache.spark.repl.Main.main(Main.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- at java.lang.reflect.Method.invoke(Method.java:498)
- at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
- at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
- at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: ERROR XJ040: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: ERROR XSDB6: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- ------
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422)
- at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120)
- at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:501)
- at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:298)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
- at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
- at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
- at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
- at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
- at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
- at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
- at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
- at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
- ... 125 more
- Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
- at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.bootDatabase(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
- at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
- at java.sql.DriverManager.getConnection(DriverManager.java:664)
- at java.sql.DriverManager.getConnection(DriverManager.java:208)
- at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
- at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
- ... 137 more
- Caused by: org.apache.derby.iapi.error.StandardException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@53314f76, see the next exception for details.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
- ... 153 more
- Caused by: org.apache.derby.iapi.error.StandardException: Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db.
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.privGetJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.getJBMSLockOnDB(Unknown Source)
- at org.apache.derby.impl.store.raw.data.BaseDataFileFactory.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore$6.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.raw.RawStore.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.raw.RawStore.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.store.access.RAMAccessManager.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.store.access.RAMAccessManager.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.FileMonitor.startModule(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase$5.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.db.BasicDatabase.bootServiceModule(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.bootStore(Unknown Source)
- at org.apache.derby.impl.db.BasicDatabase.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.boot(Unknown Source)
- at org.apache.derby.impl.services.monitor.TopService.bootModule(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startProviderService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.findProviderAndStartService(Unknown Source)
- at org.apache.derby.impl.services.monitor.BaseMonitor.startPersistentService(Unknown Source)
- at org.apache.derby.iapi.services.monitor.Monitor.startPersistentService(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at org.apache.derby.impl.jdbc.EmbedConnection$4.run(Unknown Source)
- at java.security.AccessController.doPrivileged(Native Method)
- at org.apache.derby.impl.jdbc.EmbedConnection.startPersistentService(Unknown Source)
- ... 150 more
- <console>:14: error: not found: value spark
- import spark.implicits._
- ^
- <console>:14: error: not found: value spark
- import spark.sql
- ^
- Welcome to
- ____ __
- / __/__ ___ _____/ /__
- _\ \/ _ \/ _ `/ __/ '_/
- /___/ .__/\_,_/_/ /_/\_\ version 2.1.0
- /_/
- Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_77)
- Type in expressions to have them evaluated.
- Type :help for more information.
- scala> :quit
- Script done on Thu Mar 30 08:43:45 2017
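
Diagnosis: the root cause in the trace above is Derby error XSDB6 ("Another instance of Derby may have already booted the database /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db"). The embedded Derby metastore allows only one process at a time, so a second spark-shell (or a stale lock left by a crashed one) prevents the Hive metastore from initialising; that is also why the `spark` session value is never defined and `import spark.implicits._` fails. A possible recovery, assuming no other Spark process is genuinely using the metastore, is to remove the leftover Derby lock files (paths taken from the log above) and relaunch:

```shell
# Hypothetical recovery for Derby XSDB6 ("Another instance of Derby may
# have already booted the database"). Only safe if no other Spark process
# is actually using this metastore.

# 1. Check for another running spark-shell / spark-submit first:
ps aux | grep -i '[s]park' || true

# 2. If none is running, remove the stale Derby lock files left behind
#    by a crashed or concurrent session (directory taken from the log):
rm -f /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db/db.lck \
      /Users/james/spark-2.1.0-bin-hadoop2.7/metastore_db/dbex.lck

# 3. Relaunch the shell as in the session above:
# ./bin/spark-shell --master local[4]
```

Note that the DataNucleus "Plugin (Bundle) already registered" warnings earlier in the log suggest two Spark installations (`/Users/james/spark` and `/Users/james/spark-2.1.0-bin-hadoop2.7`) are both on the classpath; each launches with its own working directory, so running shells from both can also produce this lock conflict.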