- hive> show databases;
- OK
- default
- Time taken: 0.799 seconds, Fetched: 1 row(s)
- hive> show tables;
- OK
- Time taken: 0.027 seconds
- hive> create table test1(id int, name string);
- OK
- Time taken: 0.928 seconds
- hive> show tables;
- OK
- test1
- Time taken: 0.021 seconds, Fetched: 1 row(s)
- hive>
- 0: jdbc:hive2://localhost:10000> create table test2 (id int, name string);
- Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive2, access=WRITE, inode="/user/hive/warehouse/test2":server:supergroup:drwxrwxr-x
- at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
- at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
- at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
- at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
- at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1728)
- at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1712)
- at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1695)
- at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
- at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896)
- at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
- at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
- at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
- at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
- at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
- at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
- at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:422)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
- at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
- ) (state=08S01,code=1)
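The stack trace above boils down to one fact: the HDFS user `hive2` has no write access to `/user/hive/warehouse`, which is owned by `server:supergroup` with mode `drwxrwxr-x`, so only the owner and members of `supergroup` can create table directories there. A minimal sketch of checking and loosening the permissions, assuming an `hdfs` client on the PATH and HDFS superuser rights (the path matches the error message; adjust for your cluster):

```shell
# Inspect the warehouse directory itself (-d lists the directory, not its contents)
hdfs dfs -ls -d /user/hive/warehouse

# Either add hive2 to the owning group (supergroup), or, for a lab setup
# (not production), grant write access to everyone:
hdfs dfs -chmod -R o+w /user/hive/warehouse
```

After re-running `create table test2 (id int, name string);` from Beeline, the `mkdirs` call in the trace should succeed. The cleaner long-term options are group membership or HDFS ACLs rather than world-writable permissions.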
- 0: jdbc:hive2://localhost:10000> show tables;
- +-----------+--+
- | tab_name |
- +-----------+--+
- | test1 |
- +-----------+--+
- java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive2, access=WRITE, inode="/user/hive/warehouse/test2":server:supergroup:drwxrwxr-x
- Beeline version 2.1.1 by Apache Hive
- beeline> !connect jdbc:hive2://localhost:10000
- Connecting to jdbc:hive2://localhost:10000
- Enter username for jdbc:hive2://localhost:10000: hive2
- Enter password for jdbc:hive2://localhost:10000: ********
- // Requires the hive-jdbc driver on the classpath; JDBC 4+ drivers
- // self-register, so no Class.forName() call is needed.
- try (Connection con = DriverManager.getConnection(
-          "jdbc:hive2://localhost:10000/default", "hive2", "password");
-      Statement stmt = con.createStatement()) {
-     ...
- }
- <property>
- <name>hive.exec.stagingdir</name>
- <value>/tmp/hive-staging</value>
- </property>
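Overriding `hive.exec.stagingdir` as above only moves the staging files out of the warehouse path; the new location still has to exist and be writable by every user running queries. A sketch of preparing it, assuming an `hdfs` client and HDFS superuser rights (the `/tmp/hive-staging` path comes from the property value above):

```shell
# Create the staging directory and give it mode 1777 (world-writable with
# the sticky bit, like /tmp), so users cannot delete each other's files
hdfs dfs -mkdir -p /tmp/hive-staging
hdfs dfs -chmod 1777 /tmp/hive-staging
```

Note this does not fix the `create table` failure itself, since the table directory is still created under `/user/hive/warehouse`; the warehouse permissions (or group membership for `hive2`) still need to be corrected.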