By default, Hive creates tables under the database location when no table location is specified.
If the database location is s3://YOURBUCKET/YOURFOLDER/mydb/ and I create a table as follows:
CREATE TABLE mydb.test1 (
a int,
b int)
STORED AS PARQUET;
the table location is automatically set up as s3://YOURBUCKET/YOURFOLDER/mydb/test1/.
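For illustration, this default-location behavior can be confirmed with DESCRIBE FORMATTED (bucket and prefix are placeholders, as above):

```sql
-- Database with an explicit S3 location
CREATE DATABASE mydb LOCATION 's3://YOURBUCKET/YOURFOLDER/mydb/';

-- Table created without a LOCATION clause inherits the database path
CREATE TABLE mydb.test1 (
  a INT,
  b INT)
STORED AS PARQUET;

-- The Location field shows s3://YOURBUCKET/YOURFOLDER/mydb/test1
DESCRIBE FORMATTED mydb.test1;
```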
When the S3StorageBasedAuthorizationProvider is configured for Hive, it is not possible to run CREATE TABLE statements without an explicit location. Example:
CREATE TABLE mydb.test1 (
a int,
b int)
STORED AS PARQUET;
This DDL also fails:
USE mydb;
CREATE TABLE test3 (
a int,
b int)
STORED AS PARQUET;
The error is:
2023-04-07T07:44:59,533 ERROR [pool-7-thread-1([])]: metastore.RetryingHMSHandler (RetryingHMSHandler.java:invokeInternal(200)) - MetaException(message:java.lang.NullPointerException)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newMetaException(HiveMetaStore.java:6200)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1516)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy24.create_table_with_environment_context(Unknown Source)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:11257)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:11241)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:594)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor$1.run(HadoopThriftAuthBridge.java:589)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:589)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NullPointerException
at com.amazonaws.emr.urm.hive.urmstoragebasedauthorizer.S3StorageBasedAuthorizationProvider.authorize(S3StorageBasedAuthorizationProvider.java:118)
at org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener.authorizeCreateTable(AuthorizationPreEventListener.java:265)
at org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener.onEvent(AuthorizationPreEventListener.java:140)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.firePreEvent(HiveMetaStore.java:2281)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1404)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1502)
... 21 more
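Given the observation above that only location-less CREATE TABLE statements fail, a possible workaround (not a fix for the NullPointerException itself) is to specify the table location explicitly so the authorizer receives a non-null path. The path below is a placeholder matching the database location used earlier:

```sql
CREATE TABLE mydb.test3 (
  a INT,
  b INT)
STORED AS PARQUET
LOCATION 's3://YOURBUCKET/YOURFOLDER/mydb/test3/';
```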