
HBASE-27754 generateMissingTableDescriptorFile should throw write permission error and fail #114

Merged: 1 commit merged into apache:master on Mar 28, 2023

Conversation

NihalJain (Contributor) opened this pull request:

HBASE-27754 generateMissingTableDescriptorFile should throw write permission error and fail

NihalJain (Contributor, Author) commented:

Before patch:

2023-03-24T19:03:16,890 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #31 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2023-03-24T19:03:16,893 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #31
2023-03-24T19:03:16,894 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
2023-03-24T19:03:16,894 DEBUG [main] hdfs.DFSClient: /apps/hbase/data/data/default/ittable-2090120905/.tmp/.tableinfo.0000000010: masked={ masked: rw-r--r--, unmasked: rw-rw-rw- }
2023-03-24T19:03:16,895 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #32 org.apache.hadoop.hdfs.protocol.ClientProtocol.create
2023-03-24T19:03:16,897 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #32
2023-03-24T19:03:16,898 DEBUG [main] retry.RetryInvocationHandler: Exception while invoking call #32 ClientNamenodeProtocolTranslatorPB.create over null. Not retrying because try once and fail.
org.apache.hadoop.ipc.RemoteException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)

	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1587) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1533) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1430) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy25.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:372) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:java_version]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:java_version]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:java_version]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:java_version]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy26.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1222) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1201) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1139) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:534) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:531) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:545) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:472) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1125) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1105) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:994) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTD(HBCKFsTableDescriptors.java:391) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTableDescriptor(HBCKFsTableDescriptors.java:365) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptorForTableDirectory(HBCKFsTableDescriptors.java:439) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptor(HBCKFsTableDescriptors.java:411) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.MissingTableDescriptorGenerator.generateTableDescriptorFileIfMissing(MissingTableDescriptorGenerator.java:93) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.doCommandLine(HBCK2.java:1034) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.run(HBCK2.java:830) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCK2.main(HBCK2.java:1145) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
2023-03-24T19:03:16,902 DEBUG [main] hbase.HBCKFsTableDescriptors: Failed write and/or rename; retrying
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:java_version]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:java_version]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:java_version]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:java_version]
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:281) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1222) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1201) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1139) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:534) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:531) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:545) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:472) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1125) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1105) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:994) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTD(HBCKFsTableDescriptors.java:391) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTableDescriptor(HBCKFsTableDescriptors.java:365) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptorForTableDirectory(HBCKFsTableDescriptors.java:439) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptor(HBCKFsTableDescriptors.java:411) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.MissingTableDescriptorGenerator.generateTableDescriptorFileIfMissing(MissingTableDescriptorGenerator.java:93) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.doCommandLine(HBCK2.java:1034) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.run(HBCK2.java:830) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCK2.main(HBCK2.java:1145) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)

	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1587) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1533) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1430) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy25.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:372) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:java_version]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:java_version]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:java_version]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:java_version]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy26.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	... 21 more
2023-03-24T19:03:16,907 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root sending #33 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2023-03-24T19:03:16,908 DEBUG [IPC Client (199657303) connection to hostname/ip_address:port_num from root] ipc.Client: IPC Client (199657303) connection to hostname/ip_address:port_num from root got value #33
2023-03-24T19:03:16,908 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2023-03-24T19:03:16,908 WARN  [main] hbase.HBCKFsTableDescriptors: Failed cleanup of hdfs://hostname:port_num/apps/hbase/data/data/default/ittable-2090120905/.tmp/.tableinfo.0000000010
2023-03-24T19:03:16,909 INFO  [main] hbase.MissingTableDescriptorGenerator: Table descriptor written successfully. Orphan table ittable-2090120905 fixed.
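
The root cause is visible in the log above: the write loop in HBCKFsTableDescriptors treats the AccessControlException like any transient write failure, logs it at DEBUG ("Failed write and/or rename; retrying"), retries with the next .tableinfo sequence id, and finally returns without propagating the error, so `generateMissingTableDescriptorFile` still reports the orphan table as fixed. A minimal standalone sketch of that anti-pattern (hypothetical names and simplified logic reconstructed from the log output, not the actual HBCK2 source):

```java
import java.io.IOException;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SilentRetrySketch {
  /**
   * Pre-patch style: every IOException, including a permission error that can
   * never succeed on retry, is swallowed; on exhaustion the method returns
   * null and the caller has no idea the write failed.
   */
  static Path writeTableInfoSilently(FileSystem fs, Path tmpDir, byte[] content) {
    for (int sequenceId = 0; sequenceId < 10; sequenceId++) {
      // Candidate names match the ".tableinfo.0000000010" pattern in the log.
      Path candidate = new Path(tmpDir, String.format(".tableinfo.%010d", sequenceId));
      try {
        try (FSDataOutputStream out = fs.create(candidate, false /* overwrite */)) {
          out.write(content);
        }
        return candidate;
      } catch (IOException ioe) {
        // "Failed write and/or rename; retrying" -- an AccessControlException
        // (a subclass of IOException) loops here just like a transient fault.
      }
    }
    return null; // caller may still log "Orphan table ... fixed."
  }
}
```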

After patch:

2023-03-27T15:45:43,582 INFO  [main] hbase.MissingTableDescriptorGenerator: Table descriptor found in the cache of HBase Master, writing it to the file system.
2023-03-27T15:45:43,666 ERROR [main] hbase.MissingTableDescriptorGenerator: Exception while writing the table descriptor to the file system for table ittable-2090120905
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_352]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_352]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_352]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_352]
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:281) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1222) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1201) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1139) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:534) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:531) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:545) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:472) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1125) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1105) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:994) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTD(HBCKFsTableDescriptors.java:395) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.writeTableDescriptor(HBCKFsTableDescriptors.java:366) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptorForTableDirectory(HBCKFsTableDescriptors.java:443) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCKFsTableDescriptors.createTableDescriptor(HBCKFsTableDescriptors.java:415) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.MissingTableDescriptorGenerator.generateTableDescriptorFileIfMissing(MissingTableDescriptorGenerator.java:93) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.doCommandLine(HBCK2.java:1034) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hbase.HBCK2.run(HBCK2.java:830) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hbase.HBCK2.main(HBCK2.java:1145) ~[hbase-hbck2-hbase_op_tools_version.jar:hbase_op_tools_version]
Caused by: org.apache.hadoop.ipc.RemoteException: Permission denied: user=root, access=WRITE, inode="/apps/hbase/data/data/default/ittable-2090120905/.tmp":hdfs:hdfs:drwxr-xr-x
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1896)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1880)
	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1839)
	at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.resolvePathForStartFile(FSDirWriteFileOp.java:323)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2457)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:791)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:478)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:528)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1086)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1031)
	at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:959)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2963)

	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1587) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1533) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.Client.call(Client.java:1430) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy25.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:372) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_352]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_352]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_352]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_352]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-hadoop_version.jar:?]
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-hadoop_version.jar:?]
	at com.sun.proxy.$Proxy26.create(Unknown Source) ~[?:?]
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:276) ~[hadoop-hdfs-client-hadoop_version.jar:?]
	... 21 more
2023-03-27T15:45:43,687 INFO  [ReadOnlyZKClient-hostname:ip_address:2181@0x68ed96ca] zookeeper.ZooKeeper: Session: 0x301d99e41bd0b65 closed
2023-03-27T15:45:43,687 INFO  [ReadOnlyZKClient-hostname:ip_address:2181@0x68ed96ca-EventThread] zookeeper.ClientCnxn: EventThread shut down for session: 0x301d99e41bd0b65
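
After the patch, the permission failure surfaces as an ERROR and the command fails instead of claiming success. Assuming the fix amounts to no longer swallowing permission errors in the retry loop, a sketch under that assumption (hypothetical names; the real change lives in HBCKFsTableDescriptors):

```java
import java.io.IOException;

import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.AccessControlException;

public class FailFastSketch {
  /**
   * Post-patch style: permission errors propagate immediately; only
   * potentially transient IOExceptions are retried, and the last one is
   * rethrown once all attempts are exhausted.
   */
  static Path writeTableInfoFailFast(FileSystem fs, Path tmpDir, byte[] content)
      throws IOException {
    IOException lastFailure = null;
    for (int sequenceId = 0; sequenceId < 10; sequenceId++) {
      Path candidate = new Path(tmpDir, String.format(".tableinfo.%010d", sequenceId));
      try {
        try (FSDataOutputStream out = fs.create(candidate, false /* overwrite */)) {
          out.write(content);
        }
        return candidate;
      } catch (AccessControlException ace) {
        throw ace; // a WRITE permission error cannot succeed on retry
      } catch (IOException ioe) {
        lastFailure = ioe; // genuinely transient; try the next sequence id
      }
    }
    if (lastFailure == null) {
      lastFailure = new IOException("could not create .tableinfo under " + tmpDir);
    }
    throw lastFailure; // fail loudly so the tool exits non-zero
  }
}
```

The tool can still ride out transient HDFS hiccups, while an unrecoverable ACL problem aborts the run. As the inode owner `hdfs:hdfs` with mode `drwxr-xr-x` in the logs suggests, the operational remedy is to run HBCK2 as a user with WRITE access to the table's .tmp directory.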

NihalJain (Contributor, Author) commented:

Please review @ndimiduk, @petersomogyi, @wchevreuil.

Apache-HBase commented:

🎊 +1 overall

| Vote | Subsystem | Runtime | Comment |
|:----:|:---------:|:-------:|:--------|
| +0 🆗 | reexec | 1m 22s | Docker mode activated. |
|  |  |  | _ Prechecks _ |
| +1 💚 | dupname | 0m 0s | No case conflicting files found. |
| +0 🆗 | spotbugs | 0m 0s | spotbugs executables are not available. |
| +1 💚 | @author | 0m 0s | The patch does not contain any @author tags. |
| -0 ⚠️ | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|  |  |  | _ master Compile Tests _ |
| +1 💚 | mvninstall | 0m 48s | master passed |
| +1 💚 | compile | 0m 13s | master passed |
| +1 💚 | checkstyle | 0m 9s | master passed |
| +1 💚 | javadoc | 0m 9s | master passed |
|  |  |  | _ Patch Compile Tests _ |
| +1 💚 | mvninstall | 0m 14s | the patch passed |
| +1 💚 | compile | 0m 12s | the patch passed |
| +1 💚 | javac | 0m 12s | the patch passed |
| +1 💚 | checkstyle | 0m 5s | the patch passed |
| +1 💚 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 💚 | javadoc | 0m 6s | the patch passed |
|  |  |  | _ Other Tests _ |
| +1 💚 | unit | 5m 5s | hbase-hbck2 in the patch passed. |
| +1 💚 | asflicense | 0m 8s | The patch does not generate ASF License warnings. |
|  |  | 8m 44s |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/1/artifact/yetus-precommit-check/output/Dockerfile |
| GITHUB PR | #114 |
| Optional Tests | dupname asflicense javac javadoc unit spotbugs findbugs checkstyle compile |
| uname | Linux 931e684e8a3b 5.4.0-1094-aws #102~18.04.1-Ubuntu SMP Tue Jan 10 21:07:03 UTC 2023 x86_64 GNU/Linux |
| Build tool | maven |
| git revision | master / b9b605b |
| Default Java | Oracle Corporation-1.8.0_342-b07 |
| Test Results | https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/1/testReport/ |
| Max. process+thread count | 1264 (vs. ulimit of 5000) |
| modules | C: hbase-hbck2 U: hbase-hbck2 |
| Console output | https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/1/console |
| versions | git=2.30.2 maven=3.8.6 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.

Apache-HBase commented:

🎊 +1 overall

| Vote | Subsystem | Runtime | Comment |
|:----:|:---------:|:-------:|:--------|
| +0 🆗 | reexec | 0m 44s | Docker mode activated. |
|  |  |  | _ Prechecks _ |
| +1 💚 | dupname | 0m 0s | No case conflicting files found. |
| +0 🆗 | spotbugs | 0m 0s | spotbugs executables are not available. |
| +1 💚 | @author | 0m 0s | The patch does not contain any @author tags. |
| -0 ⚠️ | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
|  |  |  | _ master Compile Tests _ |
| +1 💚 | mvninstall | 0m 52s | master passed |
| +1 💚 | compile | 0m 13s | master passed |
| +1 💚 | checkstyle | 0m 10s | master passed |
| +1 💚 | javadoc | 0m 9s | master passed |
|  |  |  | _ Patch Compile Tests _ |
| +1 💚 | mvninstall | 0m 15s | the patch passed |
| +1 💚 | compile | 0m 14s | the patch passed |
| +1 💚 | javac | 0m 14s | the patch passed |
| +1 💚 | checkstyle | 0m 8s | the patch passed |
| +1 💚 | whitespace | 0m 0s | The patch has no whitespace issues. |
| +1 💚 | javadoc | 0m 7s | the patch passed |
|  |  |  | _ Other Tests _ |
| +1 💚 | unit | 5m 1s | hbase-hbck2 in the patch passed. |
| +1 💚 | asflicense | 0m 7s | The patch does not generate ASF License warnings. |
|  |  | 8m 10s |  |

| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/2/artifact/yetus-precommit-check/output/Dockerfile |
| GITHUB PR | #114 |
| Optional Tests | dupname asflicense javac javadoc unit spotbugs findbugs checkstyle compile |
| uname | Linux 22a5ee8f7e31 5.4.0-1094-aws #102~18.04.1-Ubuntu SMP Tue Jan 10 21:07:03 UTC 2023 x86_64 GNU/Linux |
| Build tool | maven |
| git revision | master / b9b605b |
| Default Java | Oracle Corporation-1.8.0_342-b07 |
| Test Results | https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/2/testReport/ |
| Max. process+thread count | 1253 (vs. ulimit of 5000) |
| modules | C: hbase-hbck2 U: hbase-hbck2 |
| Console output | https://ci-hbase.apache.org/job/HBase-Operator-Tools-PreCommit/job/PR-114/2/console |
| versions | git=2.30.2 maven=3.8.6 |
| Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |

This message was automatically generated.

chrajeshbabu merged commit 26fc7e5 into apache:master on Mar 28, 2023
1 check passed