Description
Trying to run the Hudi CLI against MinIO (min.io) buckets.
To Reproduce
Steps to reproduce the behavior:
export AWS_REGION=us-east-1
export AWS_ACCESS_KEY_ID=admin
export AWS_SECRET_ACCESS_KEY=password
export ENDPOINT=http://minio:9000
export AWS_ENDPOINT=http://minio:9000
export AWS_S3_ENDPOINT=http://minio:9000
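For context, the `AWS_ENDPOINT` / `AWS_S3_ENDPOINT` environment variables are generally not read by the Hadoop S3A connector, so the endpoint usually also has to reach Hadoop itself. A minimal sketch of the S3A settings that typically matter for a MinIO endpoint (property names are from `hadoop-aws`; the values here simply mirror the exports above):

```xml
<!-- core-site.xml sketch: S3A settings usually needed for MinIO.
     Values mirror the environment variables exported above. -->
<configuration>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>http://minio:9000</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>admin</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>password</value>
  </property>
  <property>
    <!-- MinIO buckets are usually addressed path-style, not virtual-hosted -->
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.s3a.connection.ssl.enabled</name>
    <value>false</value>
  </property>
</configuration>
```

As a quick credentials check outside Hudi, `aws --endpoint-url http://minio:9000 s3 ls s3://warehouse/stock_ticks_mor/` should list the table files if the keys and endpoint are valid.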
Expected behavior
The CLI should connect to the table. Instead, it fails with:
10868 [main] INFO org.apache.hudi.common.table.HoodieTableMetaClient [] - Loading HoodieTableMetaClient from s3a://warehouse/stock_ticks_mor
Could not check if s3a://warehouse/stock_ticks_mor is a valid table
Details of the error have been omitted. You can use the stacktrace command to print the full stacktrace.
hudi->stacktrace
org.apache.hudi.exception.HoodieIOException: Could not check if s3a://warehouse/stock_ticks_mor is a valid table
at org.apache.hudi.exception.TableNotFoundException.checkTableValidity(TableNotFoundException.java:59)
at org.apache.hudi.common.table.HoodieTableMetaClient.<init>(HoodieTableMetaClient.java:135)
at org.apache.hudi.common.table.HoodieTableMetaClient.newMetaClient(HoodieTableMetaClient.java:680)
at org.apache.hudi.common.table.HoodieTableMetaClient.access$100(HoodieTableMetaClient.java:81)
at org.apache.hudi.common.table.HoodieTableMetaClient$Builder.build(HoodieTableMetaClient.java:772)
at org.apache.hudi.cli.HoodieCLI.refreshTableMetadata(HoodieCLI.java:97)
at org.apache.hudi.cli.HoodieCLI.connectTo(HoodieCLI.java:103)
at org.apache.hudi.cli.commands.TableCommand.connect(TableCommand.java:86)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.base/java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.shell.command.invocation.InvocableShellMethod.doInvoke(InvocableShellMethod.java:306)
at org.springframework.shell.command.invocation.InvocableShellMethod.invoke(InvocableShellMethod.java:232)
at org.springframework.shell.command.CommandExecution$DefaultCommandExecution.evaluate(CommandExecution.java:158)
at org.springframework.shell.Shell.evaluate(Shell.java:208)
at org.springframework.shell.Shell.run(Shell.java:140)
at org.springframework.shell.jline.InteractiveShellRunner.run(InteractiveShellRunner.java:73)
at org.springframework.shell.DefaultShellApplicationRunner.run(DefaultShellApplicationRunner.java:65)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:762)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:752)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:315)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1306)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1295)
at org.apache.hudi.cli.Main.main(Main.java:34)
Caused by: java.nio.file.AccessDeniedException: s3a://warehouse/stock_ticks_mor/.hoodie: getFileStatus on s3a://warehouse/stock_ticks_mor/.hoodie: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: J0G2QGWJE75WXRXT; S3 Extended Request ID: 7vzCLPHe5FKvbrQQGSsQXX4zfwo7/4K2xUScSYW8zcQoa8Oo42q5/QL9bP41ELcbA3vywO3P1v6IpQBXGWVouA==; Proxy: null), S3 Extended Request ID: 7vzCLPHe5FKvbrQQGSsQXX4zfwo7/4K2xUScSYW8zcQoa8Oo42q5/QL9bP41ELcbA3vywO3P1v6IpQBXGWVouA==:403 Forbidden
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:255)
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:175)
at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3796)
at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3688)
at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getFileStatus$24(S3AFileSystem.java:3556)
at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:444)
at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2337)
at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2356)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:3554)
at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.lambda$getFileStatus$17(HoodieWrapperFileSystem.java:414)
at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.executeFuncWithTimeMetrics(HoodieWrapperFileSystem.java:118)
at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.getFileStatus(HoodieWrapperFileSystem.java:408)
at org.apache.hudi.storage.hadoop.HoodieHadoopStorage.getPathInfo(HoodieHadoopStorage.java:169)
at org.apache.hudi.exception.TableNotFoundException.checkTableValidity(TableNotFoundException.java:51)
... 24 more
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: J0G2QGWJE75WXRXT; S3 Extended Request ID: 7vzCLPHe5FKvbrQQGSsQXX4zfwo7/4K2xUScSYW8zcQoa8Oo42q5/QL9bP41ELcbA3vywO3P1v6IpQBXGWVouA==; Proxy: null), S3 Extended Request ID: 7vzCLPHe5FKvbrQQGSsQXX4zfwo7/4K2xUScSYW8zcQoa8Oo42q5/QL9bP41ELcbA3vywO3P1v6IpQBXGWVouA==
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1880)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleServiceErrorResponse(AmazonHttpClient.java:1418)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1387)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1157)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:814)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5575)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5522)
at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1417)
at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$10(S3AFileSystem.java:2545)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414)
at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2533)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2513)
at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3776)
Environment Description
- Hudi version : 0.15
- Spark version : 3.4
- Hive version : 2.3.9
- Hadoop version : 3.4
- Storage (HDFS/S3/GCS..) : MinIO S3
- Running on Docker? (yes/no) : yes