
HADOOP-18340. deleteOnExit does not work with S3AFileSystem #4608

Merged: 2 commits merged into apache:trunk, Aug 11, 2022

Conversation

@huaxiangsun (Contributor) commented Jul 21, 2022

Description of PR

processDeleteOnExit() is overridden in S3AFileSystem: it skips the exists() check and deletes the registered objects without first checking whether the FileSystem is closed.
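
For context, here is a simplified paraphrase (a sketch, not the exact Hadoop source) of the base FileSystem.processDeleteOnExit() logic the override bypasses. FileSystem.close() runs it, but S3AFileSystem has already marked itself closed by then, so the exists() probe fails and nothing is deleted:

```java
import java.io.IOException;
import java.util.Iterator;
import java.util.Set;
import java.util.TreeSet;
import org.apache.hadoop.fs.Path;

// Sketch of the base-class behaviour: close() invokes processDeleteOnExit(),
// but for S3A the client is already shut down, so exists() throws and the
// registered paths are never removed.
abstract class FileSystemDeleteOnExitSketch {
  private final Set<Path> deleteOnExit = new TreeSet<>();

  abstract boolean exists(Path p) throws IOException;
  abstract boolean delete(Path p, boolean recursive) throws IOException;

  protected void processDeleteOnExit() {
    synchronized (deleteOnExit) {
      for (Iterator<Path> iter = deleteOnExit.iterator(); iter.hasNext();) {
        Path path = iter.next();
        try {
          if (exists(path)) {   // fails once the S3A client is closed
            delete(path, true);
          }
        } catch (IOException e) {
          // the real code logs and ignores failures here
        }
        iter.remove();
      }
    }
  }
}
```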

How was this patch tested?

A new unit test case was added, and all unit tests under hadoop-tools/hadoop-aws passed:
mvn -Dparallel-tests clean test

I ran the S3A integration tests against the us-west-2 region and there were a few failures/errors. Running trunk without the patch produced the same errors/failures, so they are not caused by the patch; they are probably due to a misconfiguration I could not track down.
mvn -Dparallel-tests clean verify

The result is:

Tests  Errors  Failures  Skipped  Success Rate  Time
1252   6       1         270      77.875%       3,627.473

The errors are:

org.apache.hadoop.fs.s3a.auth.delegation.ITestDelegatedMRJob#testCommonCrawlLookup[1] (0.324s):
s3a://hbase-test-data/fork-0001/test: getFileStatus on s3a://hbase-test-data/fork-0001/test: com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you provided does not exist in our records. (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: XJ3DCCR6Q7SXTJDW; S3 Extended Request ID: fRmP3m1lThWxhj3s9VkSNEtuBz1JeBWYw65aRajrSg/H7IN+muB7d8PavSeqJ2urvLZtguTbnlc=; Proxy: null), S3 Extended Request ID: fRmP3m1lThWxhj3s9VkSNEtuBz1JeBWYw65aRajrSg/H7IN+muB7d8PavSeqJ2urvLZtguTbnlc=:InvalidAccessKeyId

testJobSubmissionCollectsTokens[1] (0.329s):
s3a://hbase-test-data/fork-0001/test: getFileStatus on s3a://hbase-test-data/fork-0001/test: com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you provided does not exist in our records. (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: XJ35WHK7X6EMP9B6; S3 Extended Request ID: rtWEsDYcGqNiaoKy2D5EQQqN+O7MbYe1bYbiSmkF+FOz9/wb6+t+dQooqj7ppCSCZMBgC3PeEw4=; Proxy: null), S3 Extended Request ID: rtWEsDYcGqNiaoKy2D5EQQqN+O7MbYe1bYbiSmkF+FOz9/wb6+t+dQooqj7ppCSCZMBgC3PeEw4=:InvalidAccessKeyId

org.apache.hadoop.fs.s3a.ITestS3AEndpointRegion#testBlankRegionTriggersSDKResolution (2.817s):
[Client region name] expected:<"[mars-north]-2"> but was:<"[us-west]-2">

and

org.apache.hadoop.fs.s3a.ITestS3ATemporaryCredentials#testSTS:
request session credentials: com.amazonaws.services.securitytoken.model.AWSSecurityTokenServiceException: Cannot call GetSessionToken with session credentials (Service: AWSSecurityTokenService; Status Code: 403; Error Code: AccessDenied; Request ID: d935696d-7d98-4a1c-825f-22bf3d28ed9d; Proxy: null):AccessDenied

For code changes:

  • [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • [x] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

@huaxiangsun (Contributor, Author)

Will address the checkstyle issues.

@huaxiangsun (Contributor, Author)

Hi @steveloughran, can you take a look at the patch? Thanks.

@steveloughran (Contributor) left a comment

The code in s3afs and the tests all look good.

But I think the changes should be restricted to the s3a fs, even if it duplicates a bit of the superclass.

That whole section in the javadocs at the top of the FileSystem class explains why; you've already seen some of those test values, but there are also external implementations which we don't have control of. Adding anything is making a commitment to preserve a new public API forever.

I would be happier if you just did it all in S3AFileSystem.

Override deleteOnExit(Path f) and:

  1. skip the exists() check, because it doesn't do the right thing if you call the method while writing a file (the file isn't visible until close); it saves HEAD/LIST probes too.
  2. save the list to a set local to the s3a fs; you could make its getter protected so that a mock test can access it (a sketch of this follows below).
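
A minimal sketch of that suggestion, with hypothetical names (deleteWithoutCloseCheck(), getDeleteOnExitPaths() and the class shape are illustrative, not necessarily the code as merged):

```java
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;
import org.apache.hadoop.fs.Path;

class S3ADeleteOnExitSketch {
  // set local to the s3a fs, independent of FileSystem's own deleteOnExit set
  private final Set<Path> deleteOnExitPaths = new HashSet<>();

  // register without an exists() probe: a file being written is not
  // visible until close(), and skipping the probe saves HEAD/LIST calls
  public boolean deleteOnExit(Path f) {
    synchronized (deleteOnExitPaths) {
      deleteOnExitPaths.add(f);
    }
    return true;
  }

  // protected getter so a mock-based test can inspect the registered paths
  protected Set<Path> getDeleteOnExitPaths() {
    return deleteOnExitPaths;
  }

  // invoked from close(): delete unconditionally, best-effort
  protected void processDeleteOnExit() {
    synchronized (deleteOnExitPaths) {
      for (Path path : deleteOnExitPaths) {
        try {
          deleteWithoutCloseCheck(path);
        } catch (IOException ignored) {
          // failures during shutdown are logged and ignored
        }
      }
      deleteOnExitPaths.clear();
    }
  }

  // hypothetical internal delete that bypasses the "is closed" guard
  private void deleteWithoutCloseCheck(Path p) throws IOException {
    // in the real S3AFileSystem this would issue the S3 DELETE calls
  }
}
```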

Now, what about an integration test too?

Add a test case which creates a new fs in the test case (keeping the normal getFileSystem() fs for assertions) and:

  • adds a file which doesn't exist
  • adds a file which doesn't exist, then creates it
  • adds a file which does exist
  • adds a directory path
  • closes the fs and uses ContractTestUtils methods to assert the files and dirs are gone (see the sketch after this comment)

Seem good?
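
A hedged sketch of what such a test might look like, assuming it lives in org.apache.hadoop.fs.s3a and extends AbstractS3ATestBase (which supplies getFileSystem() and path()); the structure is illustrative, not the test as merged:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.contract.ContractTestUtils;
import org.junit.Test;

public class ITestS3ADeleteOnExitSketch extends AbstractS3ATestBase {

  @Test
  public void testDeleteOnExit() throws Exception {
    // a private fs instance, so closing it doesn't affect the suite's fs
    Configuration conf = getFileSystem().getConf();
    FileSystem fs = FileSystem.newInstance(getFileSystem().getUri(), conf);
    try {
      Path missing = path("testDeleteOnExit/missing");
      Path createdLater = path("testDeleteOnExit/created-later");
      Path existing = path("testDeleteOnExit/existing");
      Path dir = path("testDeleteOnExit/dir");

      fs.deleteOnExit(missing);            // registered, never created
      fs.deleteOnExit(createdLater);       // registered before creation
      ContractTestUtils.touch(fs, createdLater);
      ContractTestUtils.touch(fs, existing);
      fs.deleteOnExit(existing);           // registered after creation
      fs.mkdirs(dir);
      fs.deleteOnExit(dir);
    } finally {
      fs.close();                          // triggers deleteOnExit processing
    }

    // assert with the still-open suite filesystem
    FileSystem suiteFs = getFileSystem();
    ContractTestUtils.assertPathDoesNotExist(suiteFs, "deleteOnExit file",
        path("testDeleteOnExit/created-later"));
    ContractTestUtils.assertPathDoesNotExist(suiteFs, "deleteOnExit file",
        path("testDeleteOnExit/existing"));
    ContractTestUtils.assertPathDoesNotExist(suiteFs, "deleteOnExit dir",
        path("testDeleteOnExit/dir"));
  }
}
```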

@huaxiangsun (Contributor, Author)

> The code in s3afs and the tests all look good.
>
> But I think the changes should be restricted to the s3a fs, even if it duplicates a bit of the superclass.
>
> That whole section in the javadocs at the top of the FileSystem class explains why; you've already seen some of those test values, but there are also external implementations which we don't have control of. Adding anything is making a commitment to preserve a new public API forever.

Thanks! Points well taken.

> I would be happier if you just did it all in S3AFileSystem.
>
> Override deleteOnExit(Path f) and:
>
> 1. skip the exists() check, because it doesn't do the right thing if you call the method while writing a file (the file isn't visible until close); it saves HEAD/LIST probes too.
> 2. save the list to a set local to the s3a fs; you could make its getter protected so that a mock test can access it.

Got it: for the s3a fs, the deleteOnExit set is going to live in S3AFileSystem.

> Now, what about an integration test too?

Will try to add an IT.

> Add a test case which creates a new fs in the test case (keeping the normal getFileSystem() fs for assertions) and:
>
> • adds a file which doesn't exist
> • adds a file which doesn't exist, then creates it
> • adds a file which does exist
> • adds a directory path
> • closes the fs and uses ContractTestUtils methods to assert the files and dirs are gone
>
> Seem good?

Sounds good to me, thanks for the feedback. I may come back to you about the test case; going to check it out first myself.

@steveloughran (Contributor)

Yes; look at some of the other ITest cases in the s3a code.

If you aren't set up to run the hadoop-aws test suite, get set up to do that, as we require anyone submitting a PR to run the tests themselves first. Getting them to work in your IDE is best for debugging.

@huaxiangsun (Contributor, Author)

Thanks for the info on the ITests for s3a. I did run the ITs for s3a before submitting the patch, but have not tried running them in the IDE yet. Let me try.

@huaxiangsun (Contributor, Author) commented Aug 2, 2022

Thanks @steveloughran for the help. I pushed a new patch, which I hope addresses your comments:

  1. Undid the changes in FileSystem.java; the code is duplicated in S3AFileSystem instead.
  2. Incorporated some of your changes from HADOOP-13611.
  3. Added a new s3a integration test case for deleteOnExit.

I have run all the integration tests under hadoop-aws. Here are the results.

Failsafe Report summary (published 2022-08-02, version 3.4.0-SNAPSHOT):

Tests  Errors  Failures  Skipped  Success Rate  Time
1253   7       0         270      77.893%       3,301.515

The errored tests are the same ones that fail on the trunk branch without the patch; I think there is some misconfiguration in my environment. They are listed below.

testSTS (7.277s):
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.

testSessionTokenExpiry:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.

testSessionRequestExceptionTranslation:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.

testSessionTokenPropagation:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.

testInvalidSTSBinding:
Unable to find a region via the region provider chain. Must provide an explicit region in the builder or setup environment to supply a region.

testCommonCrawlLookup[1] (0.234s):
s3a://hbase-test-data/fork-0002/test: getFileStatus on s3a://hbase-test-data/fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you provided does not exist in our records. (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: ZANXYYBWFXQN5C9D; S3 Extended Request ID: Se9ShKMS7T2H4oZb6kFEl+gUJFOIV9fgmnrzISp5NZi5QlbSLQQZdt6iRVbpKpRewK32iPominY=; Proxy: null), S3 Extended Request ID: Se9ShKMS7T2H4oZb6kFEl+gUJFOIV9fgmnrzISp5NZi5QlbSLQQZdt6iRVbpKpRewK32iPominY=:InvalidAccessKeyId

testJobSubmissionCollectsTokens[1] (0.218s):
s3a://hbase-test-data/fork-0002/test: getFileStatus on s3a://hbase-test-data/fork-0002/test: com.amazonaws.services.s3.model.AmazonS3Exception: The AWS Access Key Id you provided does not exist in our records. (Service: Amazon S3; Status Code: 403; Error Code: InvalidAccessKeyId; Request ID: ZANK83F5X125CJB4; S3 Extended Request ID: EaB07gY+S8pH7WObAt7xPaqdX83XDtbX61qmDnoEsVyMLbC9IiDuyx5yKZpRS4q/Tkgjtcx6qzs=; Proxy: null), S3 Extended Request ID: EaB07gY+S8pH7WObAt7xPaqdX83XDtbX61qmDnoEsVyMLbC9IiDuyx5yKZpRS4q/Tkgjtcx6qzs=:InvalidAccessKeyId

@huaxiangsun (Contributor, Author)

I also ran all the unit test cases under hadoop-aws; they passed.
[INFO] Results:
[INFO]
[WARNING] Tests run: 407, Failures: 0, Errors: 0, Skipped: 4
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:37 min
[INFO] Finished at: 2022-08-02T10:29:52-07:00
[INFO] ------------------------------------------------------------------------

@huaxiangsun (Contributor, Author)

@steveloughran Sorry to bother you while you are busy; just a gentle reminder about the patch I posted a couple of days ago. I'd appreciate it if you could review it when you get a chance, thanks.

@steveloughran (Contributor)

Yes, sorry, been neglecting this.

@steveloughran (Contributor) left a comment

All the production code is good; some minor changes to the tests and it's ready for the next Hadoop release.

@huaxiangsun huaxiangsun closed this Aug 9, 2022
@huaxiangsun huaxiangsun reopened this Aug 9, 2022
@huaxiangsun (Contributor, Author)

I also ran the new IT test without the change in S3AFileSystem; it failed as expected.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 53s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 43m 21s trunk passed
+1 💚 compile 0m 43s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 compile 0m 39s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 checkstyle 0m 35s trunk passed
+1 💚 mvnsite 0m 45s trunk passed
+1 💚 javadoc 0m 32s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 32s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 19s trunk passed
+1 💚 shadedclient 21m 13s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 31s the patch passed
+1 💚 compile 0m 36s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javac 0m 36s the patch passed
+1 💚 compile 0m 29s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 javac 0m 29s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 19s the patch passed
+1 💚 mvnsite 0m 35s the patch passed
+1 💚 javadoc 0m 16s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 24s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 5s the patch passed
+1 💚 shadedclient 19m 41s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 29s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 38s The patch does not generate ASF License warnings.
98m 37s
Subsystem Report/Notes
Docker ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/7/artifact/out/Dockerfile
GITHUB PR #4608
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 98c795ea550b 4.15.0-169-generic #177-Ubuntu SMP Thu Feb 3 10:50:38 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / fb2ae77
Default Java Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/7/testReport/
Max. process+thread count 554 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/7/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 1m 8s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 41m 57s trunk passed
+1 💚 compile 0m 58s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 compile 0m 49s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 checkstyle 0m 41s trunk passed
+1 💚 mvnsite 0m 53s trunk passed
+1 💚 javadoc 0m 37s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 39s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 39s trunk passed
+1 💚 shadedclient 24m 33s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 38s the patch passed
+1 💚 compile 0m 45s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javac 0m 45s the patch passed
+1 💚 compile 0m 33s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 javac 0m 32s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 22s the patch passed
+1 💚 mvnsite 0m 39s the patch passed
+1 💚 javadoc 0m 19s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 31s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 23s the patch passed
+1 💚 shadedclient 24m 6s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 47s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 44s The patch does not generate ASF License warnings.
107m 8s
Subsystem Report/Notes
Docker ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/6/artifact/out/Dockerfile
GITHUB PR #4608
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 107e0a86a207 4.15.0-166-generic #174-Ubuntu SMP Wed Dec 8 19:07:44 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / fb2ae77
Default Java Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/6/testReport/
Max. process+thread count 597 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/6/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

🎊 +1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 1m 6s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
+1 💚 test4tests 0m 0s The patch appears to include 2 new or modified test files.
_ trunk Compile Tests _
+1 💚 mvninstall 44m 2s trunk passed
+1 💚 compile 1m 4s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 compile 0m 53s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 checkstyle 0m 49s trunk passed
+1 💚 mvnsite 1m 3s trunk passed
+1 💚 javadoc 0m 44s trunk passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 49s trunk passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 33s trunk passed
+1 💚 shadedclient 27m 19s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 42s the patch passed
+1 💚 compile 0m 43s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javac 0m 43s the patch passed
+1 💚 compile 0m 36s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 javac 0m 36s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 27s the patch passed
+1 💚 mvnsite 0m 44s the patch passed
+1 💚 javadoc 0m 22s the patch passed with JDK Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1
+1 💚 javadoc 0m 33s the patch passed with JDK Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
+1 💚 spotbugs 1m 22s the patch passed
+1 💚 shadedclient 26m 36s patch has no errors when building and testing our client artifacts.
_ Other Tests _
+1 💚 unit 2m 59s hadoop-aws in the patch passed.
+1 💚 asflicense 0m 53s The patch does not generate ASF License warnings.
116m 23s
Subsystem Report/Notes
Docker ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/8/artifact/out/Dockerfile
GITHUB PR #4608
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets
uname Linux 02e1a93d5d2e 4.15.0-175-generic #184-Ubuntu SMP Thu Mar 24 17:48:36 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / fb2ae77
Default Java Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Multi-JDK versions /usr/lib/jvm/java-11-openjdk-amd64:Private Build-11.0.15+10-Ubuntu-0ubuntu0.20.04.1 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_312-8u312-b07-0ubuntu1~20.04-b07
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/8/testReport/
Max. process+thread count 607 (vs. ulimit of 5500)
modules C: hadoop-tools/hadoop-aws U: hadoop-tools/hadoop-aws
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-4608/8/console
versions git=2.25.1 maven=3.6.3 spotbugs=4.2.2
Powered by Apache Yetus 0.14.0 https://yetus.apache.org

This message was automatically generated.

@steveloughran (Contributor) left a comment
+1, merging to trunk. Will backport with a retest. Patches which only touch hadoop-aws are always less risky than anything which changes hadoop-common, but I should test the changes before pushing to branch-3.3.

@steveloughran steveloughran changed the title HADOOP-18340 deleteOnExit does not work with S3AFileSystem HADOOP-18340. deleteOnExit does not work with S3AFileSystem Aug 11, 2022
@steveloughran steveloughran merged commit e9509ac into apache:trunk Aug 11, 2022
asfgit pushed a commit that referenced this pull request Aug 11, 2022
@huaxiangsun (Contributor, Author)

Thanks @steveloughran for the review, really nice comments!

HarshitGupta11 pushed a commit to HarshitGupta11/hadoop that referenced this pull request Nov 28, 2022