
fix S3 handling of special character and trailing slash #9143

Merged
bentsku merged 2 commits into master from fix-s3-percent-key on Sep 14, 2023

Conversation

Contributor

@bentsku bentsku commented Sep 13, 2023

Motivation

This is a follow-up to the Werkzeug upgrade and the moto bump adapting to those changes, following the reported issue #9112.

Moto now has specific handling of special characters to maintain compatibility between mocked botocore requests and werkzeug requests. We've had issues when accessing moto's backend directly to fetch a key with the proper key name from the request.

I first suspected moto, tried working on a fix there and wrote some tests, but those tests passed in moto while still failing in LocalStack.

I've tracked it down to how we pass the request with call_moto: we would use request.url, which is the decoded URL, and moto would then use it to create its keys internally.

Some examples of the issues:

Passing decoded request URL to moto:

Raw URL: http://s3.localhost.localstack.cloud:4566/test-bucket-0ff73618/test%2540key/
request.url passed to moto: http://s3.localhost.localstack.cloud:4566/test-bucket-0ff73618/test@key/
Key name in moto: test@key/ when it should be test%40key/
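
To make this concrete, here is a minimal standalone sketch (plain urllib.parse, not the actual LocalStack or moto code) showing how one extra round of decoding turns the intended key test%40key/ into test@key/:

```python
from urllib.parse import unquote

# raw path as sent on the wire: the key "test%40key/" with its "%" encoded as "%25"
raw_path = "/test-bucket-0ff73618/test%2540key/"

decoded_once = unquote(raw_path)       # "/test-bucket-0ff73618/test%40key/" -> correct key
decoded_twice = unquote(decoded_once)  # "/test-bucket-0ff73618/test@key/"   -> wrong key
```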

Trailing slashes (once the issue above was fixed)

Raw path: /test-bucket-18ba44f6/test%2540key/
Parsed key in the parser: test%40key (missing the trailing slash, although it was properly parsed in moto)
base_url: http://s3.localhost.localstack.cloud:4566/test-bucket-18ba44f6/test@key/
The check for base_url.endswith(key) does not work here, so the trailing slash isn't added.
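
A small illustration of the failing check, using the values from the example above (the variable names are only illustrative):

```python
base_url = "http://s3.localhost.localstack.cloud:4566/test-bucket-18ba44f6/test@key/"
key = "test%40key"  # as returned by the parser, trailing slash already lost

# the encodings differ, so neither check matches and the trailing slash is never restored
print(base_url.endswith(key))        # False
print(base_url.endswith(key + "/"))  # False
```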

Changes

Moto needs the raw URL, so we now use get_full_raw_path to get the original request URL, so that moto can apply its own logic to decode the URL and derive the proper key name from it.
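
Conceptually, the fix boils down to rebuilding the URL handed to moto from the raw, still percent-encoded path instead of the decoded request.url. A rough sketch of the idea, not the actual diff (the exact import path and the surrounding call_moto plumbing are assumed here):

```python
from localstack.http.request import get_full_raw_path  # assumed import path

def build_moto_url(request) -> str:
    # keep the percent-encoded path and query so moto performs the (single) decoding itself
    raw_path_and_query = get_full_raw_path(request)
    return f"{request.scheme}://{request.host}{raw_path_and_query}"

# e.g. http://s3.localhost.localstack.cloud:4566/test-bucket-0ff73618/test%2540key/
```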

I've also thought of another scenario that could be problematic, involving special characters and trailing slashes, which was already an issue before (this code has changed often, every time werkzeug changes its URL handling), but sadly there isn't a catch-all way to check whether the request has a trailing slash that needs to be kept.

I also took the opportunity to fix the key returned by the various List* operations in the new provider, with some AWS-validated tests, and removed some TODOs.

@bentsku bentsku added the aws:s3 (Amazon Simple Storage Service), area: asf, and semver: patch (non-breaking changes which can be included in patch releases) labels on Sep 13, 2023
@bentsku bentsku self-assigned this Sep 13, 2023
@bentsku bentsku linked an issue Sep 13, 2023 that may be closed by this pull request
@bentsku bentsku added this to the 2.3 milestone Sep 13, 2023

github-actions bot commented Sep 13, 2023

LocalStack Community integration with Pro

2 files, 2 suites — 1h 12m 12s ⏱️
2 206 tests: 1 709 ✔️ passed, 497 💤 skipped, 0 failed
2 207 runs: 1 709 ✔️ passed, 498 💤 skipped, 0 failed

Results for commit 26e8226.

♻️ This comment has been updated with latest results.

@coveralls

Coverage Status

coverage: 80.134% (+0.01%) from 80.121% when pulling 26e8226 on fix-s3-percent-key into fa10d1d on master.

Member

@alexrashed alexrashed left a comment


Awesome, great catch! 🚀

@bentsku bentsku merged commit 6a192a7 into master Sep 14, 2023
27 checks passed
@bentsku bentsku deleted the fix-s3-percent-key branch September 14, 2023 09:59
Development

Successfully merging this pull request may close these issues.

bug: Special Character, particularly '%' makes S3 gives error
3 participants