When using s3 as a repo, rpm packages having special characters are not getting URL encoded #143
Can you point us to a repo example?
@dmacvicar I don't have any public s3 repo to point to, but I can provide an example:
A '+' in the URL path is a safe character and does not need to be encoded. A download with '+' or '%2B' must deliver the same result. @frezbo Without at least the
@mlandres thanks for the clarification, but this issue is specific to s3. S3 requires all special characters to be encoded.
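The disagreement between the two comments above can be illustrated with Python's `urllib.parse.quote` (a sketch of the two encoding conventions only; this is not libzypp's actual URL class, and the package path is made up):

```python
from urllib.parse import quote

# Hypothetical package path containing '+' characters.
path = "/repo/x86_64/gcc-c++-13.2.1-1.x86_64.rpm"

# RFC 3986 style: '+' is a valid path character and may stay literal.
# This matches the libzypp behaviour described above.
rfc_style = quote(path, safe="/+")

# S3 style: '+' (like other special characters) must be percent-encoded
# in the object key, otherwise S3 denies the request.
s3_style = quote(path, safe="/")

print(rfc_style)  # /repo/x86_64/gcc-c++-13.2.1-1.x86_64.rpm
print(s3_style)   # /repo/x86_64/gcc-c%2B%2B-13.2.1-1.x86_64.rpm
```

Both forms name the same file per RFC 3986, but S3 only accepts the second one.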
I found a few hints about this here, here and also in the AWS documentation. It looks like S3 is using an incorrect variant of URL-escaping, which we do not support. The So you could rename the affected files before creating the repo metadata and uploading the files to S3. I will see whether we can support the broken S3 escaping in libzypp, but this may take a while. We cannot simply advise our URL class to encode the
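The renaming workaround suggested above could look roughly like this (a hypothetical sketch: `REPO_DIR` and the `_` replacement character are assumptions, and the renaming must happen *before* the repo metadata is generated so it references the sanitized names):

```python
import os
import re

REPO_DIR = "repo"  # hypothetical local staging directory


def sanitize(name: str) -> str:
    """Replace characters that S3 mishandles ('+', spaces) with '_'."""
    return re.sub(r"[+ ]", "_", name)


# Rename affected .rpm files before running createrepo and uploading,
# so the generated repodata points at the sanitized file names.
if os.path.isdir(REPO_DIR):
    for entry in os.listdir(REPO_DIR):
        clean = sanitize(entry)
        if clean != entry:
            os.rename(os.path.join(REPO_DIR, entry),
                      os.path.join(REPO_DIR, clean))
```

Note that this changes the package file names (e.g. `gcc-c++` becomes `gcc-c__`), which may be undesirable if anything else keys off the original names.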
@sclausson: As @dmacvicar pointed out, libzypp itself needs to be modified first. The Once this is fixed, one could provide a python plugin script signing the URLs. Such a plugin could also work around S3's broken URL encoding requirements (#143).
Closing this issue in favor of #96. An enhanced urlresolver plugin would be able to handle the additional escapes required for S3. |
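The core transformation such a plugin would need could be as small as re-encoding the path component (a hypothetical sketch only; the actual zypp urlresolver plugin protocol discussed in #96 is not shown, and `s3_encode_url` is a made-up helper name):

```python
from urllib.parse import quote, unquote, urlsplit, urlunsplit


def s3_encode_url(url: str) -> str:
    """Re-encode a repo URL's path so that '+' and other special
    characters are percent-encoded the way S3 expects."""
    parts = urlsplit(url)
    # Decode first so already-escaped sequences are not double-encoded,
    # then re-encode with only '/' kept literal.
    path = quote(unquote(parts.path), safe="/")
    return urlunsplit((parts.scheme, parts.netloc, path,
                       parts.query, parts.fragment))


# Example: '+' in the package name becomes '%2B' before the request
# is sent to S3.
print(s3_encode_url("https://bucket.s3.amazonaws.com/repo/gcc-c++.rpm"))
```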
I am using s3 as our repository, so if any packages have special characters in their file names, the download fails: zypper does not URL-encode the special characters and S3 denies the download.