[Fix] s3 ls command fails when UTF-8 is piped
#1844
Merged
Piping the output of the `aws s3 ls` command on a bucket whose keys contain UTF-8 characters raises an exception, exits the program, and does not list all files. E.g.

`aws s3 ls s3://<BUCKET-WITH-UTF8> | wc -l`

prints the error `encode() argument 1 must be string, not None` and exits upon reaching the UTF-8 key.

It fails because `sys.stdout.encoding` is `None` when stdout is being piped.
.I am aware that the
PYTHONIOENCODING
environment variable can be set in order to change that, but it seems that the current code is trying to default toascii
and simply failing to default: awscli/customizations/s3/utils.py, line 398 (latest commit on develop)
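As a sketch of the direction a fix could take (not the actual patch in this PR; `safe_uni_print` is a hypothetical name), the printing helper could fall back to a concrete encoding whenever the stream reports none:

```python
import sys


def safe_uni_print(statement, out_file=None):
    """Write a unicode string to out_file (default: stdout), falling back
    to UTF-8 when the stream reports no encoding, e.g. when stdout is
    piped under Python 2."""
    if out_file is None:
        out_file = sys.stdout
    # getattr covers file-like objects with no ``encoding`` attribute at all;
    # ``or 'utf-8'`` covers streams where the attribute exists but is None.
    encoding = getattr(out_file, 'encoding', None) or 'utf-8'
    data = statement.encode(encoding)
    # Python 3 text streams expose the underlying byte stream as .buffer;
    # Python 2 file objects accept bytes directly.
    if hasattr(out_file, 'buffer'):
        out_file.buffer.write(data)
    else:
        out_file.write(data)
```

As an interim workaround, running the command as `PYTHONIOENCODING=utf-8 aws s3 ls s3://<BUCKET-WITH-UTF8> | wc -l` sidesteps the crash by giving stdout an explicit encoding.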