UnicodeDecodeError 'utf8' codec GZIP, S3, Cloudfront #404
Comments
Should gzipped files be uploaded with a .gz extension? I ask because, for me, they aren't with the setup above. It seems that with this type of backend the compressed files are saved without the .gz extension. For some reason this then causes:

```
CommandError: An error occured during rendering /home/user/workspace/noshlyfs/templates/theme_bootstrap/less_base.html: UnicodeDecodeError while processing '/home/user/workspace/noshlyfs/static_root/cache/js/a1576bd8c653.js' with charset utf-8: 'utf8' codec can't decode byte 0x8b in position 1: invalid start byte
```
Seems this is only happening with the … command. Not sure what is up with it, but while I've got things working, I believe this is still a bug, so I'll leave the issue open.
I spent some time getting this to work; what it took was this modification to the CachedS3BotoStorage class. Note the difference in the save method.

The problem is that the _save method of S3BotoStorage modifies content.file, gzipping it, as a side effect of its invocation when AWS_IS_GZIPPED = True. When the file is then saved locally after being saved to S3, the content is already gzipped, and when compressor tries to read it back from local disk it doesn't know what to do with it. In general this sort of side effect is bad; having _save in S3BotoStorage not modify content would be ideal, but at least there is a workaround. I will submit a pull request to update the docs to use this version of CachedS3BotoStorage.
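Putting the thread together, the modified class looks roughly like this. It is a sketch based on the CachedS3BotoStorage snippet in the django-compressor docs plus the save method quoted later in this thread, not necessarily the exact code from the eventual pull request:

```python
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class CachedS3BotoStorage(S3BotoStorage):
    """S3 storage backend that also keeps a local copy for compressor."""

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        # S3BotoStorage._save() gzips content.file in place when
        # AWS_IS_GZIPPED = True, so keep a reference to the original,
        # non-gzipped file object first.
        non_gzipped_file_content = content.file
        name = super(CachedS3BotoStorage, self).save(name, content)
        # Restore the plain content before writing the local copy that
        # compressor will read back from disk.
        content.file = non_gzipped_file_content
        self.local_storage._save(name, content)
        return name
```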
This fixes the problem when compressing after pushing to S3. Implemented based on this comment on the issue: django-compressor/django-compressor#404 (comment)
@vinaytota I spent some time trying your fix, and had no luck here. I ended up doing a hack to uncompress gzip-encoded files, if needed, before consumption. See here: ulyssesv/django-cached-s3-storage#2. Kinda spartan, but it finally worked, and it's very clear. No more "UnicodeDecodeError 'utf8' codec" errors on my side. Is there something nasty that I haven't noticed with this approach?
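That pull request isn't reproduced here, but the idea, gunzipping the locally cached copy before compressor reads it, can be sketched roughly as follows (a hypothetical helper, not the actual code from that patch):

```python
import gzip
from io import BytesIO


def maybe_gunzip(data):
    """Return data decompressed if it is a gzip stream, otherwise unchanged."""
    # Gzip streams start with the magic bytes 0x1f 0x8b.
    if data[:2] == b"\x1f\x8b":
        return gzip.GzipFile(fileobj=BytesIO(data)).read()
    return data
```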
@vinaytota why not just do the local_storage call first? Are there other side effects I might be missing?

```python
def save(self, name, content):
    self.local_storage._save(name, content)
    return super(CachedS3BotoStorage, self).save(name, content)
```

EDIT: Looks like my proposed simplification leads to an empty file getting stored on S3, so stick with @vinaytota's approach:

```python
def save(self, name, content):
    non_gzipped_file_content = content.file
    name = super(CachedS3BotoStorage, self).save(name, content)
    content.file = non_gzipped_file_content
    self.local_storage._save(name, content)
    return name
```
I'm trying to get GZIP working with django-storages, using S3 with CloudFront. Everything is working except GZIP. I did have it working with the compressor CSS file, but the JS file never seems to compress.
Similar/related to:
Comments of interest
pip freeze
which is django-compressor 1.3 (develop branch)
(I also tried django-storages==1.1.5 first)
Storages
settings.py
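A hypothetical minimal example of the kind of settings being discussed; the values and the storage module path are assumptions, not the actual configuration from this report:

```python
# Assumed example settings, not the reporter's real values.
AWS_IS_GZIPPED = True  # makes S3BotoStorage gzip files as it uploads them

# Custom backend from the django-compressor docs (module path is hypothetical).
STATICFILES_STORAGE = "myproject.storage.CachedS3BotoStorage"
COMPRESS_STORAGE = STATICFILES_STORAGE

COMPRESS_ENABLED = True
COMPRESS_OFFLINE = True

STATIC_URL = "https://example.cloudfront.net/"
COMPRESS_URL = STATIC_URL
```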
Running:
python manage.py compress --force
When I have 'this line' commented or uncommented I get: