
How do you send gzipped files to S3? django-storages 1.1.6? #368

Open
kevinharvey opened this issue Mar 5, 2013 · 11 comments


@kevinharvey

I'm REALLY CLOSE to getting this working with django-storages 1.1.4 with the following settings:

AWS_ACCESS_KEY_ID = 'MYKEY'
AWS_SECRET_ACCESS_KEY = 'MYSECRET'

AWS_IS_GZIPPED = True

from S3 import CallingFormat
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'myapp.storage.CachedS3BotoStorage'
COMPRESS_STORAGE = STATICFILES_STORAGE

AWS_STORAGE_BUCKET_NAME = 'my-bucket'
AWS_S3_CUSTOM_DOMAIN = AWS_STORAGE_BUCKET_NAME + '.s3.amazonaws.com'
AWS_CALLING_FORMAT = CallingFormat.SUBDOMAIN
AWS_S3_SECURE_URLS = True
AWS_S3_FILE_OVERWRITE = False

COMPRESS_URL = STATIC_URL = 'https://' + AWS_S3_CUSTOM_DOMAIN + '/'

myapp/storage/CachedS3BotoStorage.py is set up as described in Issue 100. I run compress with the following command:

python manage.py compress --force --extension='.djhtml'

This creates appropriately gzipped files in myapp/static/CACHE and uploads files to S3 with the correct names and CONTENT-ENCODING metadata, but the files on S3 are empty (0 KB). I can manually upload these files to S3, reset the metadata and permissions, and everything works.

I noticed that @jezdez had committed a bunch of code to django-storages (much of which dealt with gzipping), so I tried updating to 1.1.6. Running the same command produced a hash-named file for every one of my source files, but did not minify or compress them in any way and did not push any files to S3.

Could this issue be addressed in django-storages 1.1.6? If so, is there any documentation around that? I'd be happy to help update the documentation if someone can point me to the code that will help me figure this out.

@m-misseri

Same issue here: my files on S3 are 0 KB when gzipped with AWS_IS_GZIPPED = True.

@nkeilar

nkeilar commented May 21, 2013

I'm also having issues, but I can't identify exactly where the problem is yet.

@nkeilar

nkeilar commented May 21, 2013

@bedspax @kcharvey Did you guys figure out a solution?

@m-misseri

No dude @madteckhead

@kevinharvey

Not yet, was planning on addressing it in my deployment scripts outside of django_compressor @madteckhead

@nrvnrvn

nrvnrvn commented May 30, 2013

@madteckhead @bedspax @kcharvey It seems to work for me with sekizai and {% render_block "css" postprocessor "compressor.contrib.sekizai.compress" %} on the fly, but I didn't try python manage.py compress --force. If it's of interest, I can post my settings.

@epicserve

When using AWS_IS_GZIPPED = True with the following relevant libraries and settings, django_compressor stops compressing CSS and JS.

# Relevant Libraries
django-compressor==1.3
Django==1.5.1
django-storages==1.1.8
boto==2.9.5
# Relevant Settings
STATIC_URL = local_settings.STATIC_URL
MEDIA_URL = local_settings.MEDIA_URL
DEFAULT_FILE_STORAGE = 'utils.s3utils.MediaRootS3BotoStorage'
STATICFILES_STORAGE = 'utils.s3utils.StaticRootS3BotoStorage'
THUMBNAIL_DEFAULT_STORAGE = DEFAULT_FILE_STORAGE
AWS_S3_SECURE_URLS = False
AWS_QUERYSTRING_AUTH = False
AWS_ACCESS_KEY_ID = local_settings.AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY = local_settings.AWS_SECRET_ACCESS_KEY
AWS_STORAGE_BUCKET_NAME = local_settings.AWS_BUCKET_NAME
AWS_PRELOAD_METADATA = True
AWS_IS_GZIPPED = False
AWS_HEADERS = {
    'Expires': 'Thu, 19 Apr 2040 20:00:00 GMT',
    'Cache-Control': 'max-age=86400',
}

# DJANGO COMPRESSOR SETTINGS
COMPRESS_URL = local_settings.STATIC_URL
COMPRESS_STORAGE = STATICFILES_STORAGE
# s3utils.py
from storages.backends.s3boto import S3BotoStorage

StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')

@mateuspadua

I fixed this problem as follows.

The problem is that the same content parameter is used both for the S3 upload and for the local save. When content is passed to super(CachedS3BotoStorage, self).save(name, content), it is modified internally: with AWS_IS_GZIPPED enabled, the file inside content is gzip-compressed. So by the time it reaches the next save, self.local_storage.save(name, content), it is in a different state.
To fix this, simply make a copy of the content parameter first.

In settings.py

STATICFILES_STORAGE = 'mypackage.s3utils.CachedS3BotoStorage'

in mypackage.s3utils.py

from storages.backends.s3boto import S3BotoStorage
from compressor.storage import CompressorFileStorage
from django.core.files.storage import get_storage_class
import copy

class CachedS3BotoStorage(S3BotoStorage):
    """
    S3 storage backend that saves the files locally, too.
    """

    location = 'static'

    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def url(self, name):
        """
        Keep the trailing slash on directory-style names so images
        referenced from the Django admin resolve correctly.
        """
        url = super(CachedS3BotoStorage, self).url(name)
        if name.endswith('/') and not url.endswith('/'):
            url += '/'
        return url

    def save(self, name, content):
        # Copy first: the S3 save below gzips/consumes the original content.
        content2 = copy.copy(content)
        name = super(CachedS3BotoStorage, self).save(name, content)
        # Save the untouched copy locally for django_compressor.
        self.local_storage._save(name, content2)
        return name

    def get_available_name(self, name):
        if self.exists(name):
            self.delete(name)
        return name
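The need for the copy can be illustrated without Django at all: once the upstream save reads (or gzips) the file object, its position sits at EOF, so a second save of the same object writes nothing, matching the 0 KB symptom reported above. A minimal sketch, using io.BytesIO as a stand-in for the uploaded file object:

```python
import copy
import io

# Stand-in for the file object the storage backend receives in save().
original = io.BytesIO(b"body { color: red }")
duplicate = copy.copy(original)  # independent buffer and position

# The first save consumes the stream...
assert original.read() == b"body { color: red }"
# ...so a second save of the same object would write 0 bytes.
assert original.read() == b""
# The copy made beforehand is still fully readable.
assert duplicate.read() == b"body { color: red }"
```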

@bartmika

Thanks @mateuspadua! Your fix worked for me!

@mateuspadua

:)

@shubug

shubug commented Jan 6, 2017

Can I change the path where my compressed files are stored on S3? Right now they are stored in a separate 'CACHE' folder in my bucket. How can I make them go inside the static folder of my bucket, i.e. my_bucket/static/CACHE/? Please suggest a way.
