Gzip files not being used from cloudfront #153

radanskoric opened this Issue · 14 comments

7 participants


Using the asset_sync gem with CloudFront the way it is described in the README means that the gzipped versions of the assets will never be used. Rails generates normal asset URLs (ending in .js, .css, ...), the browser requests them with Accept-Encoding: gzip, and CloudFront just serves the uncompressed version of the file.

You can see this if you monitor the traffic on your machine, and it is explained in the documentation here:

For the gzipped versions to be used, the Rails app needs to be modified to generate asset URLs pointing at the .gz version when the client supports gzipped assets, and asset_sync also needs to set the Content-Type in the S3 metadata of the gzipped version to match the original file type (application/javascript, text/css, etc.) instead of the gzip file type.

Am I missing something? Can somebody confirm that they have gzipped assets being served from CloudFront with an app set up as described in the asset_sync README?

It seems to me this is a big deal, since in most cases downloading a gzipped version from the origin will be faster than downloading the ungzipped version from a CDN edge location.
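To make the second requirement concrete, here is a small pure-Ruby sketch of the metadata a pre-gzipped object would need on S3; the helper name and MIME table are illustrative, not asset_sync code:

```ruby
# Illustrative helper (my names, not asset_sync's): the headers a
# pre-gzipped asset needs when uploaded to S3 under its plain name.
CONTENT_TYPES = {
  ".js"  => "application/javascript",
  ".css" => "text/css",
}.freeze

def s3_metadata_for(gz_path)
  plain = gz_path.sub(/\.gz\z/, "")
  {
    # Content-Type must match the original file type, not gzip...
    "Content-Type"     => CONTENT_TYPES.fetch(File.extname(plain), "application/octet-stream"),
    # ...while Content-Encoding tells the browser to decompress the body.
    "Content-Encoding" => "gzip",
  }
end
```

With those two headers in place, a request for styles.css can be answered with gzipped bytes that the browser transparently decompresses.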


I have definitely confirmed that the gzipped files are not being served.

Using gzip_compression = true would probably help; however, it is not ideal, since you are then making the assumption that all clients will accept gzip encoding.

The solution does exist. I'm now using a CloudFront distribution with a custom origin, taking advantage of the behavior described here:

Basically, you first make sure your Rails app is correctly serving gzipped assets. I used code from this gist:

Then you just point the CloudFront distribution at your app and use it as the asset host.

This solution doesn't require asset_sync to work. It won't cause problems if you want to use asset_sync for other reasons, but it is not necessary.
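The gist-based approach boils down to a Rack middleware along these lines. This is a simplified sketch modeled on what the linked gist does, not its exact code; the class name, constructor arguments, and MIME table are mine:

```ruby
# Serve a pre-gzipped .gz sibling under the plain asset name when the
# client advertises gzip support, keeping the original Content-Type.
class CompressedStaticAssets
  def initialize(app, asset_root, asset_prefix)
    @app = app
    @asset_root = asset_root      # e.g. the app's public/ directory
    @asset_prefix = asset_prefix  # e.g. "/assets"
  end

  def call(env)
    path = env["PATH_INFO"]
    if gzip_accepted?(env) && path.start_with?(@asset_prefix) && gz_exists?(path)
      status, headers, body = @app.call(env.merge("PATH_INFO" => "#{path}.gz"))
      # Serve under the requested name: real Content-Type for the plain
      # file, plus Content-Encoding so the browser decompresses the body.
      headers["Content-Type"] = content_type_for(path)
      headers["Content-Encoding"] = "gzip"
      [status, headers, body]
    else
      @app.call(env)
    end
  end

  private

  def gzip_accepted?(env)
    env["HTTP_ACCEPT_ENCODING"].to_s.split(",").any? { |enc| enc.strip.start_with?("gzip") }
  end

  def gz_exists?(path)
    File.file?(File.join(@asset_root, "#{path}.gz"))
  end

  def content_type_for(path)
    { ".css" => "text/css", ".js" => "application/javascript" }
      .fetch(File.extname(path), "application/octet-stream")
  end
end
```

Mounted early in the stack (and in front of any cache), it answers a request for styles.css with the bytes of styles.css.gz whenever the client sends Accept-Encoding: gzip, and passes everything else through untouched.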


I'm having a similar problem, but I think your solution is the wrong way around.

So, I have gzip_compression = true set. However, it looks like asset_sync is still uploading both versions of my files to S3: say, styles.css and styles.css.gz. A client request for styles.css to CloudFront or S3 with Accept-Encoding: gzip causes CloudFront to send back the non-gzipped styles.css rather than the gzipped styles.css.gz. I think this is actually the correct behavior.

Why isn't asset_sync automatically overwriting styles.css with styles.css.gz? The README also suggests that this is the default behavior.

If it matters, I'm using turbo-sprockets.


A-ha! I've figured it out! (at least for me).

So, CloudFront just forwards the Accept-Encoding header on to the origin. So I checked what happened when I requested styles.css from S3... and the uncompressed version was returned! A-ha! So S3 doesn't do what some other CDNs do, which is to automatically serve the gzipped version even though the .css version was requested. asset_sync's default behavior (to circumvent this, I suppose) is to rename the *.css.gz version to *.css, solving the problem.
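That renaming behaviour can be pictured like this; this is a rough sketch of the idea only, not asset_sync's actual implementation, and upload_plan is a name I made up:

```ruby
# For each plain asset, upload the .gz sibling's bytes under the plain
# key with Content-Encoding: gzip, so S3 serves gzipped bytes for a
# request to styles.css. Plain files without a .gz sibling are uploaded
# as-is.
def upload_plan(files)
  files.reject { |f| f.end_with?(".gz") }.map do |plain|
    gz = "#{plain}.gz"
    if files.include?(gz)
      { key: plain, source: gz, content_encoding: "gzip" }
    else
      { key: plain, source: plain, content_encoding: nil }
    end
  end
end
```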

However, my asset_sync install definitely wasn't doing that: it was uploading both versions, every time. What the hell? Then I noticed a funny line in my output:

AssetSync: using default configuration from built-in initializer

Oh! AssetSync wasn't recognizing my config anymore! I had moved the file from the standard location (config/initializers/asset_sync.rb) into a subfolder, so asset_sync didn't detect it and was ignoring my gzip_compression config flag. Setting ASSET_SYNC_GZIP_COMPRESSION in my environment worked like a charm, and everything's working properly.
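For reference, a minimal initializer in the standard location that sets the flag; the credential values are placeholders, and the asset_sync README documents the full option list:

```ruby
# config/initializers/asset_sync.rb
AssetSync.configure do |config|
  config.fog_provider          = "AWS"
  config.fog_directory         = ENV["FOG_DIRECTORY"]
  config.aws_access_key_id     = ENV["AWS_ACCESS_KEY_ID"]
  config.aws_secret_access_key = ENV["AWS_SECRET_ACCESS_KEY"]
  # Replace each plain file with its gzipped sibling on upload
  config.gzip_compression      = true
end
```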

So, to answer OP:

  • I'm pretty sure that CloudFront supports compressed files with Accept-Encoding as you described, as long as the origin supports them. Make sure your origin is behaving properly first.
  • It sounds like your Rails app wasn't serving gzipped assets; that is your problem, not asset_sync's.

Also, is there any way to do a more intelligent subfolder search for the asset_sync config? I can't be the only one who organizes initializers into subfolders.


Yes, the problem lies with S3: S3 does nothing with the Accept-Encoding header. Notice that in my comments I mentioned that setting gzip_compression to true would solve the problem. However, you are then making the assumption that all of your clients will be connecting with Accept-Encoding: gzip headers. That might be true, but it is up to you to decide whether that tradeoff is acceptable. I decided it isn't.

That is why I decided to point CloudFront at Heroku as its origin. Now, Rails on Heroku out of the box does not support Accept-Encoding: gzip, because there is no nginx or Apache in front of the Rails server. The gist I linked to in my solution fixes that.

What you might be thinking of when you say that my solution is backwards is that a file named something.css should be served, not something.css.gz. That is indeed correct: the name of the file served should be the name requested, and its Content-Type header should match what was requested. However, the actual content and the Content-Encoding header may be gzip if that is what the client requested. If you look at the gist code containing the middleware that adds the desired behaviour, you will see that is exactly what it does: it inspects the request headers and returns the .gz file, but under its plain name.

I have asset serving working correctly with the approach I described, but my concern is that if someone follows the instructions in the asset_sync README to the letter and everything appears to work as expected, they will end up with a setup where gzipped assets are not being used, and that is a little tricky to notice if you are not explicitly looking for it.


@radanskoric What browsers are you supporting that don't support gzip? That's gotta suck, even IE6 SP1 supports it.

In any case, I don't see an issue with asset_sync's behavior in this case, unless you have a PR to change the docs.


I'm not supporting browsers that don't support gzip, so I should be fine, but a lot of users will be behind campus proxies and firewalls, and I'm just not sure how those might handle the HTTP headers or whether they might alter the Accept-Encoding header. I'd like to be on the safe side and honor it by returning the plain version when gzip is not requested.

No, there's no issue with asset_sync; it works as advertised. It's just that S3 is not the best choice of CloudFront origin in this case. I can set up a PR to change the docs, but it'll have to say something along the lines of: "Without the gzip_compression flag enabled you will always be serving the uncompressed version of the asset, but if all you want is to serve assets from CloudFront, it's easier not to use this gem and instead do it like this ...."

That is why I opened this issue, to first check if I'm missing something.


I am trying to implement the solution described by @radanskoric with this gem (which uses that gist), but something weird is happening.

When caching is enabled (through config.cache_store = :dalli_store), it seems that Heroku always responds without gzip, and the CDN then serves that. If I remove the cache it works; however, I can't do that because it is used by the app.

I thought that setting config.serve_static_assets to false might work, but I read that Heroku overrides that config to true.

Did you face any problem like that?


@brunogh Yes, I had the exact same problem. Basically you need to make sure that the middleware serving the assets is before the cache in the middleware stack. That way the Rack::Cache middleware will never get hit for static assets.

Since I'm setting up the middleware manually, this is what I'm doing in my config/environments/production.rb:

  # Serve pre-gzipped static assets
  config.middleware.insert_before(
    "Rack::Cache", Middleware::CompressedStaticAssets,
    paths["public"].first, config.assets.prefix,
    { 'Cache-Control' => "public, max-age=31536000" })

The middleware is built from the gist I mentioned in the OP.
Hope that helps.


Awesome! Thanks @radanskoric!

In my case I am using heroku-deflater, which has this init (, so should I put "Rack::Cache" after ActionDispatch::Static? That HerokuDeflater::ServeZippedAssets is the same as your Middleware::CompressedStaticAssets.


@resdigitais Yes, I think that should be safe to do. If you have a CDN set up for asset files, none of the middleware listed in that heroku-deflater initializer has any need for Rack::Cache, and it actually needs to sit in front of it.

Just to be on the safe side, I suggest you manually inspect the listing of your middleware stack for the production environment to make sure none of the middleware concerning dynamic requests has ended up in front of Rack::Cache.


Cheers! I gave up on the gem, actually, because I was having problems with the Rack version, and did the same as you described:

use Middleware::CompressedStaticAssets
use Rack::Cache
use ActionDispatch::Static


Closing, because this is a complete misunderstanding of the gzip setting in asset_sync. asset_sync allows for this.

Because S3 will not vary its response on the Accept-Encoding header, we need to replace the .css file with the contents of the .css.gz file, thereby staying absolutely transparent to the Rails application. Just because a file extension is .css does not mean it is not gzipped. Check your headers.

@davidjrice closed this

What is currently the best way to serve gzipped versions of assets from S3 with Rails 4 on Heroku? The default installation and usage does not serve gzip files from the S3 bucket. I tried heroku-deflater, but it doesn't work as expected. Thanks!


@tzoro heroku-deflater is useful if you serve assets from your app, I guess, but it has no effect if you use S3.
From my understanding, asset_sync (when gzip_compression is true) detects whether there is a file with the same name but a .gz suffix and uploads that to the same path while setting Content-Encoding to gzip, so that the browser will decompress it before using it.

You can see the README for the config options.

Edit: Crap, I pressed Enter before I finished.
