
Gzip compression #916

Closed
louisjc opened this Issue Jun 24, 2015 · 6 comments

Contributor

louisjc commented Jun 24, 2015

I think the server should serve gzip-compressed static files. I did a quick test on the CSS file, and the result saves 36.8 KB on this single file.

$ gzip -9 3bd78ec57e039f25d4b01686f89d9105.css
$ ls -lh
  45K 3bd78ec57e039f25d4b01686f89d9105.css
  8,2K 3bd78ec57e039f25d4b01686f89d9105.css.gz

I think the best way to do this is:

1. Make a script that creates a .gz file for every .js, .css, and .svg file on the server on push (to avoid spamming this git repository with .gz files).
2. Add "gzip_static on" to the nginx configuration for those files only (to avoid on-the-fly compression of other files).

The "gzip on" directive is simpler to implement, but it compresses each file on the fly, so I think it's a waste of CPU resources (and response time), since those files don't change often.
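For reference, a minimal sketch of the nginx configuration described above; the location pattern and the exact layout are assumptions for illustration, not an actual config:

```nginx
# Serve a pre-compressed sibling .gz file when the client accepts gzip.
# Restricting the location to these extensions means nothing else is
# compressed on the fly.
location ~* \.(css|js|svg)$ {
    gzip_static on;          # look for "file.css.gz" next to "file.css"
    gzip_disable "msie6";    # IE6 mishandles gzipped responses
}
```

gzip_static is provided by nginx's ngx_http_gzip_static_module, which may need to be enabled at build time (--with-http_gzip_static_module).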

Contributor

schildbach commented Jun 24, 2015

I would assume (but don't know) that web servers cache their gzipped versions of static pages. Thus, gzipping on git push feels almost useless to me.

Contributor

saivann commented Jun 24, 2015

@louisjc I think it's possible to write a Jekyll plugin that automatically makes a gzip copy of all .css and .js files when the site is built, in case you're interested. It's worth noting that static content is only loaded once (aggressive caching), so compressing static files would probably only make a small difference.

@schildbach I always read that gzip slightly increases the CPU load for most websites. I think that deserves more testing before enabling gzip compression on the fly, especially given that the website has to deal with DDoS attacks once in a while and the performance improvement isn't large.

Contributor

harding commented Jun 25, 2015

@saivann

I always read that gzip slightly increases the CPU load for most websites.

I would think that if the files were pre-gzipped, serving them would require the same CPU as serving any other file. (For example, IIRC, PNG files use zlib compression.)

I also think there might already be Jekyll plugins out there that gzip all files after building the site. For example, I just skimmed this blog post. I don't know what all the tradeoffs would be.

Contributor

saivann commented Jun 25, 2015

I would think that if the files were pre-gzipped, serving them would require the same CPU as serving any other file.

Sure! Using gzip_static on as @louisjc suggested doesn't seem to present any performance issue and only saves bandwidth, AFAIK. I was referring to the second suggestion of letting the server do gzip compression by itself on the fly (gzip on). From a quick read of the blog post you mentioned, it looks like generating all the .gz files could be as simple as a single line of code in our build script, which would be simpler than writing a Jekyll plugin.
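As a sketch, that build-script line could look like this (assuming the build output lands in _site, per Jekyll's default; gzip's -k flag, which keeps the originals, needs gzip 1.6 or later):

```shell
# Create a .gz sibling for every .css, .js, and .svg file in the build
# output, keeping the originals so nginx's gzip_static can serve either.
find _site -type f \( -name '*.css' -o -name '*.js' -o -name '*.svg' \) \
  -exec gzip -k -9 {} \;
```

On older gzip versions without -k, the equivalent per-file command would be gzip -9 -c "$f" > "$f.gz".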

Edit: Browser compatibility seems fine, with only IE6 not supporting compression (and it seems possible to deal with that issue with a simple gzip_disable "msie6"; configuration line).

Contributor

harding commented Jun 25, 2015

@saivann oh, sorry for the confusion. Changing the build script would indeed be ideal, as it would only affect the content on the build server and would leave the content in our individual development environments alone, so we could continue to do things like diffing the output from different builds.

Contributor

harding commented Jun 25, 2015

@saivann oh, gzipping via the build script would break the manual-check-diff-sha256sums Makefile command, but I wouldn't worry about that for now. I can probably fix that the next time I need it.

@louisjc louisjc closed this Jun 26, 2015
