Conserve bandwidth? #200

Open
takluyver opened this issue Apr 26, 2019 · 14 comments

@takluyver
Member

I'm starting to get emails from Netlify saying that we're reaching 50% of our 100 GB bandwidth quota. This is nice - we're reaching lots of people! And it's not an urgent problem yet, because the quota is about to reset for another month.

But if the bandwidth use is steadily increasing, then before long the site will either start going down or we'll have to start paying. It's $20/month for an extra 100 GB, which is a lot for any of us to pay out of our own pockets. And if we hit 100 GB/month, it's not hard to imagine hitting 200 GB.

So, let's see if we can make it more efficient. Today, a page load includes 3.17 MB of resources from python3statement.org (i.e. excluding files loaded from separate CDNs). Most of that is the logos. So for starters, I'm going to see if we can squash some of the images (#165).

This was referenced Apr 26, 2019
@takluyver
Member Author

Thanks @hugovk for diving in on compressing images. With those changes, a page load now pulls about 1.1 MB, so we've saved nearly two thirds. That should make the quota last much longer!

I'll leave this open as a place to discuss any further improvements.

@takluyver
Member Author

Oh, and to record: we're currently compressing images with pngcrush, but @hugovk also investigated pngquant. This does lossy compression, but the results looked pretty good to my eyes. The space saving was on the order of 20% better than pngcrush. We can revisit this if we need to optimise it more.
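For reference, a rough sketch of how the two tools are typically invoked (the pngquant quality range here is an example, not the exact setting used):

```sh
# Lossless: pngcrush tries extra filter/compression combinations and overwrites in place
for f in assets/*.png; do
    pngcrush -ow -reduce -brute "$f"
done

# Lossy: pngquant re-quantises the palette; --quality sets an acceptable range (example values)
pngquant --quality=65-90 --ext .png --force assets/*.png
```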

@Carreau
Member

Carreau commented Apr 26, 2019

Thanks for working on this.

I received the notifications as well, and I believe we still have quite a bit of margin.

Signac.png is still one of the biggest ones (250 KB); I think just halving its resolution manually would still be quite a gain.
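A minimal sketch of doing that with ImageMagick (assuming the file lives under assets/; the exact path is a guess):

```sh
# Halve the image dimensions in place, then re-run lossless compression on the result
mogrify -resize 50% assets/Signac.png
pngcrush -ow assets/Signac.png
```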

@CAM-Gerlach
Contributor

CAM-Gerlach commented May 3, 2019

FYI, optipng (originally a pngcrush fork) is the preferred alternative nowadays; it can give substantially better compression while still being fully lossless, at least according to a number of tests from a few years back (e.g. this one, with optipng producing anywhere from 95% down to 35% of the pngcrush file size, depending on the image).

@hugovk
Contributor

hugovk commented May 3, 2019

Thanks for the tip, it can squeeze out a few extra bytes:

$ optipng --help
Synopsis:
    optipng [options] files ...
Files:
    Image files of type: PNG, BMP, GIF, PNM or TIFF
Basic options:
    -?, -h, -help	show this help
    -o <level>		optimization level (0-7)		[default: 2]
    -v			run in verbose mode / show copyright and version info
General options:
    -backup, -keep	keep a backup of the modified files
    -clobber		overwrite existing files
    -fix		enable error recovery
    -force		enforce writing of a new output file
    -preserve		preserve file attributes if possible
    -quiet, -silent	run in quiet mode
    -simulate		run in simulation mode
    -out <file>		write output file to <file>
    -dir <directory>	write output file(s) to <directory>
    -log <file>		log messages to <file>
    --			stop option switch parsing
Optimization options:
    -f <filters>	PNG delta filters (0-5)			[default: 0,5]
    -i <type>		PNG interlace type (0-1)
    -zc <levels>	zlib compression levels (1-9)		[default: 9]
    -zm <levels>	zlib memory levels (1-9)		[default: 8]
    -zs <strategies>	zlib compression strategies (0-3)	[default: 0-3]
    -zw <size>		zlib window size (256,512,1k,2k,4k,8k,16k,32k)
    -full		produce a full report on IDAT (might reduce speed)
    -nb			no bit depth reduction
    -nc			no color type reduction
    -np			no palette reduction
    -nx			no reductions
    -nz			no IDAT recoding
Editing options:
    -snip		cut one image out of multi-image or animation files
    -strip <objects>	strip metadata objects (e.g. "all")
Optimization levels:
    -o0		<=>	-o1 -nx -nz				(0 or 1 trials)
    -o1		<=>	-zc9 -zm8 -zs0 -f0			(1 trial)
    		(or...)	-zc9 -zm8 -zs1 -f5			(1 trial)
    -o2		<=>	-zc9 -zm8 -zs0-3 -f0,5			(8 trials)
    -o3		<=>	-zc9 -zm8-9 -zs0-3 -f0,5		(16 trials)
    -o4		<=>	-zc9 -zm8 -zs0-3 -f0-5			(24 trials)
    -o5		<=>	-zc9 -zm8-9 -zs0-3 -f0-5		(48 trials)
    -o6		<=>	-zc1-9 -zm8 -zs0-3 -f0-5		(120 trials)
    -o7		<=>	-zc1-9 -zm8-9 -zs0-3 -f0-5		(240 trials)
    -o7 -zm1-9	<=>	-zc1-9 -zm1-9 -zs0-3 -f0-5		(1080 trials)
Notes:
    The combination for -o1 is chosen heuristically.
    Exhaustive combinations such as "-o7 -zm1-9" are not generally recommended.
Examples:
    optipng file.png						(default speed)
    optipng -o5 file.png					(slow)
    optipng -o7 file.png					(very slow)
| command | assets size | time to run |
| --- | --- | --- |
| None (current master) | 859,489 bytes (979 KB on disk) for 64 items | n/a |
| `optipng -o7 assets/*.png` | 844,492 bytes (967 KB on disk) for 64 items | 1m30s |
| `optipng -o7 -zm1-9 assets/*.png` | 843,801 bytes (967 KB on disk) for 64 items | 7m2s |

@hugovk
Contributor

hugovk commented May 4, 2019

Please see PR #211.

@Carreau
Member

Carreau commented Jun 19, 2019

See also #238; we got the notification again, so I think it's now reasonable to do lossy compression.

@takluyver
Member Author

I don't know how the notification timing lines up with billing periods - how close are we getting to the bandwidth limit?

The next step, if we need to save more, would be to only include logos for projects above a certain notability threshold. Setting the threshold to 1000 GitHub stars would currently cut off about half the list (dateutil has 1001 stars; fecon235 has 503).
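For anyone wanting to check where a project sits relative to such a threshold, a quick sketch using the public GitHub API (the repo slug is just an example):

```sh
# Print the current star count for a repository
curl -s https://api.github.com/repos/dateutil/dateutil |
    python3 -c 'import json, sys; print(json.load(sys.stdin)["stargazers_count"])'
```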

@hugovk
Contributor

hugovk commented Jun 20, 2019

Netlify is definitely useful for PRs, but what's the benefit of having the main site hosted on Netlify versus GitHub Pages?

Does Netlify run some build steps to serve the site? Can it be easily refactored to be static?

@hugovk hugovk mentioned this issue Jun 20, 2019
@Carreau
Member

Carreau commented Jun 20, 2019

It was a question of custom domain and HTTPS, IIRC; it's just a Jekyll build, so GitHub Pages should be fine.

To answer Thomas's question: we still have some margin; we're using 80% of our allowance:

[Screenshot: Netlify bandwidth usage, 20 Jun 2019, 10:56 AM]

@takluyver
Member Author

GitHub Pages can do HTTPS with custom domains now, IIRC.

Presumably there are other sites hosted on GitHub Pages which consume this much bandwidth? It's pretty much a de facto standard for free static hosting now, and I haven't heard about people hitting bandwidth limits.

@CAM-Gerlach
Contributor

> GitHub Pages can do HTTPS with custom domains now, IIRC.

Yup, they've had that for at least a year or two now; it's all automatic with Let's Encrypt. You just check a box and, a few hours later, HTTPS is enforced.

> it's just a Jekyll build, so GitHub Pages should be fine.

Yeah, GitHub Pages is Jekyll by default, but all my sites are built with the Lektor static site generator instead, because it's Python-based, much nicer, and even provides an admin panel UI so that others who may not understand the internals can help manage your site. I just run them with a simple build script through Travis CI; it only takes a few lines in my .travis.yml and everything works great.
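Roughly, the build step amounts to something like this (a sketch assuming a plain Lektor project; the output directory name is arbitrary):

```sh
# Install Lektor and render the static site into ./public
pip install lektor
lektor build --output-path public
```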

@bignose-debian

Is it feasible to allow SVG logos? Typically, an SVG document representing a logo will be a lot smaller than the equivalent PNG file.
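If we did go that way, SVG sources can usually be shrunk further with an optimizer such as svgo before committing them (a sketch; the filename is hypothetical):

```sh
# Minify an SVG in place: strips metadata and editor cruft, collapses paths
svgo --multipass logo.svg
```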

@takluyver
Copy link
Member Author

There's one SVG logo already there (Dask). I'd be wary of going that way overall, because it's not just a matter of "do various browsers support SVG", it's "do various browsers support the SVG features used in this file", and I don't know a quick way of assessing that.

The saving with SVG is also smaller for small logos than it would be for larger images, although when we're trying to shave bytes off the PNGs, I guess that argument doesn't go very far.
