Remove Google CDN reference for jQuery #1327
Conversation
Saw this was brought up once before but with different reasoning; it seemed better to open a new issue rather than append to the older one. There may be other reasons to choose the CDN, but if it is strictly because it is considered better for performance, this article seems to indicate that won't be the case in the vast majority of cases.
Unless h5bp required a build step to use it, I don't think we can do this. Also, looking forward: Bower + AMD + r.js will likely have a future role in solving this issue.
I've brought this up before in other issues. I'm in favour of this change because even without a build step, using the CDN doesn't seem to provide significant benefit. But this change would need more work:
That's cool. That's great analysis, and what I expected to see in terms of the spread of the URLs. I think we ended up sticking with the CDN version not because of the cache lottery, but because serving it from the Google CDN (faster than the average web server even if it's not cached) is a slightly better default than serving two files from one (slower) server, especially if there's a large geographical spread. Meaning: if we guaranteed that people were using a build tool, the best default would be removing the CDN and serving an optimized single file (or some script loader solution, whatever), but since we can't guarantee that, it's likely a slightly better default configuration, assuming nothing else happens after the files leave the repo. For my money, the first thing I do is delete the link to the CDN, but I'm not the target for this particular line of code.
Speaking of which, I should rerun the tests I ran a few years ago. wpt probably has a million more nodes now. More data 👍
I like the Google CDN. It offers speed not dependent on the user's own hosting and server (so if they host a Boilerplate site on a crappy server, loading jQuery won't take long). And Google's servers should be among the best in terms of uptime and latency.
One interesting thing that analysis leaves out is weighting by the size of the sites using the CDN. For example, if a site like Facebook were using a file from the CDN, the odds would be astronomically higher that their specific version would be cached, even if they were hosting a unique version. So, while the simple distribution of URLs is interesting (and, I think, telling), it's only a part of the picture. I mean, if you have a Pinterest-friendly audience, your odds of http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js being in the cache are better than they would be otherwise.
In the original write-up by @zoompf, he mentions that even against your own server, in an uncached state, the overhead of an HTTP request doesn't beat just building jQuery into the same file. And further, the DNS lookup to Google adds to the issue. The fastest downloads of jQuery in every test I've seen have always had all critical page-load resources in a single file, loaded asynchronously in the head. This means that you don't have multiple HTTP requests, and you don't have an additional DNS resolution (which is slow based partly on your provider, not just on how fast Google is).
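As a rough sketch of that single-file approach (the bundle path is a hypothetical example, not something H5BP ships):

```html
<head>
  <!-- All critical page-load JS (jQuery, plugins, app code) concatenated
       into one file: one HTTP request, no extra DNS lookup to a third-party
       CDN. "js/bundle.min.js" is an illustrative placeholder name. -->
  <script src="js/bundle.min.js" async></script>
</head>
```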
I quickly ran some tests with wpt (10 runs in chrome). The default results hold. If you do nothing to the files and just serve them as is (multiple script files), using the Google CDN is faster by default, especially if you have a global audience. The geographical advantage that Google brings continues to matter.
Of course, we don't want people serving multiple files. But if they do, based on this slice, serving it off of the CDN is faster by default.
I've tried many of the available options, including loading a bunch of scripts, concatenating them, and async-loading them with RequireJS, head.js, and lazyload.js. Much to my surprise, the best ratio of speed to not-breaking-the-hell-outta-the-layout was with the Google JS API, which makes me stick to the CDN's version 1.7.1.
I'm not sure common research agrees with your results. Do you have some …
I'm not terribly interested in measuring the total speed of a site with no …
I think the project should default to making it the easiest to build files …
1.7.1 is the version of jQuery on the Ajax CDN used by Pinterest? Beetlejuice? |
http://statichtml.com/2011/google-ajax-libraries-caching.html

According to that, there is very little benefit to using the Google CDN to serve the jQuery source. A better option would be to encourage devs to minify and concatenate all of their script files into one JS file served with a long-lived cache time. This was referenced by Alex Sexton (@SlexAxton) in his talk at jQueryTO, March 2nd, 2013.

Also:

* Included the unminified version of jQuery (in the /js/vendors folder)
* Included the jQuery source map file (in the /js/vendors folder)
* Included the changes in the CHANGELOG
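To make the "minify and concatenate" suggestion concrete, here is a minimal sketch of the concatenation step. The function and the commented file names are illustrative only; a real setup would use a proper build tool plus a minifier:

```javascript
// Sketch: join script sources with a semicolon + newline separator so a
// file that omits its trailing semicolon can't break the statement that
// follows it in the combined bundle. Illustrative only; not part of H5BP.
function concatScripts(sources) {
  return sources.join(';\n');
}

// A real build would read the files and minify the result, e.g.:
// var fs = require('fs');
// var files = ['js/vendor/jquery.js', 'js/plugins.js', 'js/main.js'];
// fs.writeFileSync('js/bundle.js',
//   concatScripts(files.map(function (f) { return fs.readFileSync(f, 'utf8'); })));
```

The bundle is then served with a far-future cache header, so returning visitors only pay the download cost once.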
I updated the file to more closely match what @necolas listed as needed. I'm not pushing hard for this to be included (the lack of a build script seems to be a pretty valid reason not to), but I do think it needs to be considered.
@localpcguy I guess my explanation was too brief. What I meant is:
```html
<script type="text/javascript" src="//www.google.com/jsapi"></script>
<script type="text/javascript">
//<![CDATA[
google.load("jquery", "1");
//]]>
</script>
```

So yes, I agree with you that plain loading from the Google CDN isn't the best approach, but I'm throwing in my two cents: on a Linode VPS, the overhead of serving jQuery myself outweighs any performance benefit, so instead I'd recommend an async loader.
@amenadiel No, I understood what you were saying. I'm just not sure that your example isn't unique to your specific setup as other testing has shown otherwise. And aren't you making 2 requests then to load jQuery (for the Google code and then again for the jQuery code itself)? Fine if you plan to utilize the Google Loader for other things, but probably not best standard practice. Also, as an aside, you should always request a specific version of jQuery rather than the 1.x "latest", because of 1) the potential for breakage following any breaking changes when your site gets auto-updated to the latest jQuery. And 2) you also won't get a long lived cached version, which is one of the "benefits" being discussed here as regards loading scripts from the jQuery CDN using a standard script tag and a specific version. (See http://paulirish.com/2009/caching-and-googles-ajax-libraries-api/ - athough that is a few years old now, my understanding is the caching information is still valid.) |
To bounce off the research from @roblarsen: Microsoft.com uses http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.2.min.js, as does Skype.com (MS allows public use of aspnetcdn). In the end, I have to agree with @paulirish: build tools are the best bet, though the Google/MS CDNs are the safest bet.
FWIW, I'm working with one of my old colleagues from Sapient and looking at this beyond pure distribution in the HTTP Archive dataset. Hopefully something interesting will shake out.
Let's consider a more fundamental analysis. I can think of 3 reasons for using a shared JavaScript CDN:

1. Reducing your bandwidth
2. Saving money on hosting
3. The library may already be in the visitor's cache

So, do JavaScript CDNs actually do any of this? For #1, yes, bandwidth will be reduced. The actual economic savings from #2 is debatable, and creating an HTTP connection to a 3rd party is expensive. #3 is doubtful: the fragmentation of URLs used to access these shared libraries makes a pre-cached hit unlikely. Finally, remember we are jumping through all these crazy hoops just to save a single request for one library.

Billy Hoffman
I think we should use the CDN and add a note in the documentation that it can be removed. |
I think that zoompf makes some great points. Please let me add some additional thoughts.

#1: Reducing bandwidth is also important for free/"unlimited" hosts, since they are not so reliable, esp. if a blog post link blows up on reddit, etc.

#2: Yes, handshaking must be factored in. Then again, most browsers can do like 8 simultaneous connections?

#3: I would say #3 is very valid. (IIRC, I looked into using the same CDN jQuery version that Twitter used about a year ago, but now it seems they compress all their JavaScript into a handful of scripts & host those themselves; hitting a moving target is not ideal.)

Either way, I'd say test, test, test. But really, who is the target market for html5-boilerplate anyway? Web folks who just want a simple framework to slap together a website really fast for a friend, or who want to build a page to showcase a new library they just wrote? Or a design team contracted by a major corporation with deep pockets to rework their existing site? I would think many if not most can change 1 line of code; so I think alfredxing's suggestion is the best.
I disagree. I don't think we have enough data to show that, for a web …

Billy Hoffman
I can't speak for everyone, @tomByrer, but my view of the target market for H5BP is devs who are looking for the current best practices. I don't view it as a "quick start" guide, necessarily, but rather as a great reference that the community keeps up to date with the latest best practices. That's just the way I see it; I don't speak for the maintainers, obviously.
I feel like our original discussion on the matter should be linked to for reference: #252. TL;DR:
I agree with @zoompf, @tomByrer, and @localpcguy on their latest comments.
@davidmurdoch Steve favors CDNs, as can be read here: http://www.stevesouders.com/blog/2011/08/17/http-archive-nine-months/ But he never set up a real-world test for this, as far as I know. I argued against it in the comments, basically using @zoompf's methodology of calculating the potential "hit rate" of a CDN regarding a pre-cached jQuery library. Which is low.

I am for including simple on-the-fly concatenation tools for a bunch of server-side languages, bundled up with H5BP. This could be limited to the mainstream languages (like you did include server-side config for Apache and NGINX only). That would bring a lot of improvement to the average website and, due to the enormous number of those, push the performance of the web a huge step forward. And then leave the build tool in there as the better option for experienced developers.
I don't think H5BP should push its users too hard toward a build tool. Most web developers don't know how to deal with tools like grunt.js, and they don't want to. Forcing them would lead them to other frameworks they are actually able to use without a build system.

For the CDN / non-CDN thing: I am not sure. For local sites it makes total sense to serve a local jQuery. For world-wide sites it still might make sense to serve your own jQuery, especially when we're talking about the modular new jQuery. But again, this requires web developers to use build tools, so we can say: H5BP users normally use it as is, which means full jQuery.

What was left out in prior tests here: you cannot test only Chrome for HTTP requests, as this browser does a pretty good job there. Make sure you test in IE and other older browsers that can't deal with 12 or more HTTP requests in parallel.

I am in favor of leaving this to tools like Initializr, which could offer a selection between CDNed and non-CDNed jQuery. (But that's just my little note here.)
@anselmh All browsers nowadays do 6 parallel requests per host over HTTP (HTTPS can be different). The differences between newer and older browsers are more subtle, like DNS prefetching or better connection-pool management.

Back to topic: on globally targeted sites you should rather use a CDN for all of your static stuff. And only one. Basically, even here you should concatenate first, and then you can have e.g. an origin-pull CDN distribute the result for you. The jQuery CDN only makes sense as being one notch better than doing nothing. That's all the benefit I see. I totally agree with your argument that even the best build tool is still one step too many for the average developer.
Discussion of Bower here misses the point of the discussion, I think. The question is "what's the best default way to include jQuery for consumers of HTML5 Boilerplate and their users?" Saying "you should manage your dependencies with Bower" isn't really an answer to that question, since it sidesteps the most important part of the equation: what's the fastest default way to serve jQuery to end users? For now, unless something's changed in the past few months, that remains the Google CDN.
I agree with that 100%. |
If you're interested in this subject at a "why" level, there's a lot of detailed discussion of the issue in this very thread. Just start at the top and work your way down. The short answer is …
Future jQuery versions will have their own AMD loader. Though IMHO, since so many people use various jQuery plugins, I'm not sure that will save much bandwidth.

MAYBE 1.69%
The question is a thin one: why are we serving from Google and not some other CDN? So, even though the odds aren't great (and pure penetration of any single library isn't the only factor), they're going to be better with Google than with someone else who is less popular. If you read through this thread, you'll see that I think the cache lottery is a canard. It's just that, if the cache lottery matters to you at all, the Google CDN is a much better option than anything else, as it has much higher penetration. Of course, this is about the best default, not the best possible answer. The best possible answer can only be found by individual developers testing the options on their own site or application.
I think @roblarsen nailed it with this statement -
+1 |
Yes, which is why I said "Test test test" before. And why we need to be aware of pros & cons. |
Pinterest would be a great candidate for that technique
Unfortunately, they don't seem to use jQuery.js at all; perhaps it's concatenated in. I was thinking more stackoverflow/StackExchange, who uses …
Oh, they used to. They did when we had this discussion originally.
...Most H5BP projects use jQuery via Google CDN. (#oroboros #selfFulfillingProphecy) |
H5BP tries to be relatively agnostic of any high level development philosophy or framework, so that it can be used:
Thank you all for your comments! I'm sure there will be a similar discussion in the near future (especially as things such as SPDY / HTTP 2.0 get more traction), but for the time being, we'll stick with the CDN.
Not really helpful article:
Like others & I said before: test if you want to tweak. The reason for …
Just to re-iterate: if all h5bp projects use the Google CDN for jQuery, and if …

Rob Wierzbowski
@robwierzbowski You must have missed this post; the expectation of hitting the cache with the same version is dismal, especially now, 10+ releases later …?
Perhaps the rise in [AngularJS](http://angularjs.org/)'s popularity is the real cause of the rise in popularity of ajax.googleapis.com? Or maybe the rise in "[Sites using Google Libraries API](http://httparchive.org/trends.php#perGlibs)" is caused by "[Sites with Custom Fonts](http://httparchive.org/trends.php#perFonts)"? Perhaps a revisit to see if [hitting the version lottery](h5bp#1327 (comment)) still holds is in order, but I doubt it will prove it is worth mentioning that it "increases the odds of having a copy of the library in your user's browser cache". Even though I'm involved with [jsDelivr CDN](http://www.jsdelivr.com/), and I see a lot of [cdnjs](http://cdnjs.com/) usage in the wild, I conclude Google's CDN is the best to use for h5bp. But please, an extra 1% chance of hitting the cache isn't really worth it IMHO.
I have seen that, and I really appreciate the research. I wonder if there is a way to track a segment of users and see how many times over the course of a day they use the same version of jQuery across sites. It's possible user behavior could select for sites that have a smaller subset of jQuery versions (I'm a designer and I visit a lot of elitist web design blogs?), and I personally work on a small ecosystem of apps that can benefit from pulling the same version of jQuery across sites.

And then there's cache lifetime: the longer a user's cache lives, the more likely they are to have a particular jQuery in their cache. How long does an average internet user need to surf until they have 80% of jQuery versions somewhere in their cache? The likelihood of a particular version being available is a curve that increases over time.

Thanks for pointing out the numbers again, and I completely agree that it's a lottery. But I'm not convinced that those numbers alone settle the "should we use a CDN" question.
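That "curve that increases over time" can be sketched with a toy model: if each site a user visits independently references a given jQuery version at the shared CDN URL with probability p, then after visiting n sites the chance that version is already cached is 1 - (1 - p)^n. The penetration figure below echoes the ~1.7% mentioned earlier in the thread; it is an illustration, not a measurement, and it ignores cache eviction:

```javascript
// Toy cache-lottery model (assumption-laden, for illustration only):
// each visited site independently uses the exact CDN URL for one specific
// jQuery version with probability p, and cache entries never expire.
function cacheHitChance(p, n) {
  // Probability that at least one of the n visited sites seeded the cache.
  return 1 - Math.pow(1 - p, n);
}

// With ~1.7% penetration, 100 sites gets you to roughly 82%:
// cacheHitChance(0.017, 100) ≈ 0.82
// but after only 10 sites it's about 16%:
// cacheHitChance(0.017, 10) ≈ 0.16
```

Real caches evict entries and version shares shift over time, so this overstates the odds; it just shows how cache lifetime and penetration compound.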
It seems that with HTTP/1.1, CDNs allow more "Max Connections" than the local host alone (without sharding locally, which is beyond the scope of h5bp, and which IMHO won't see the benefit of a separate server).
Perhaps 2 years ago, but devs don't upgrade versions much, so there is a huge range of versions out there. Plus, IMHO, script collation & there being 4-5 CDNs in use will reduce cache odds a bit more.
I used to worry about that, until Pinterest dropped using Google's CDN (they now collate & CDN themselves). If you are the web dev for a network of sites that link together, perhaps that can help. But if you're that big, you should really look into a caching CDN (e.g. Envato/Tuts+ uses CloudFlare). Also, it seems there is a bit of a backlash against extra DNS lookups, but: