
Remove Google CDN reference for jQuery #1327

Closed
wants to merge 1 commit into from

Conversation

localpcguy
Copy link
Contributor

http://statichtml.com/2011/google-ajax-libraries-caching.html

According to that, there is very little benefit to using the
Google CDN to serve jQuery source. A better option would be to
encourage devs to minify and concatenate all of their script files
into one JS file served with a long-lived cache time.

This was referenced by Alex Sexton (@SlexAxton) in his talk at
jQueryTO March 2nd, 2013.
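
For illustration, a minimal sketch of the kind of build step meant here, assuming a hypothetical Grunt setup with grunt-contrib-concat and grunt-contrib-uglify (file paths are illustrative):

// Gruntfile.js - concatenate all scripts, then minify the bundle so it can
// be served as a single file with a long-lived cache time.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['js/vendor/jquery.js', 'js/plugins.js', 'js/main.js'],
        dest: 'dist/js/bundle.js'
      }
    },
    uglify: {
      dist: {
        files: { 'dist/js/bundle.min.js': ['dist/js/bundle.js'] }
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};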

@localpcguy
Copy link
Contributor Author

Saw this was brought up once before but with different reasoning; it seemed better to open a new issue rather than append to the older one. There may be other reasons to choose the CDN, but if it is strictly because it is considered better performance, this article seems to indicate that won't be the case in the vast majority of cases.

@paulirish
Copy link
Member

Unless h5bp required a build step to use it, I don't think we can do this.
The best thing we can do is strongly promote build tools.

Also, looking forward: Bower + AMD + r.js will likely have a future role in solving this issue.

@necolas
Copy link
Member

necolas commented Mar 7, 2013

I've brought this up before in other issues. I'm in favour of this change because even without a build step, using the CDN doesn't seem to provide significant benefit. But this change would need more work:

  • Remove the extra comment about "why we're not using the CDN".
  • Move mention of the reason to the JS docs file.
  • Include the unminified version of jQuery.
  • Include the jQuery source map file.
  • Include a line about the change in the CHANGELOG.

@roblarsen
Copy link
Member

That's cool. That's great analysis and what I expected to see in terms of the spread of the URLs.

I think we ended up sticking with the CDN version, not because of the cache lottery, but because serving it from the Google CDN (faster than the average web server even if it's not cached) is a slightly better default than serving two files from one (slower) server, especially if there's a large geographical spread. Meaning: if we could guarantee that people were using a build tool, the best default would be removing the CDN and serving an optimized single file (or some script loader solution, whatever), but since we can't guarantee that, the CDN is likely a slightly better default configuration, assuming nothing else happens after the files leave the repo.

For my money, the first thing I do is delete the link to the CDN, but I'm not the target for this particular line of code.

@roblarsen
Copy link
Member

Speaking of which, I should rerun the tests I ran a few years ago. wpt probably has a million more nodes now, so more data 👍

@alfredxing
Copy link

I like the Google CDN. It offers speed not dependent on the user's own hosting and server (so if they host a Boilerplate site on a crappy server, loading jQuery won't take long). And Google's servers should be among the best in terms of uptime and latency.

@roblarsen
Copy link
Member

One interesting thing that analysis leaves out is weighting by the size of the sites using the CDN. For example, if a site like Facebook were using a file from the CDN, the odds would be astronomically higher that their specific version would be cached, even if they were hosting a unique version. So, while the simple distribution of URLs is interesting (and, I think, telling), it's only a part of the picture. I mean, if you have a Pinterest-friendly audience, your odds of http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js being in the cache are better than they would be otherwise.

@SlexAxton
Copy link
Contributor

In the original write-up by @zoompf he mentions that even against your own server, in an uncached state, the overhead of an extra HTTP request doesn't beat just building it into the same file. And further, the DNS lookup to Google adds to the issue. The fastest downloads of jQuery in every test I've seen have always been critical pageload resources all in a single file, loaded asynchronously in the head. This means that you don't have multiple HTTP requests, and you don't have an additional DNS resolution (which is slow based partly on your provider, not just on how fast Google is).
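
As a rough sketch of that pattern (the bundle name is hypothetical): everything critical goes into one file, loaded asynchronously from the head, so there's one request, one DNS lookup, and no render blocking:

<head>
  ...
  <script src="/js/bundle.min.js" async></script>
</head>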

@roblarsen
Copy link
Member

I quickly ran some tests with wpt (10 runs in Chrome). The default results hold. If you do nothing to the files and just serve them as-is (multiple script files), using the Google CDN is faster by default, especially if you have a global audience. The geographical advantage that Google brings continues to matter.

location             one server (s)   google cdn (s)
Dulles, USA               1.651            1.275
Jiangsu, China            3.484            2.357
São Paulo                 2.663            2.019
Delhi                     4.298            2.977
LA                        2.108            1.740
---------------------------------------------------
AVG                       2.8408           2.0736

Of course, we don't want people serving multiple files. But if they do, based on this slice, serving it off of the CDN is faster, by default.

@ffflabs
Copy link

ffflabs commented Mar 7, 2013

I've tried many of the options available, including a bunch of separate scripts, concatenating them, and async loading them with RequireJS, head.js, and lazyload.js.

Much to my surprise, the best ratio of speed to not-breaking-the-hell-outta-the-layout was with the Google JS API, which makes me stick to the CDN's version 1.7.1.

@SlexAxton
Copy link
Contributor

I'm not sure common research agrees with your results. Do you have some benchmarks we could see? What does 1.7.1 have to do with this?

I'm not terribly interested in measuring the total speed of a site with no builds. It's a pretty easy win to put your files together, and if you aren't doing that, you likely don't really care about optimizing your speed.

I think the project should default to making it the easiest to build files together in production.

@roblarsen
Copy link
Member

I'm not sure common research agrees with your results. Do you have some benchmarks we could see? What does 1.7.1 have to do with this?

1.7.1 is the version of jQuery on the Ajax CDN used by Pinterest? Beetlejuice?

http://statichtml.com/2011/google-ajax-libraries-caching.html

According to that, there is very little benefit to using the
Google CDN to serve jQuery source.  A better option would be to
encourage devs to minify and concatenate all of their script files
into one JS file served with a long-lived cache time.

This was referenced by Alex Sexton (@SlexAxton) in his talk at
jQueryTO March 2nd, 2013.

Also:
* Included the unminified version of jQuery (in the /js/vendors folder)
* Included the jQuery source map file (in the /js/vendors folder)
* Included the changes in the CHANGELOG
@localpcguy localpcguy closed this Mar 7, 2013
@localpcguy localpcguy reopened this Mar 7, 2013
@localpcguy
Copy link
Contributor Author

I updated the file to more closely match what @necolas listed as needed. I'm not pushing hard for this to be included (lack of a build script seems to be a pretty valid reason not to), but I do think it needs to be considered.

@ffflabs
Copy link

ffflabs commented Mar 7, 2013

@localpcguy I guess my explanation was too brief. What I meant is:

  1. Serving all my scripts minified together with jQuery brought no noticeable speed increase over requesting Google's CDN jQuery and my scripts separately.
  2. Asynchronous loading, in turn, brought a performance increase, but
  3. Most popular async loaders, including RequireJS with bundled jQuery, HeadJS, and LazyLoadJS, broke my layout. I don't know why, and after a few hours of debugging I decided to try the Google JS API, and
  4. The Google JS API brought me the speed of async loading and didn't break my layout, but you depend on what version of jQuery the API will offer. In my case, when I request version 1 (as in "latest release from the 1.x series"), Google sends 1.7.1. You can manually choose another one, but you'll have to cope with shorter cache headers.
<script type="text/javascript" src="//www.google.com/jsapi"></script>
<script type="text/javascript">
//<![CDATA[
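    // ask the Google Loader for the latest 1.x release (1.7.1 at the time of writing)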
    google.load("jquery", "1");
//]]>
</script>

So yes, I agree with you that plain loading from the Google CDN isn't the best approach, but I'm throwing in my two cents: on a Linode VPS the overhead of serving jQuery myself outweighs any performance benefit, and instead I'd recommend an async loader.

@localpcguy
Copy link
Contributor Author

@amenadiel No, I understood what you were saying. I just suspect your results may be unique to your specific setup, as other testing has shown otherwise. And aren't you making two requests to load jQuery (one for the Google Loader code and then another for jQuery itself)? Fine if you plan to utilize the Google Loader for other things, but probably not best standard practice.

Also, as an aside, you should always request a specific version of jQuery rather than the 1.x "latest", because 1) your site could break when it gets auto-updated to a latest jQuery with breaking changes, and 2) you won't get a long-lived cached version, which is one of the "benefits" being discussed here as regards loading scripts from the Google CDN using a standard script tag and a specific version. (See http://paulirish.com/2009/caching-and-googles-ajax-libraries-api/ - although that is a few years old now, my understanding is the caching information is still valid.)
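
For example (version numbers illustrative), only the exact-version URL gets the long-lived cache headers:

<!-- pinned version: far-future cache headers -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>

<!-- "latest 1.x" style URL: auto-updates, but is served with short cache headers -->
<!-- <script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script> -->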

@tomByrer
Copy link

To bounce off the research from @roblarsen:
StackExchange.com also uses http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js (& they rarely upgrade).
GuardianNews.com uses http://ajax.googleapis.com/ajax/libs/jquery/1.8.1/jquery.min.js (I think they did a refresh recently, so I don't see an upgrade coming soon).

Microsoft.com uses http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.2.min.js, as does Skype.com (MS allows public use of aspnetcdn).
Hat tip: http://w3techs.com/technologies/details/js-jquery/all/all

In the end, I have to agree with @paulirish: build tools are the best bet, though Google/MS CDNs are the safest bet.

@roblarsen
Copy link
Member

FWIW, I'm working with one of my old colleagues from Sapient and looking at this beyond pure distribution in the HTTP Archive dataset. Hopefully something interesting will shake out.

@ghost
Copy link

ghost commented Mar 12, 2013

Let's consider a more fundamental analysis. I can think of 3 reasons for a website wanting to download jQuery or other common libraries from one of these CDNs:

  1. You want to reduce bandwidth costs.
  2. You want to improve page load times by leveraging an essentially free CDN that (hopefully) moves content geographically closer to the end user.
  3. You want to try to take advantage of network effects and have jQuery already be in the user's cache.

So, do JavaScript CDNs actually do any of this?

For #1, yes, bandwidth will be reduced. The actual economic savings may only be applicable to fairly large websites, however.

#2 is debatable. Creating an HTTP connection to a 3rd party is expensive: DNS lookup, 3-way handshake, slow start. This has even more impact on mobile. Furthermore, as SPDY is adopted, I think the value of this goes down, since the existing connection between the client and server is used even more efficiently.

#3 is doubtful. The fragmentation of URLs used to access these shared libraries is just horrible (http://statichtml.com/2011/google-ajax-libraries-caching.html). Combined with the small cache sizes of today's browsers, I just don't think it is reasonable to build a shared library system on top of HTTP caching.

Finally, remember we are jumping through all these crazy hoops just to save ~20KB. Put it into your build system and be done with it.

Billy Hoffman
Founder and CTO,
Zoompf - Making Websites Faster

@alfredxing
Copy link

I think we should use the CDN and add a note in the documentation saying that it can be removed.
This should be up to the user to decide, especially with regards to #2: if their website is geographically targeted to a local area (for example, a city) and their server is located within that area, it would be a good idea to host jQuery locally. However, if their website is accessed globally, the CDN reference would be a good idea.
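
For reference, the default under discussion is a CDN reference with a local fallback, roughly the snippet H5BP shipped at the time (version number illustrative); removing the CDN would mean keeping only the local script tag:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.9.1.min.js"><\/script>')</script>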

@tomByrer
Copy link

I think that zoompf makes some great points. Please let me add some additional thoughts:

#1: Reducing bandwidth is also important for free/"unlimited" hosts, since they are not so reliable, esp. if a blog post link blows up on reddit, etc.

#2: Yes, handshaking must be factored in. Then again, most browsers can do something like 8 simultaneous connections?

#3: I would say #3 is very valid. (IIRC, I looked into using the same CDN jQuery version that Twitter used about a year ago, but now it seems they compress all their JavaScript into a handful of scripts & host those themselves; hitting a moving target is not ideal.)

Either way, I'd say test, test, test.

But really, who is the target market for html5-boilerplate anyway? Web folks who just want a simple framework to slap together a website really fast for a friend, or who want to build a page to showcase a new library they just wrote? Or a design team contracted by a major corporation with deep pockets to rework their existing site?

I would think many if not most can change 1 line of code, so I think alfredxing's suggestion is the best.

@ghost
Copy link

ghost commented Mar 13, 2013

I disagree. I don't think we have enough data to show that, for a web server in NY and a browser in San Jose, the page load time is slower when downloading jQuery from NY if it was combined with the other JavaScript libraries downloaded from NY. In other words, I haven't seen real-world numbers showing that (jQuery + other JS from NY) is slower than (jQuery from a local CDN node and everything else from NY). All the "testing" discussed so far seems very academic, under non-representative conditions.

Billy Hoffman
Founder and CTO,
Zoompf - Making Websites Faster


@localpcguy
Copy link
Contributor Author

I can't speak for everyone @tomByrer, but my view of the target market for H5BP is devs who are looking for current best practices. I don't view it as a "quick start" guide, necessarily, but rather as a great reference that the community keeps up to date. That's just the way I see it; I don't speak for the maintainers, obviously.

@davidmurdoch
Copy link

I feel like our original discussion on the matter should be linked to for reference: #252.

TL;DR:

I think the resolution was that you should do whatever works best for you, as results may vary from project to project, depending on server location(s) and speed.

Run some A/B benchmarks on your sites then make an informed decision as to the optimal solution for each.

<speculation> I'd wager that most websites that include jQuery do not use a CDN for their static resources; I'm not talking about top sites, i.e., Alexa top 200, which often skew decisions like these, but your average mom-and-pop site.</speculation>
I wonder if Steve Souders has data on this? Any educated thoughts, @stevesouders (hopefully I'm pinging the right Steve Souders)?

@alfredxing
Copy link

I agree with @zoompf, @tomByrer, and @localpcguy on their latest comments.
However, I don't think Boilerplate has, or should have, a 'target market'. This is open source code, and as such it should work best for everybody in the aggregate.

@Schepp
Copy link

Schepp commented Mar 13, 2013

@davidmurdoch Steve favors CDNs, as can be read here: http://www.stevesouders.com/blog/2011/08/17/http-archive-nine-months/

But as far as I know, he never set up a real-world test for this.

I argued against it in the comments, basically using @zoompf's methodology of calculating the potential "hit rate" of a CDN for a pre-cached jQuery library. Which is low.

I am for including simple on-the-fly concatenation tools for a bunch of server-side languages, bundled up with H5BP. This could be limited to the mainstream languages (like you include server-side config for Apache and NGINX only). That would bring a lot of improvement to the average website and, given the enormous number of those, push the performance of the web a huge step forward. A sketch of what I mean follows below.
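
A minimal sketch of such an on-the-fly concatenation tool (Node.js purely for illustration; file paths are hypothetical):

// serve all scripts as a single response with a long-lived cache header
var http = require('http');
var fs = require('fs');

var files = ['js/vendor/jquery.js', 'js/plugins.js', 'js/main.js'];

http.createServer(function (req, res) {
  if (req.url === '/js/bundle.js') {
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      'Cache-Control': 'public, max-age=31536000' // one year
    });
    res.end(files.map(function (f) {
      return fs.readFileSync(f, 'utf8');
    }).join(';\n'));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);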

And then leave the build tool in there as the better option for the experienced developers.

@anselmh
Copy link
Contributor

anselmh commented Mar 13, 2013

I do think H5BP should not push users too hard toward a build tool. Most web developers don't know how to deal with tools like grunt.js, and they don't want to; forcing them would lead them to other frameworks they can actually use without a build system.
It still makes sense to write about build tools in the documentation and to encourage their use by laying out their advantages.

On the CDN/non-CDN question: I am not sure. For local sites it makes total sense to serve a local jQuery. For worldwide sites it still might make sense to serve your own jQuery, especially when we're talking about the new modular jQuery. But again, this requires web developers to use build tools, so we can say: H5BP users normally use it as-is, which means the full jQuery.

One thing left out in the prior tests here: you cannot test only Chrome for HTTP requests, as this browser does a pretty good job of it. Make sure you test on IE and other older browsers that can't deal with 12 or more HTTP requests in parallel.

I am in favor of leaving this to tools like Initializr, which could offer a choice between CDN'd and non-CDN'd jQuery. (But those are just my little notes here.)

@Schepp
Copy link

Schepp commented Mar 13, 2013

@anselmh All browsers nowadays do 6 parallel requests per host over HTTP (HTTPS can be different). The differences between newer and older browsers are more subtle, like DNS prefetching or better connection-pool management.

Back to topic: on globally targeted sites you should rather use a CDN for all of your static assets. And only one. Basically, that means even here you should concatenate first, and then have e.g. an origin-pull CDN distribute the result for you.

The jQuery CDN only makes sense as being one notch better than doing nothing. That's all the benefit I see.

I totally agree with your argument that even the best build tool is still one step too many for the average developer.

@roblarsen
Copy link
Member

Discussion of Bower here misses the point of the discussion, I think. The question is "what's the best default way to include jQuery for consumers of HTML5 Boilerplate and their users?" Saying "you should manage your dependencies with Bower" isn't really an answer to that question, since it sidesteps the most important part of the equation: what's the fastest default way to serve jQuery to end users?

For now, unless something's changed in the past few months, that remains the Google CDN.

@Ganginator
Copy link

I agree with that 100%.
Is there a reason the main http://code.jquery.com/jquery-1.10.2.min.js CDN is not used, versus Google's?
Either way, then document the last tested build and advise the user to consider the CDN versus a build?
Using the CDNs to get "out-of-the-box" functionality will make it easier for new people to adopt.

@roblarsen
Copy link
Member

If you're interested in this subject at a "why" level, there's a lot of detailed discussion of the issue in this very thread. Just start at the top and work your way down. The short answer is:

  1. The Google CDN has much higher penetration and is, therefore, more likely to serve a cache-lottery winner.
  2. The Google CDN supports https://; the jQuery CDN does not.

@tomByrer
Copy link

+1 for jQuery being fetched by Bower, tho.

Future jQuery versions will have their own AMD loader, though IMHO, since so many people use various jQuery plugins, I'm not sure that will save much bandwidth.

The Google CDN has much higher penetration and is, therefore, more likely to serve a cache-lottery winner

MAYBE 1.69%.
Almost as good a reason as your #2:
3. Google has multiple PoPs in many, many countries, including China IIRC.

@roblarsen
Copy link
Member

The question is a thin one. Why are we serving from Google and not some other CDN?

So, even though the odds aren't great (and pure penetration of any single library isn't the only factor), they're going to be better with Google than with a less popular CDN. If you read through this thread, you'll see that I think the cache lottery is a canard. It's just that, if the cache lottery matters to you at all, the Google CDN is a much better option than anything else, as it's got much higher penetration.

Of course, this is about the best default, not the best possible answer. The best possible answer can only be found by the individual developer testing options on their individual site or application.

@pankajparashar-zz
Copy link

I think @roblarsen nailed it with this statement -

Of course, this is about the best default, not the best possible answer.

+1

@tomByrer
Copy link

The best possible answer can only be found by the individual developer testing options on their individual site or application.

Yes, which is why I said "test test test" before. And why we need to be aware of the pros & cons.
As far as the 'cache lottery' goes, the technical reasons (number of PoPs, high uptime) are more likely to weigh heavier. It could be worth looking into if a large percentage of your site's traffic is referred from one particular site; then share the same CDN & version, if that does not break your other code.

@roblarsen
Copy link
Member

Pinterest would be a great candidate for that technique

@tomByrer
Copy link

Pinterest would be a great candidate for that technique

Unfortunately, they don't seem to use jQuery.js at all anymore; perhaps they concatenated it in. I was thinking more of stackoverflow/StackExchange, who use //ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js.
But then one has to re-test their site whenever they change their jQuery version... might not be worth it.

@roblarsen
Copy link
Member

Oh, they used to. They did when we had this discussion originally.

@robwierzbowski
Copy link

...Most H5BP projects use jQuery via Google CDN. (#oroboros #selfFulfillingProphecy)

@ghost ghost assigned alrra Jan 13, 2014
@alrra
Copy link
Member

alrra commented Jan 19, 2014

build step

H5BP tries to be relatively agnostic of any high-level development philosophy or framework, so that it can be used:

  • as a base on which other projects can be built
    (projects that can ultimately provide the build step)
  • even by less experienced developers
    (developers that don't use a build step or don't even know what one is)

Thank you all for your comments!

I'm sure there will be a similar discussion in the near future (especially as things such as SPDY / HTTP 2.0 get more traction), but for the time being, we'll stick with the CDN.

@alrra alrra closed this Jan 19, 2014
@roblarsen roblarsen mentioned this pull request Jan 23, 2014
roblarsen added a commit to roblarsen/html5-boilerplate that referenced this pull request Feb 24, 2014
As promised in h5bp#1498
Encapsulates the discussion from h5bp#1327
@alrra
Copy link
Member

alrra commented Apr 9, 2014

@igrigorik - Why is my CDN 'slow' for mobile clients?
http://www.igvita.com/2014/03/26/why-is-my-cdn-slow-for-mobile-clients/

@tomByrer
Copy link

tomByrer commented Apr 9, 2014

Not a really helpful article:

The root problem is that the last-mile latency of mobile carriers is atrocious - that's what we need to fix.

Like others & I said before: test if you want to tweak. The reason for this thread is whether the default (Google CDN) is sufficient, which it is. I'm an optimization geek (which is why I help jsDelivr), but I don't think worrying about POPs in the same rack as the mobile POPs is really in the scope of h5bp.


@robwierzbowski
Copy link

Just to reiterate: if all h5bp projects use the Google CDN for jQuery, and if many sites use h5bp, we'll at least be making each other faster.


@tomByrer
Copy link

tomByrer commented Apr 9, 2014

@robwierzbowski You must have missed this post; the odds of hitting the cache with the same version are dismal, esp. now, 10+ releases later...?
#1327 (comment)

tomByrer added a commit to tomByrer/html5-boilerplate that referenced this pull request Apr 9, 2014
Perhaps the rise in [AngularJS](http://angularjs.org/)'s popularity is the real cause of the rise in popularity of ajax.googleapis.com? Or maybe the "[Sites using Google Libraries API](http://httparchive.org/trends.php#perGlibs)" rise is caused by "[Sites with Custom Fonts](http://httparchive.org/trends.php#perFonts)"? Perhaps a revisit to see if [hitting the version lottery](h5bp#1327 (comment)) is in order, but I doubt it will prove it is worth mentioning that it "increases the odds of having a copy of the library in your user's browser cache".

Even though I'm involved with [jsDelivr CDN](http://www.jsdelivr.com/), and I see a lot of [cdnjs](http://cdnjs.com/) usage in the wild, I conclude Google's CDN is the best to use for h5bp. But please, an extra 1% chance of hitting cache isn't really worth it IMHO.
@robwierzbowski
Copy link

I have seen that, and I really appreciate the research. I wonder if there is a way to track a segment of users and see how many times over the course of a day they use the same version of jQuery across sites. It's possible user behavior could select for sites that have a smaller subset of jQuery versions (I'm a designer and I visit a lot of elitist web design blogs?), and I personally work on a small ecosystem of apps that can benefit from pulling the same version of jQuery between sites. And then there's cache lifetime: the longer a user's cache lives, the more likely they are to have a particular jQuery in their cache. How long does an average internet user need to surf until they have 80% of jQuery versions somewhere in their cache? The likelihood of a particular version being available is a curve that increases over time.
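
As a back-of-envelope sketch of that curve, assuming (unrealistically) independent site visits, no cache eviction, and a fraction p of sites embedding the exact same jQuery URL; p = 0.0169 echoes the "MAYBE 1.69%" figure quoted earlier in the thread:

// probability that a given jQuery URL is already cached after visiting n sites
function pCached(p, n) {
  return 1 - Math.pow(1 - p, n);
}
console.log(pCached(0.0169, 10));  // ~0.16
console.log(pCached(0.0169, 100)); // ~0.82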

Thanks for pointing out the numbers again, and I completely agree that it's a lottery. But I'm not convinced that those numbers alone settle the "should we use a CDN" question.

@tomByrer
Copy link

tomByrer commented Apr 9, 2014

It seems that with HTTP/1.1, CDNs allow more max connections than the local host alone (without local sharding, which is beyond the scope of h5bp, and which IMHO won't see the benefit of a separate server anyway).

The likelihood of a particular version being available is a curve that increases over time.

Perhaps 2 years ago, but devs don't upgrade versions much, so there is a huge range of versions out there. Plus, IMHO, script concatenation & there being 4-5 CDNs in use will reduce cache odds a bit more.

possible user behavior could select for sites that have a smaller subset of jQuery versions

I used to worry about that, until Pinterest dropped using Google's CDN (they now concatenate & CDN it themselves). If you are the web dev for a network of sites that link together, perhaps that can help. But if you're that big, you should really look into a caching CDN (e.g. Envato/Tuts+ uses CloudFlare).
Again, out of scope for this project.


Also, it seems there is a bit of backlash against extra DNS lookups, but:

  • thanks to other libs & versions using ajax.googleapis.com, the DNS entry is likely cached already
  • jQuery is a big lib; likely worth a DNS lookup

dwick added a commit to reddit-archive/reddit that referenced this pull request Oct 20, 2014
Also removes "load core JS libraries from reddit servers" as a preference, since it is no longer needed.

Using the Google CDN for jQuery adds very little benefit; see:
http://statichtml.com/2011/google-ajax-libraries-caching.html
and
h5bp/html5-boilerplate#1327