Remove Google CDN reference for jQuery #1327

Closed · wants to merge 1 commit
@localpcguy (Contributor) commented Mar 6, 2013

http://statichtml.com/2011/google-ajax-libraries-caching.html

According to that, there is very little benefit to using the
Google CDN to serve the jQuery source. A better option would be to
encourage devs to minify and concatenate all of their script files
into one JS file served with a long-lived cache time.

This was referenced by Alex Sexton (@SlexAxton) in his talk at
jQueryTO March 2nd, 2013.

Contributor

localpcguy commented Mar 6, 2013

I saw this brought up once before but with different reasoning, so it seemed better to open a new issue rather than append to the older one. There may be other reasons to choose the CDN, but if it is chosen strictly for performance, this article seems to indicate that won't pay off in the vast majority of cases.

Member

paulirish commented Mar 7, 2013

Unless h5bp required a build step to use it, I don't think we can do this.
The best thing we can do is strongly promote build tools.

Also, looking forward: Bower + AMD + r.js will likely have a future role in solving this issue.

Member

necolas commented Mar 7, 2013

I've brought this up before in other issues. I'm in favour of this change because even without a build step, using the CDN doesn't seem to provide significant benefit. But this change would need more work:

  • Remove the extra comment about "why we're not using the CDN".
  • Move mention of the reason to the JS docs file.
  • Include the unminified version of jQuery.
  • Include the jQuery source map file.
  • Include a line about the change in the CHANGELOG.
Member

roblarsen commented Mar 7, 2013

That's cool; that's great analysis, and what I expected to see in terms of the spread of the URLs.

I think we ended up sticking with the CDN version, not because of the cache lottery, but because serving jQuery from the Google CDN (faster than the average web server, even when it's not cached) is a slightly better default than serving two files from one (slower) server, especially if there's a large geographical spread. Meaning: if we could guarantee that people were using a build tool, the best default would be removing the CDN and serving an optimized single file (or some script-loader solution, whatever); but since we can't guarantee that, the CDN is likely a slightly better default configuration, assuming nothing else happens after the files leave the repo.

For my money, the first thing I do is delete the link to the CDN, but I'm not the target for this particular line of code.

Member

roblarsen commented Mar 7, 2013

Speaking of which, I should rerun the tests I ran a few years ago. WebPagetest probably has a million more nodes now; more data 👍

Contributor

alfredxing commented Mar 7, 2013

I like the Google CDN. It offers speed that isn't dependent on the user's own hosting and server (so if they host a Boilerplate site on a crappy server, loading jQuery won't take long). And Google's servers should be among the best in terms of uptime and latency.

Member

roblarsen commented Mar 7, 2013

One interesting thing that analysis leaves out is weighting by the size of the sites using the CDN. For example, if a site like Facebook were using a file from the CDN, the odds would be astronomically higher that their specific version would be cached, even if they were hosting a unique version. So while the simple distribution of URLs is interesting (and, I think, telling), it's only part of the picture. I mean, if you have a Pinterest-friendly audience, your odds of http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js being in the cache are better than they would be otherwise.

Contributor

SlexAxton commented Mar 7, 2013

In the original write-up by @zoompf, he mentions that even against your own server, in an uncached state, the overhead of an extra HTTP request doesn't beat just building jQuery into the same file. And further, the DNS lookup to Google adds to the issue. The fastest downloads of jQuery in every test I've seen have always been critical page-load resources all in a single file, loaded asynchronously in the head. This means you don't have multiple HTTP requests, and you don't have an additional DNS resolution (which is slow based partly on your provider, not just on how fast Google is).
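The pattern described here (one bundled file, loaded asynchronously from the head) would look roughly like the following; the bundle name is illustrative:

```html
<head>
  ...
  <!-- all critical page-load JS, built into a single file;
       one request, no extra DNS lookup -->
  <script src="js/site.min.js" async></script>
</head>
```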

Member

roblarsen commented Mar 7, 2013

I quickly ran some tests with WebPagetest (10 runs in Chrome). The default results hold. If you do nothing to the files and just serve them as-is (multiple script files), using the Google CDN is faster by default, especially if you have a global audience. The geographical advantage that Google brings continues to matter.

location            one server    google cdn
Dulles, USA         1.651         1.275
Jiangsu, China      3.484         2.357
São Paulo           2.663         2.019
Delhi               4.298         2.977
LA                  2.108         1.740
-------------------------------------------
AVG                 2.8408        2.0736

Of course, we don't want people serving multiple files. But if they do, based on this slice, serving it off of the CDN is faster, by default.
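For context, the default under discussion is H5BP's CDN-first include with a local fallback, roughly like this (the version number is illustrative):

```html
<!-- Try the Google CDN first; if it fails, fall back to the local copy -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.9.1.min.js"><\/script>')</script>
```

If the CDN request fails, the local copy kicks in, so the default degrades gracefully even on a slow host.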

@amenadiel

amenadiel Mar 7, 2013

I've tried many of the available options, including a bunch of separate scripts, concatenating them, and async-loading them with RequireJS, head.js, and lazyload.js.

Much to my surprise, the best ratio of speed to not-breaking-the-hell-outta-the-layout was with the Google JS API, which makes me stick to the CDN's version 1.7.1.


Contributor

SlexAxton commented Mar 7, 2013

I'm not sure common research agrees with your results. Do you have some benchmarks we could see? What does 1.7.1 have to do with this?

I'm not terribly interested in measuring the total speed of a site with no builds. It's a pretty easy win to put your files together, and if you aren't doing that, you likely don't really care about optimizing your speed.

I think the project should default to making it easiest to build files together in production.

Member

roblarsen commented Mar 7, 2013

> I'm not sure common research agrees with your results. Do you have some benchmarks we could see? What does 1.7.1 have to do with this?

1.7.1 is the version of jQuery on the Ajax CDN used by Pinterest? Beetlejuice?

Removed Google CDN reference for jQuery
http://statichtml.com/2011/google-ajax-libraries-caching.html

According to that, there is very little benefit to using the
Google CDN to serve the jQuery source. A better option would be to
encourage devs to minify and concatenate all of their script files
into one JS file served with a long-lived cache time.

This was referenced by Alex Sexton (@SlexAxton) in his talk at
jQueryTO March 2nd, 2013.

Also:
* Included the unminified version of jQuery (in the /js/vendors folder)
* Included the jQuery source map file (in the /js/vendors folder)
* Included the changes in the CHANGELOG

@localpcguy localpcguy closed this Mar 7, 2013

@localpcguy localpcguy reopened this Mar 7, 2013

Contributor

localpcguy commented Mar 7, 2013

I updated the file to more closely match what @necolas listed as needed. I'm not pushing hard for this to be included (the lack of a build script seems like a pretty valid reason not to), but I do think it needs to be considered.

@amenadiel

amenadiel Mar 7, 2013

@localpcguy I guess my explanation was too brief. What I meant is:

  1. Serving all my scripts minified together with jQuery brought no noticeable speed increase over requesting Google's CDN jQuery and my scripts separately.
  2. Asynchronous loading, in turn, brought a performance increase, but
  3. Most popular async loaders, including RequireJS with bundled jQuery, HeadJS, and LazyLoadJS, broke my layout. I don't know why, and after a few hours of debugging I decided to try the Google JS API, and
  4. The Google JS API brought me the speed of async loading and didn't break my layout, but you depend on what version of jQuery the API will offer. In my case, when I request version 1 (as in "latest release from the 1.x series") Google sends you to 1.7.1. You can manually choose another one, but you'll have to cope with shorter cache headers.
<script type="text/javascript" src="//www.google.com/jsapi"></script>
<script type="text/javascript">
//<![CDATA[
    google.load("jquery", "1");
//]]>
</script>

So yes, I agree with you that plain loading from the Google CDN isn't the best approach, but I'm throwing in my two cents: on a Linode VPS, the overhead of serving jQuery myself outweighs any performance benefit, and I'd recommend an async loader instead.


Contributor

localpcguy commented Mar 7, 2013

@amenadiel No, I understood what you were saying. I'm just not sure your example isn't unique to your specific setup, as other testing has shown otherwise. And aren't you making two requests then to load jQuery (one for the Google loader code and then another for the jQuery code itself)? Fine if you plan to use the Google Loader for other things, but probably not best standard practice.

Also, as an aside, you should always request a specific version of jQuery rather than the 1.x "latest", because of 1) the potential for breakage when your site gets auto-updated to a jQuery release with breaking changes, and 2) you also won't get a long-lived cached version, which is one of the "benefits" being discussed here as regards loading scripts from a CDN using a standard script tag and a specific version. (See http://paulirish.com/2009/caching-and-googles-ajax-libraries-api/ - although that is a few years old now, my understanding is the caching information is still valid.)
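Concretely, per the caching article above, only a fully pinned URL gets a far-future Expires header; a partial-version alias is cacheable for only about an hour:

```html
<!-- Fully pinned version: served with a far-future Expires header (~1 year) -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>

<!-- Version alias ("latest 1.x"): cacheable for ~1 hour, and may change under you -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
```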

@tomByrer

tomByrer Mar 12, 2013

To bounce off the research from @roblarsen:
StackExchange.com uses http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js as well (& they rarely upgrade).
GuardianNews.com uses http://ajax.googleapis.com/ajax/libs/jquery/1.8.1/jquery.min.js (I think they did a refresh recently, so I don't see an upgrade coming soon).

Microsoft.com uses http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.2.min.js, as does Skype.com (MS allows public use of aspnetcdn).
Hat tip: http://w3techs.com/technologies/details/js-jquery/all/all

In the end, I have to agree with @paulirish: build tools are the best bet, though the Google/MS CDNs are the safest bet.


Member

roblarsen commented Mar 12, 2013

FWIW, I'm working with one of my old colleagues from Sapient and looking at this beyond pure distribution in the HTTP Archive dataset. Hopefully something interesting will shake out.

ghost commented Mar 12, 2013

Let's consider a more fundamental analysis. I can think of three reasons for a website wanting to download jQuery or other common libraries from one of these CDNs:

  1. You want to reduce bandwidth costs.
  2. You want to improve page load times by leveraging an essentially free CDN that (hopefully) moves content geographically closer to the end user.
  3. You want to try to take advantage of network effects and have jQuery already be in the user's cache.

So, do JavaScript CDNs actually do any of this?

For #1, yes, bandwidth will be reduced. The actual economic savings, however, may only be applicable to fairly large websites.

#2 is debatable. Creating an HTTP connection to a third party is expensive: DNS lookup, three-way handshake, slow start. This has even more impact on mobile. Furthermore, as SPDY is adopted, I think the value of this goes down, since the existing connection between the client and server is used even more efficiently.

#3 is doubtful. The fragmentation of URLs used to access these shared libraries is just horrible (http://statichtml.com/2011/google-ajax-libraries-caching.html). Combined with the small cache size of today's browsers, I just don't think it is reasonable to build a shared library system on top of HTTP caching.

Finally, remember we are jumping through all these crazy hoops just to save ~20KB. Put it into your build system and be done with it.

Billy Hoffman
Founder and CTO,
Zoompf - Making Websites Faster


Contributor

alfredxing commented Mar 12, 2013

I think we should use the CDN and add a note in the documentation that it can be removed.
This should be up to the user to decide, especially with regard to #2: if their website is geographically targeted to a local area (for example, a city) and their server is located within that area, it would be a good idea to host jQuery locally. However, if their website is accessed globally, the CDN reference would be a good idea.

@tomByrer

tomByrer Mar 13, 2013

I think that @zoompf makes some great points. Please let me add some additional thoughts.

#1: Reducing bandwidth is also important for free/"unlimited" hosts, since they are not so reliable, especially if a blog post link blows up on reddit, etc.

#2: Yes, handshaking must be factored in. Then again, most browsers can do something like 8 simultaneous connections?

#3: I would say #3 is very valid. (IIRC, I looked into using the same CDN jQuery version that Twitter used about a year ago, but now it seems they compress all their JavaScript into a handful of scripts & host those themselves; hitting a moving target is not ideal.)

Either way, I'd say test, test, test.

But really, who is the target market for html5-boilerplate anyway? Web folks who just want a simple framework to slap together a website really fast for a friend, or who want to build a page to showcase a new library they just wrote? Or a design team contracted by a major corporation with deep pockets to rework their existing site?

I would think many if not most can change one line of code; so I think alfredxing's suggestion is the best.


ghost commented Mar 13, 2013

I disagree. I don't think we have enough data to show that, for a web server in NY and a browser in San Jose, the page load time is slower when downloading jQuery from NY if it is combined with the other JavaScript libraries downloaded from NY. In other words, I haven't seen real-world numbers showing that (jQuery + other JS from NY) is slower than (jQuery from a local CDN node and everything else from NY). All the "testing" discussed so far seems very academic, under non-representative conditions.

Billy Hoffman
Founder and CTO,
Zoompf - Making Websites Faster

Contributor

localpcguy commented Mar 13, 2013

I can't speak for everyone, @tomByrer, but my view of the target market for H5BP is devs who are looking for current best practices. I don't view it as a "quick start" guide, necessarily, but rather as a great reference that the community keeps up to date with the latest best practices. That's just the way I see it; I don't speak for the maintainers, obviously.

Contributor

davidmurdoch commented Mar 13, 2013

I feel like our original discussion on the matter should be linked to for reference: #252.

TL;DR:

I think the resolution was that you should do whatever works best for you, as results may vary from project to project, depending on server location(s) and speed.

Run some A/B benchmarks on your sites then make an informed decision as to the optimal solution for each.

<speculation> I'd wager that most websites that include jQuery do not use a CDN for their static resources; I'm not talking about top sites, i.e., Alexa top 200, which often skew decisions like these, but your average mom-and-pop site.</speculation>
I wonder if Steve Souders has data on this? Any educated thoughts, @stevesouders (hopefully I'm pinging the right Steve Souders)?

alfredxing (Contributor) commented Mar 13, 2013

I agree with @zoompf, @tomByrer, and @localpcguy on their latest comments.
However, I don't think Boilerplate has or should have a 'target market'. This is open source code and as such it should work best for everybody as an aggregate.

Schepp commented Mar 13, 2013

@davidmurdoch Steve favors CDNs, as can be read here: http://www.stevesouders.com/blog/2011/08/17/http-archive-nine-months/

But as far as I know, he never set up a real-world test for this.

I argued against it in the comments, basically using @zoompf's methodology of calculating the potential "hit rate" of a CDN regarding a pre-cached jQuery library. Which is low.

I am for including simple on-the-fly concatenation tools for a bunch of server-side languages, bundled up with H5BP. This could be limited to the mainstream languages (like you did include server-side config for Apache and NGINX only). That would bring a lot of improvement to the average website and, due to the enormous number of such sites, push the performance of the web a huge step forward.

And then leave the build tool in there as the better option for experienced developers.
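The concatenation step Schepp describes is tiny; a minimal sketch of a build-time version (file names and layout here are illustrative, not H5BP's actual structure):

```python
import os
import tempfile

def concatenate(sources, destination):
    """Join several JS files into one, in order, so a page makes a single
    <script> request instead of one per file."""
    with open(destination, "w") as out:
        for path in sources:
            with open(path) as src:
                # The trailing ';' guards against files that omit a final semicolon.
                out.write(src.read().rstrip() + ";\n")

# Usage with throwaway files standing in for a project's scripts:
workdir = tempfile.mkdtemp()
plugins = os.path.join(workdir, "plugins.js")
main = os.path.join(workdir, "main.js")
with open(plugins, "w") as f:
    f.write("var a = 1")
with open(main, "w") as f:
    f.write("var b = 2")

bundle = os.path.join(workdir, "all.js")
concatenate([plugins, main], bundle)
with open(bundle) as f:
    bundled = f.read()
# bundled == "var a = 1;\nvar b = 2;\n"
```

The serving side then only needs a long-lived cache header on the single bundle; minification would be a separate step on top.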

anselmh (Contributor) commented Mar 13, 2013

I do think H5BP should not push users too hard toward a build tool. Most web developers don't know how to deal with tools like grunt.js, and they don't want to. Forcing them would lead them to other frameworks they actually are able to use without a build system.
It still makes sense to write about using a build tool in the documentation and to encourage its use by writing down its advantages.

For the CDN/non-CDN thing: I am not sure. For local sites it makes total sense to serve local jQuery. For world-wide sites it still might make sense to use your own jQuery, especially when we're talking about the modular new jQuery. But again, this requires web developers to use build tools, so we can say: H5BP users normally use it as is, which means full jQuery.

What was left out in prior tests here: you cannot only test Chrome for HTTP requests, as this browser does a pretty good job here. Make sure you test it on IE and other older browsers which can't deal with 12 HTTP requests or more in parallel.

I am in favor of leaving this to tools like Initializr, which could offer a selection between CDN and non-CDN'd jQuery. (But that's just my little note here.)

Schepp commented Mar 13, 2013

@anselmh All browsers nowadays do 6 parallel requests per host on HTTP (HTTPS can be different). The differences between newer and older browsers are more subtle, like DNS prefetching or better connection pool management.

Back to topic: on globally targeted sites you should rather use a CDN for all of your static assets. And only one. Basically meaning that even here you should concatenate first, and then you can have e.g. an origin-pull CDN distribute the result for you.

The jQuery CDN only makes sense as being that one notch better than doing nothing. That's all the benefit I see.

I totally agree with your argument that even the best build tool is still one step too much for the average developer.

tomByrer commented Mar 13, 2013

Just found a solid reference for parallel requests; it seems the newest major browsers can actually handle 9-17 MAX connections (by default, give or take). However, if you limit to the same hostname, you are down to 6-8 connections. http://www.browserscope.org/?category=network
Notable: IE actually has the greatest number of (default) MAX connections; it seems IE 8 & 9 have more than 10.

I guess one can trick a browser into using MAX connections by using www.myslowhost.com, img.myslowhost.com, js.myslowhost.com, etc. Though IMHO an alt CDN is more ideal. Most ideal is a single concatenated & compressed file, like Schepp & others suggested (I think we all agree?). For the default H5BP, my gut feeling is to still use Google's CDN by default, perhaps with a comment that links to better tools &/or docs that someday will be here?

davidmurdoch (Contributor) commented Mar 13, 2013

The first paragraph of Steve Webster's conclusion is a non-sequitur; just because there is a slim chance of hitting the cache lottery does not negate the benefits of Google's CDN. The benefits of using Google's CDN are (free) geographic dispersion, mitigating bandwidth, and parallelization — the cache is just a bonus.

wordgame commented Mar 13, 2013

Another thing to consider is that, due to the widespread practice of loading jQuery from the Google CDN, most people already have various copies and versions of jQuery in their cache, ready to be retrieved locally, meaning that no request will be needed at all to get jQuery when they come to your site.

+1 for keeping Google CDN jQuery.

Roope commented Mar 13, 2013

Amen to this, wordgame!

Isn't this the whole point of CDN-hosted libraries?

sarukuku commented Mar 13, 2013

+1 for keeping the Google CDN jQuery

davidmurdoch (Contributor) commented Mar 13, 2013

@wordgame, you should read the article in the first post.

Schepp commented Mar 13, 2013

@wordgame @Roope @sarukuku Yeah, please read the article. We try to keep this area facepalm-free ;)

kdimatteo commented Mar 13, 2013

@wordgame, @sarukuku This is exactly what we're trying to measure -- the chances of hitting the cache lottery.

Preliminary research into data collected by the HTTP Archive (http://httparchive.org/) indicates that v1.9.1 (which is used in the boilerplate) ranks very low.

In fact, based on requests from the top 1M sites, the most common version of jQuery is 1.4.2 (which supports the article shared by @localpcguy).

@roblarsen and I are working on some additional detail, possibly including cache lifespan, to get a clearer picture of what the cache-lottery odds are.
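The "cache-lottery odds" under discussion are just a product of independent shares; a back-of-the-envelope sketch, where every input number below is an illustrative guess rather than measured data:

```python
# Rough odds that a first-time visitor already has *your exact* jQuery URL
# in their browser cache. All three inputs are hypothetical placeholders,
# not real measurements from the HTTP Archive or anywhere else.
share_using_google_cdn = 0.18   # sites that load jQuery from Google's CDN
share_same_version = 0.05       # of those, sites on your exact version/URL
cache_still_fresh = 0.40        # chance the cached copy hasn't expired or been evicted

p_hit = share_using_google_cdn * share_same_version * cache_still_fresh
print(f"Estimated cache-hit probability: {p_hit:.2%}")  # → 0.36%
```

The point of the exercise: because the probability is a product of several small fractions, fragmentation across jQuery versions drives the odds down quickly.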

cowboy commented Mar 13, 2013

+1; the concept of static JS hosted by a CDN is incredibly dated.

roblarsen (Member) commented Mar 13, 2013

> The benefits of using Google's CDN are (free) geographic dispersion, mitigating bandwidth, and parallelization — the cache is just a bonus.

To this end, and since the question here is what's the best default, can we just test this thing? Assuming the cache lottery is a non-issue (and hopefully @kdimatteo and I can build on the article linked in the thread to put it completely to bed), we can still test the two states to see what we're actually talking about. I shared some numbers, but people in the community have access to many servers in many geographical locations to put some numbers to this. No more finger in the air to see which way the wind blows.

We could test three things:

  1. The default: loading jQuery from the Google CDN
  2. Loading jQuery from the server
  3. Loading one concatenated file from the server (since this is the final state we're advocating people reach)

We could also test:

  4. Serving one concatenated file from a CDN, since that should be the best of both worlds.

Test with jsPerf and save the results in a spreadsheet on Google Docs (or we could be fancy and do a Google Analytics web timing thing).

Garbee commented Jun 26, 2013

Appcache is a bad thing right now. For instance, if any resources are updated, the change won't take effect until the next reload.

There needs to be a new revision, which is being discussed, to fix some issues in it before it will be widely used. appcachefacts.info has numerous details on the current spec; some parts are fine, others are just bad.

At this time, appcache is a bad idea to recommend for people to use in general, in my opinion.

tomByrer referenced this pull request in moment/momentjs.com Jul 8, 2013

@FagnerMartinsBrack FagnerMartinsBrack referenced this pull request in zenorocha/browser-diet Jul 25, 2013

Closed

CDN (Content Delivery Network) #197

@kwaledesign kwaledesign referenced this pull request in kwaledesign/generator-archetype-jekyll Sep 24, 2013

Open

Decide what to do with jQuery #31

zenorocha (Member) commented Dec 13, 2013

I think front-end dependencies should be managed via Bower; not only jQuery but also Modernizr can be fetched.

patrickkettner (Contributor) commented Dec 13, 2013

Modernizr is not planning to go via bower - the variance required for every user makes a build step much more attractive.

arthurgouveia commented Dec 13, 2013

I was planning to ask that. H5BP uses a customized version of it. I'm not 100% sure about what was changed but I'm guessing there's no way ATM for bower to request it with appropriate flags or any sort of thing to fetch with what H5BP judges as appropriate.

+1 for the jQuery being fetched by bower, tho.

patrickkettner (Contributor) commented Dec 13, 2013

It uses the main modernizr.js file. That is going away in the yet-to-be-released 3.0. It has yet to be determined what is going to happen after that.

If you were to load jQuery via Bower, you might as well do a custom build of it. I would be -1 on both requests, in favor of advocating proper build steps.

Ganginator commented Dec 13, 2013

Proper build steps are a good thing, as long as there is documentation helping the user walk down the correct path. The worst thing to do is to remove common necessities, and then leave the user lost without any direction.

roblarsen (Member) commented Dec 13, 2013

Discussion of Bower here misses the point of the discussion, I think. The question is "what's the best default way to include jQuery for consumers of HTML5 Boilerplate and their users?" Saying "you should manage your dependencies with Bower" isn't really an answer to that question, since it sidesteps the most important part of the equation: what's the fastest default way to serve jQuery to end users?

For now, unless something's changed in the past few months, that remains the Google CDN.

Ganginator commented Dec 13, 2013

I agree with that 100%.
Is there a reason the main http://code.jquery.com/jquery-1.10.2.min.js CDN is not used, versus Google's?
Either way, then document the last tested build, and advise the user to consider a CDN versus a build?
Using the CDNs to get "out-of-the-box" functionality will make it easier for new people to adopt.

roblarsen (Member) commented Dec 13, 2013

If you're interested in this subject at a "why" level, there's a lot of detailed discussion of the issue in this very thread. Just start at the top and work your way down. The short answer is:

  1. The Google CDN has much higher penetration and is, therefore, more likely to serve a cache-lottery winner
  2. The Google CDN supports https://; the jQuery CDN does not
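For reference, the default being defended here is H5BP's protocol-relative Google CDN reference with a local fallback; a rough sketch of that pattern (the version number and local path are illustrative):

```html
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.9.1.min.js"><\/script>')</script>
```

The protocol-relative `//` prefix lets the same markup work over both http:// and https://, and the second script falls back to a local copy if the CDN request fails.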
tomByrer commented Dec 24, 2013

> +1 for the jQuery being fetched by bower, tho.

Future jQ versions will have their own AMD loader. Though IMHO so many people use various jQuery plugins, I'm not sure if that will save much bandwidth.

> Google CDN has much higher penetration and is, therefore, more likely to serve a cache-lottery winner

MAYBE 1.69%
Almost as good a reason as your no. 2:
3. Google has multiple PoPs in many, many countries, including China IIRC.

roblarsen (Member) commented Dec 24, 2013

The question is a thin one. Why are we serving from Google and not some other CDN?

So, even though the odds aren't great (and pure penetration of any single library isn't the only factor) they're going to be better with Google than with someone else who is less popular. If you read through this thread, you'll see that I think the cache lottery is a canard. It's just that, if the cache lottery matters to you at all, the Google CDN is a much better option than anything else as it's got much higher penetration.

Of course, this is about the best default, not the best possible answer. The best possible answer can only be answered by the individual developer testing options on their individual site or application.

pankajparashar-zz commented Dec 24, 2013

I think @roblarsen nailed it with this statement:

> Of course, this is about the best default, not the best possible answer.

+1

tomByrer commented Dec 24, 2013

> The best possible answer can only be answered by the individual developer testing options on their individual site or application.

Yes, which is why I said "test, test, test" before. And why we need to be aware of the pros & cons.
As far as the 'cache lottery' goes, the technical reasons (number of PoPs, high uptime) are more likely going to weigh heavier. It could be worth looking into: if a large percentage of your site's traffic is referred from a single particular site, then share the same CDN & version, if that does not break your other code.

roblarsen (Member) commented Dec 24, 2013

Pinterest would be a great candidate for that technique

tomByrer commented Dec 24, 2013

> Pinterest would be a great candidate for that technique

Unfortunately, they don't seem to use jQuery.js at all; perhaps it's concatenated. I was thinking more of stackoverflow/StackExchange, which uses //ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js.
But then one has to re-test their site whenever they change their jQuery version... might not be worth it.

roblarsen (Member) commented Dec 24, 2013

Oh, they used to; they did when we had this discussion originally.

robwierzbowski (Contributor) commented Dec 30, 2013

...Most H5BP projects use jQuery via Google CDN. (#oroboros #selfFulfillingProphecy)

@ghost ghost assigned alrra Jan 13, 2014

alrra (Member) commented Jan 19, 2014

> build step

H5BP tries to be relatively agnostic of any high-level development philosophy or framework, so that it can be used:

  • as a base on which other projects can be built
    (projects that can ultimately provide the build step)
  • even by less experienced developers
    (developers that don't use a build step or even know what that is)

Thank you all for your comments!

I'm sure there will be a similar discussion in the near future (especially as things such as SPDY / HTTP 2.0 get more traction), but for the time being, we'll stick with the CDN.

@alrra alrra closed this Jan 19, 2014

@roblarsen roblarsen referenced this pull request Jan 23, 2014

Closed

Update html.md #1498

roblarsen added a commit to roblarsen/html5-boilerplate that referenced this pull request Feb 24, 2014

Extended comments about use of jQuery CDN
As promised in #1498
Encapsulates the discussion from #1327


tomByrer commented Apr 9, 2014

Not a really helpful article:

> The root problem is that the last-mile latency of mobile carriers is atrocious - that's what we need to fix.

Like others & I said before: test if you want to tweak. The reason for this thread is whether the default (Google CDN) is sufficient, which it is. I'm an optimization geek (which is why I help jsDelivr), but I don't think worrying about POPs in the same rack as the mobile POPs is really in the scope of h5bp.

On Wed, Apr 9, 2014 at 11:49 AM, Cătălin Mariș (notifications@github.com) wrote:

> @igrigorik (https://github.com/igrigorik) - Why is my CDN 'slow' for mobile clients? http://www.igvita.com/2014/03/26/why-is-my-cdn-slow-for-mobile-clients/

@robwierzbowski
Contributor

robwierzbowski commented Apr 9, 2014

Just to reiterate: if all h5bp projects use the Google CDN for jQuery, and if many sites use h5bp, we'll at least be making each other faster.

@tomByrer
tomByrer commented Apr 9, 2014

@robwierzbowski You must have missed this post; expecting hitting the cache with same version is dismal, esp now 10+ releases later...?
#1327 (comment)

tomByrer added a commit to tomByrer/html5-boilerplate that referenced this pull request Apr 9, 2014

removed inconclusive proof for Google CDN caching
Perhaps the rise in [AngularJS](http://angularjs.org/)'s popularity is the real cause of the rise in popularity of ajax.googleapis.com? Or maybe the rise in "[Sites using Google Libraries API](http://httparchive.org/trends.php#perGlibs)" is caused by "[Sites with Custom Fonts](http://httparchive.org/trends.php#perFonts)"? Perhaps a revisit to see if [hitting the version lottery](h5bp#1327 (comment)) is in order, but I doubt it will prove it is worth mentioning "increases the odds of having a copy of the library in your user's browser cache".

Even though I'm involved with [jsDelivr CDN](http://www.jsdelivr.com/), and I see a lot of [cdnjs](http://cdnjs.com/) usage in the wild, I conclude Google's CDN is the best one for h5bp to use. But an extra 1% chance of hitting the cache isn't really worth mentioning, IMHO.

@tomByrer tomByrer referenced this pull request in roblarsen/html5-boilerplate Apr 9, 2014

Closed

removed inconclusive proof for Google CDN caching #1

@robwierzbowski
Contributor

robwierzbowski commented Apr 9, 2014

I have seen that, and I really appreciate the research. I wonder if there is a way to track a segment of users and see how many times over the course of a day they request the same version of jQuery across sites. It's possible user behavior selects for sites that share a smaller subset of jQuery versions (I'm a designer and I visit a lot of elitist web design blogs?), and I personally work on a small ecosystem of apps that can benefit from pulling the same version of jQuery between sites. And then there's cache lifetime: the longer a user's cache lives, the more likely they are to have a particular jQuery version in it. How long does an average internet user need to surf until they have 80% of jQuery versions somewhere in their cache? The likelihood of a particular version being available is a curve that increases over time.

Thanks for pointing out the numbers again, and I completely agree that it's a lottery. But I'm not convinced that those numbers alone settle the "should we use a CDN" question.
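[Editor's note: the cache-lifetime argument above can be put in rough numbers. All shares and counts below are made-up assumptions for illustration, not measurements: if a fraction `p` of CDN-backed sites serve your exact jQuery version, the chance a visitor picked it up somewhere grows with the number of such sites visited, ignoring cache eviction entirely, which is generous.]

```javascript
// Back-of-the-envelope sketch of the "version lottery" (illustrative only).
// p: assumed fraction of CDN-backed sites serving your exact jQuery version.
// sitesVisited: CDN-backed sites seen since the user's cache was last cleared.
// Assumes nothing is ever evicted from the cache, which is optimistic.
function oddsAlreadyCached(p, sitesVisited) {
  // Probability that at least one visited site served that exact version:
  // 1 - (1 - p)^n
  return 1 - Math.pow(1 - p, sitesVisited);
}

console.log(oddsAlreadyCached(0.05, 1).toFixed(2));  // "0.05"
console.log(oddsAlreadyCached(0.05, 20).toFixed(2)); // "0.64"
console.log(oddsAlreadyCached(0.05, 50).toFixed(2)); // "0.92"
```

Under these optimistic assumptions the curve does rise quickly; the dispute in this thread is over how small `p` really is given the dozens of jQuery versions in the wild, and how short real cache lifetimes are.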

@tomByrer
tomByrer commented Apr 9, 2014

Seems with HTTP1.1, CDNs allow more "Max Connections" than local host alone (without sharding local, which is beyond scope of h5bp, & IMHO won't see the benefit of a separate server.).

The likeliness of a particular version being available is a curve that increases over time.

Perhaps 2 years ago, but devs don't upgrade versions much, so there is a huge range of versions out there. Plus IMHO script collation & there being 4-5 CDNs in use will reduce cache odds a bit more.

possible user behavior could select for sites that have a smaller subset of jQuery versions

I used to worry about that, until Pintrest dropped using Google's CDN (they now collate & CDN themselves). If you are the web dev for a network of sites that link together perhaps that can help. But if you're that big, you should really look into a caching CDN (eg Envato/Tuts+ uses CloudFlare).
Again, out of scope for this project.


Also, seems there is a bit of backlash against extra DNS lookups, but:

  • Thanks to other libs & versions using ajax.googleapis.com DNS entry is likely cached already
  • jQuery is a big lib; likely worth a DNS lookup.

dwick added a commit to reddit-archive/reddit that referenced this pull request Oct 20, 2014

Add jquery/html5shim directly to bundle.
Also removes "load core JS libraries from reddit servers" as
a preference since it is no longer needed.

Using the google cdn for jquery adds very little benefit, see:
http://statichtml.com/2011/google-ajax-libraries-caching.html
and
h5bp/html5-boilerplate#1327
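[Editor's note: the pattern being removed in commits like this one is the CDN-reference-with-local-fallback bootstrap. A minimal sketch of the decision it makes, with the fallback check modeled as a pure function; file paths and the version number are illustrative, not the exact h5bp markup:]

```javascript
// Sketch of the CDN-with-local-fallback check (illustrative paths/version).
// In the page it is typically a one-liner placed right after the CDN <script> tag:
//   window.jQuery || document.write('<script src="js/vendor/jquery-1.11.0.min.js"><\/script>')
// Modeled here as a pure function so the logic is easy to see.
function jqueryFallbackSrc(windowLike, localPath) {
  // If the CDN copy loaded, window.jQuery is defined and no fallback is needed.
  return windowLike.jQuery ? null : localPath;
}

console.log(jqueryFallbackSrc({}, 'js/vendor/jquery-1.11.0.min.js'));             // CDN failed: returns the local path
console.log(jqueryFallbackSrc({ jQuery: {} }, 'js/vendor/jquery-1.11.0.min.js')); // CDN loaded: returns null
```

Bundling jQuery directly, as this commit does, trades the (debated) shared-cache odds for one fewer request, one fewer DNS lookup, and no fallback logic at all.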

@beneverard beneverard referenced this pull request in theideabureau/bureau-style-guide Jul 30, 2015

Closed

Investigate using local jQuery instead of Google CDN #27

@tomByrer tomByrer referenced this pull request in jsdelivr/jsdelivr Nov 17, 2015

Closed

Use Zopfli for compression #7928
