
Issue #460 -- removing the optional bang from hashStrip ... too many inconsistencies.
1 parent fcbedad commit 10230e4d76e21f08a1dee1fe5d28994e2cf5f11d @jashkenas committed Jul 5, 2011
Showing with 1 addition and 1 deletion.
  1. +1 −1 backbone.js
@@ -738,7 +738,7 @@
};
// Cached regex for cleaning hashes.
- var hashStrip = /^#*!?/;
+ var hashStrip = /^#*/;
// Cached regex for detecting MSIE.
var isExplorer = /msie [\w.]+/;
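The one-character diff above changes what gets stripped from the location hash before routing. A minimal sketch of the behavioral difference (variable names mirror the patch; the sample URL fragment is made up for illustration):

```javascript
// Before the change: strip leading "#" characters AND an optional "!",
// so "#!users/42" and "#users/42" both routed to "users/42".
var oldHashStrip = /^#*!?/;

// After the change: strip only leading "#" characters; a "!" survives
// and becomes part of the route itself.
var newHashStrip = /^#*/;

var withBang = '#!users/42';
var oldResult = withBang.replace(oldHashStrip, ''); // 'users/42'
var newResult = withBang.replace(newHashStrip, ''); // '!users/42'
```

In other words, after this commit a hashbang URL no longer silently maps to the bang-less route; apps that want `!` must declare it in their routes explicitly.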

9 comments on commit 10230e4

Wondering if you can elaborate on "too many inconsistencies"?

Owner

jashkenas replied Jul 14, 2011

Hmm, wish I had commented back when I remembered ... In general, it makes loading and saving URLs inconsistent. It's backwards-incompatible with pre-0.5.0, which used hashbangs. And you really shouldn't ever be using hashbangs anyway.

And you really shouldn't ever be using hashbangs anyway.

Sorry to be a pest, but could you point me towards why? I'm working on a project and I've never used #!'s, but doesn't Google suggest using them for SEO? I know HTML5 has pushState and whatnot, but this app has to work in IE7. Ugh. Thanks for any pointers.

Owner

jashkenas replied Jul 14, 2011

Google used to suggest using them for SEO, but you still have to implement _escaped_fragment_ on the server-side, which pretty much no one actually did.

Now that we have pushState, you can use that to provide true search-index-able URLs for modern browsers, and for the Googlebot, while IE gets hash-based URLs (without the cargo-culted !) ... which is fine, because IE7 is not a search engine.
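The fallback strategy described above can be sketched as a simple capability check; `routingMode` and the `win` parameter are hypothetical names for illustration, standing in for the browser's `window` object:

```javascript
// Sketch: pick real (pushState) URLs when the History API exists,
// and fall back to hash-based URLs for older browsers like IE7.
function routingMode(win) {
  return (win.history && win.history.pushState) ? 'pushState' : 'hash';
}

// A modern browser exposes history.pushState:
routingMode({ history: { pushState: function () {} } }); // 'pushState'

// An old browser does not, so hash-based routing is used instead:
routingMode({ history: {} }); // 'hash'
```

In a Backbone app this decision is typically made once at startup, when calling `Backbone.history.start` with or without the `pushState` option.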

Since Googlebot doesn't run JavaScript, isn't #! still needed even with pushState if you do any client-side rendering in JavaScript? In that case I believe you really do need #! to tell Googlebot to use the _escaped_fragment_ syntax to get a pre-rendered version of the page (via PhantomJS or such on the server). Would love to see the #! support come back to support this. Or maybe at least a way to optionally enable it?
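For context, under Google's (since-deprecated) AJAX crawling scheme the crawler rewrote hashbang URLs into query-string requests the server could answer with pre-rendered HTML. A simplified sketch of that rewrite; `toEscapedFragment` is a hypothetical helper name, and the real scheme's escaping rules are slightly more involved than a plain `encodeURIComponent`:

```javascript
// Sketch: rewrite "#!fragment" into "?_escaped_fragment_=fragment",
// the form the crawler actually requested from the server.
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: request the URL as-is
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

toEscapedFragment('http://example.com/#!users/42');
// 'http://example.com/?_escaped_fragment_=users%2F42'
```

The point of the discussion above is that with pushState URLs this rewrite step is unnecessary: the crawler can request the real URL directly.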

Owner

jashkenas replied Aug 15, 2014

That's incorrect. You can use real URLs, pushState, and pre-rendered versions of the page to give the GoogleBot content — and your users as well. Also, the Googlebot does run JavaScript (some of it). Finally, you can always put ! in your routes if you really want to. But you shouldn't want to.

True, you could pre-render pages for both - in my case I'm not pre-rendering for users, though, since it's a data-driven SPA and I'm only re-rendering parts of the page. Adding ! to routes is a reasonable solution though - already did that, in fact :-) Thanks

Owner

jashkenas replied Aug 15, 2014

Is your server actually currently serving _escaped_fragment_ pages?

Right, it uses PhantomJS to load the page, lets the JavaScript run for a bit, then returns the plain HTML to Googlebot.
