Well, by now, you've done all your arguing on Twitter. Is Turbolinks a good idea, or the Worst Thing Ever?
optimizing a bit early
Donald Knuth said it best:

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."
But what makes an optimization premature? Doing it before you know whether you need it at all. So how do you find out?
measure twice, cut once
Measuring. It's good for you. You can do it. If you measure things, you can be sure what's up.
But like eating your veggies, nobody measures. Ever.
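Measuring in Ruby takes about five lines with the standard library's Benchmark module. Here's a minimal sketch — the two sum methods are made-up stand-ins, not anything from this test:

```ruby
require "benchmark"

# Hypothetical slow path: iterate and accumulate.
def slow_sum(n)
  (1..n).inject(0) { |acc, i| acc + i }
end

# Hypothetical fast path: closed-form Gauss formula.
def fast_sum(n)
  n * (n + 1) / 2
end

# Benchmark.bm prints user/system/total/real columns,
# the same format as the results further down.
Benchmark.bm(10) do |x|
  x.report("inject:")  { 10_000.times { slow_sum(1_000) } }
  x.report("formula:") { 10_000.times { fast_sum(1_000) } }
end
```

Run that, look at the numbers, and now you have something to argue with.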
you're a scientist, dammit
Computer SCIENCE is called science for a reason, yo. Be a scientist. Don't just argue about stuff on blogs. Measure things. Then report back.
this test sucks
This probably isn't even a good test. I don't care. Tell me how it sucks, and let's figure it out. Having actual measurements beats complaining about shit on Twitter any day.
just css branch
This test adds Basecamp Next's JS file.
To run it:
$ bundle
$ rspec
What I get:
With 1000 pages:
$ rspec
                     user     system      total        real
no turbolinks   15.300000   1.700000  17.270000 (433.880904)
yes turbolinks  10.540000   0.890000  11.430000 (170.545663)
With 100 pages:
$ rspec
                     user     system      total        real
no turbolinks    1.710000   0.210000   2.190000 ( 47.051954)
yes turbolinks   1.100000   0.090000   1.190000 ( 16.778509)
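For scale, the real (wall-clock) column in the 1000-page run works out to roughly a 2.5x speedup with turbolinks — trivial to check:

```ruby
# Wall-clock ("real") seconds from the 1000-page run above.
no_turbolinks  = 433.880904
yes_turbolinks = 170.545663

speedup = no_turbolinks / yes_turbolinks
puts format("%.2fx faster with turbolinks", speedup)
# => 2.54x faster with turbolinks
```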