
Speed-testing #574

Merged: 4 commits into development, Jul 8, 2014

Conversation

@squidgetx (Contributor)

We should do some serious performance profiling. While I don't have the evidence on hand, I would bet one of my kidneys that we have massive N+1 problems. (Actually, now that I think back on it, I'm pretty sure I'm responsible for at least some of the N+1 code...) So, at some juncture, we should:

  • use Rack Mini Profiler and YSlow to identify the source of slowness;
  • use Bullet to reduce the number of N+1 queries;
  • perform other optimizations that are, at this point, definitely not premature anymore.

Speed has so far been the chief complaint of almost every MT I've talked to, so there's a case for making this top priority -- but it should probably wait for the next milestone, i.e., after we finalize full testing coverage.
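For concreteness, here's a minimal sketch of what wiring the gems in might look like, assuming a standard Rails setup (the gem names are real; the config values are just suggested defaults):

```ruby
# Gemfile -- development group only, so production stays untouched
group :development do
  gem 'rack-mini-profiler' # per-request timing badge rendered in the browser
  gem 'bullet'             # flags N+1 queries and unused eager loads
end

# config/environments/development.rb
config.after_initialize do
  Bullet.enable       = true
  Bullet.rails_logger = true # write N+1 warnings to the Rails log
  Bullet.add_footer   = true # also surface them in the rendered page
end
```

Bullet usually points straight at the spots where an eager load fixes the N+1, e.g. `Reservation.includes(:equipment_model)` instead of touching the association once per record in a loop (model names here are illustrative).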

@shippy mentioned this pull request Jun 26, 2014
@squidgetx added this to the 3.4.0 milestone Jun 27, 2014
@mnquintana (Contributor)

In the spirit of automating stuff, do you think it might be worthwhile to write some speed tests (as in, not standalone benchmarks, but tests added to our suite)? E.g., a test should fail if the catalog takes longer than 5 seconds to render, or something like that?
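Rough sketch of what I mean, assuming RSpec -- the route and the 5-second budget are placeholders:

```ruby
# spec/requests/catalog_speed_spec.rb -- hypothetical threshold test
require 'rails_helper' # or spec_helper, depending on our RSpec version
require 'benchmark'

RSpec.describe 'Catalog rendering speed', type: :request do
  it 'renders the catalog in under 5 seconds' do
    elapsed = Benchmark.realtime { get '/catalog' } # hypothetical route
    expect(response.status).to eq(200)
    expect(elapsed).to be < 5.0 # fail the suite when rendering is too slow
  end
end
```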

@squidgetx (Contributor)

Let's just use this PR to get the profiling gems into our workflow. Testing speed sounds like a good idea, but I think it would depend on a) our ability to write integration tests and b) the power of the machine running them.

@dgoerger (Contributor) commented Jul 8, 2014

I believe the usual method for benchmarking is to always run the tests on the same machine. My setup is ridiculously fast; the production server at Science Park is, by comparison, absurdly slow; @squidgetx's machine is somewhere in between. We would need to pick a consistent environment (hardware and software) for benchmarks to be remotely meaningful between version numbers, or even between benchmarks of the same point release.

@mnquintana (Contributor)

That makes a lot of sense. Maybe it would be valuable to create an environment that mirrors (as best we can) the resources of the Science Park VM our production instances run on?

@dgoerger (Contributor) commented Jul 8, 2014

I'll ask Mike about the VM specs when I call about installing Ruby 2.1.1 on the server for v3.3.x. We can probably emulate it with VirtualBox, if not import a copy of the VM directly, and store it in the STC Box folder -- or a Dev Box folder, since we're leaving STC.
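Once we have the real numbers, we could pin the resources in a Vagrantfile (Vagrant drives VirtualBox; it's my suggestion here, not something we use yet). Every value below is a stand-in until Mike confirms the specs:

```ruby
# Vagrantfile -- hypothetical sketch; box, CPU, and RAM are placeholders
Vagrant.configure('2') do |config|
  config.vm.box = 'ubuntu/trusty64' # assumed base box
  config.vm.provider 'virtualbox' do |vb|
    vb.cpus   = 1    # placeholder until we know the real core count
    vb.memory = 1024 # placeholder RAM, in MB
  end
end
```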

squidgetx added a commit that referenced this pull request Jul 8, 2014
@squidgetx merged commit cbdc7ce into development Jul 8, 2014
@orenyk (Contributor) commented Jul 8, 2014

That makes sense to me. We have the development instance, although we're already using it for user testing / feedback purposes. Can we get another instance up for benchmarking purposes?

@dgoerger don't we also need to be on the same hardware to appropriately test speed?

@squidgetx deleted the 574_speed_testing branch on Jul 11, 2014