
TestSwarm integration / add a way to run many benchmarks #54

Closed
sindresorhus opened this Issue · 10 comments

3 participants

@sindresorhus

Would be awesome if jsperf could somehow use TestSwarm to gather performance results. I could just leave the browser open and automagically provide people with performance results just like I do with the jQuery test suite. This would make jsperf even more useful, since you would get a lot more performance results.

@jdalton
Collaborator

Maybe a checkbox to "review for TestSwarm".
I wouldn't want every test as some are poor or try to freeze the browser or contain errors.

@sindresorhus

@jdalton Good point. Maybe some kind of voting process to accept perfs in review? This sounds like a big job doing it manually.

@jdalton
Collaborator

Or maybe if you already have a TestSwarm account/credentials we could find a way to hook into that. That way it shifts the responsibility to the dev.

@sindresorhus

How would that help? I have a TestSwarm account, but I could still submit an awful test.

@jdalton
Collaborator

Ya but then it's you effing up TestSwarm (which you could do w/o jsPerf) and not jsPerf effing up TestSwarm.
So it shifts the accountability to the dev and not jsPerf ;D

@sindresorhus

Ah, right. But the question is: would devs bother to register on TestSwarm?

Alternatively, jsperf could host its own version of TestSwarm, so as not to eff up other projects' use of TestSwarm. It's open source after all.

@jdalton
Collaborator

PerfSwarm FTW!

@mathiasbynens

We don’t really need TestSwarm; we already use Browserscope to store the results.

We could code up a fairly simple page that contains a JavaScript array of tests to run. Then, we’d run the first test in an iframe, and when it’s finished and the results have been submitted to Browserscope (at worst we could use an arbitrary timeout for this) we move on to the next test, and so on.
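The sequencing described above could be sketched roughly as follows. The iframe and Browserscope plumbing are out of scope here, so each test is modelled as an async function, and the "arbitrary timeout" serves as the worst-case fallback; `TIMEOUT_MS` and `runOne` are hypothetical names, not part of jsPerf.

```javascript
// Hypothetical worst-case fallback, per the "arbitrary timeout" idea above.
const TIMEOUT_MS = 30000;

function withTimeout(promise, ms) {
  let timer;
  const fallback = new Promise(resolve => {
    timer = setTimeout(() => resolve('timeout'), ms);
  });
  // Whichever settles first wins; always clear the timer so nothing lingers.
  return Promise.race([promise, fallback]).finally(() => clearTimeout(timer));
}

async function runQueue(tests, runOne) {
  const results = [];
  for (const test of tests) {
    // One test at a time: wait for it to finish (or time out) before the next.
    results.push(await withTimeout(runOne(test), TIMEOUT_MS));
  }
  return results;
}
```

Running tests strictly one at a time is the point: it keeps a slow or hung benchmark from overlapping with (and skewing) the next one.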

There are some problems with this idea:

  • It has the potential to use up a lot of bandwidth.
  • I’m afraid people will open all their browsers, point them to this PerfSwarm page, and leave them running the tests. This is not a good idea since tests that are running in another browser may lock up the CPU and influence the results of other tests. Modern browsers clamp setTimeout and similar to ~1000ms for inactive windows/tabs, which may also influence the results a bit (even though Benchmark.js attempts to work around it).

I’d be happy to implement this if we can find a good solution for these issues.

This is a duplicate of #26. Closing #26 as this ticket has more interesting comments.

@sindresorhus

> We don’t really need TestSwarm; we already use Browserscope to store the results.

Good point. Though the reason TestSwarm is a good idea is that there are already a lot of users contributing browser time. Then again, there are a lot of jsperf users too, so I don't think it would be a big problem to get people to donate some unused browser time.

> It has the potential to use up a lot of bandwidth.

How is the current bandwidth usage? Are you paying for the server yourself?
Get a sponsor?

> I’m afraid people will open all their browsers, point them to this PerfSwarm page, and leave them running the tests. This is not a good idea since tests that are running in another browser may lock up the CPU and influence the results of other tests.

Maybe check the IP and user-agent and make sure only one test is running per IP at a time, and if there are multiple browsers on the same IP, rotate between them?
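A server-side sketch of that rotation idea, assuming the server keys clients by IP and user-agent string. `SwarmScheduler`, `register`, and `nextAgent` are all hypothetical names for illustration; nothing like this exists in jsPerf.

```javascript
// Track the browsers (user agents) seen from each IP and hand out test
// slots round-robin, so only one browser per IP runs at any moment.
class SwarmScheduler {
  constructor() {
    this.byIp = new Map(); // ip -> { agents: [...], next: rotation index }
  }

  // A browser from `ip` announces itself as available.
  register(ip, userAgent) {
    if (!this.byIp.has(ip)) this.byIp.set(ip, { agents: [], next: 0 });
    const entry = this.byIp.get(ip);
    if (!entry.agents.includes(userAgent)) entry.agents.push(userAgent);
  }

  // Which browser at this IP gets the next test run?
  nextAgent(ip) {
    const entry = this.byIp.get(ip);
    if (!entry || entry.agents.length === 0) return null;
    const agent = entry.agents[entry.next % entry.agents.length];
    entry.next++;
    return agent;
  }
}
```

With two browsers registered from one IP, successive calls to `nextAgent` alternate between them, so neither sits idle while the other hogs the CPU.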

> Modern browsers clamp setTimeout and similar to ~1000ms for inactive windows/tabs, which may also influence the results a bit (even though Benchmark.js attempts to work around it).

Page Visibility FTW!
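The Page Visibility API alluded to here exposes `document.hidden` and a `visibilitychange` event, which a runner could use to pause benchmarks while the tab is hidden instead of letting clamped timers skew results. A minimal sketch; the `onHidden`/`onVisible` hooks are hypothetical, and the document is taken as a parameter so the wiring is easy to exercise outside a browser.

```javascript
// Pause/resume callbacks driven by the Page Visibility API.
function watchVisibility(doc, { onHidden, onVisible }) {
  const update = () => (doc.hidden ? onHidden() : onVisible());
  doc.addEventListener('visibilitychange', update);
  update(); // apply the current state immediately
  return update;
}
```

In a real page this would be called as `watchVisibility(document, { onHidden: pauseBenchmarks, onVisible: resumeBenchmarks })`, where the two hooks are whatever the runner uses to suspend and restart the queue.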

@jdalton
Collaborator

I think we are going to punt on relying on additional 3rd parties.

@jdalton jdalton closed this