
Run the page load performance test in CI #12988

Closed

shinglyu opened this issue Aug 23, 2016 · 10 comments · Fixed by #19323

Comments

@shinglyu
Contributor

@larsbergstrom @edunham @aneeshusa

My server's hard drive got corrupted this weekend, so I feel we should start running etc/ci/performance in CI rather than on my computer.

Here are some concerns I would like to hear your opinions on:

  1. The performance test needs a dedicated environment; if other tests are running at the same time, the measurements won't be accurate.
  2. We probably don't need to run the performance test on every commit; a nightly build will be sufficient.
  3. We also need to run Firefox for comparison, which might require a one-time manual setup.
  4. Is it possible for people to submit performance test requests to our CI and compare the results using Perfherder's compare view? I think I can handle the Treeherder API (a rough sketch of one way to submit results follows this list), but can our CI handle the workload?
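
For item 4, one low-friction path (hedged; the exact payload schema should be checked against the current Perfherder documentation) is to have the CI job print a `PERFHERDER_DATA:` line in its log, which Perfherder's log parser ingests once the job reports to Treeherder. A minimal sketch, assuming the harness yields a list of (page, load time in ms) pairs:

```python
import json

# Hypothetical output of the page-load harness: (test page, load time in ms).
results = [("about:blank", 120.0), ("servo.org", 850.0)]

perf_data = {
    "framework": {"name": "servo-perf"},  # framework name is an assumption
    "suites": [{
        "name": "page-load",
        # Suite summary value: mean load time across all pages.
        "value": sum(t for _, t in results) / len(results),
        "subtests": [{"name": page, "value": t} for page, t in results],
    }],
}

# Perfherder picks up the JSON printed after the PERFHERDER_DATA: marker in the job log.
print("PERFHERDER_DATA: " + json.dumps(perf_data))
```
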
@shinglyu
Contributor Author

FYI, it only runs on Linux right now.

@paulrouget
Contributor

  FYI, it only runs on Linux right now.

On Mac, GNU date and Python 3 are required, and then some Python code fails for some reason.

@highfive

cc @aneeshusa

@metajack
Contributor

This sounds perfect for a nightly job that spins up a dedicated EC2 instance and shuts it down afterward. If the test takes less than an hour, this will cost us about $1-2/day and should have predictable performance.
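
A minimal sketch of such a wrapper, assuming a pre-provisioned instance reserved for perf runs (the instance ID and region are placeholders, and the dispatch of the test onto the instance is elided):

```python
import boto3

INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical dedicated perf-test instance

ec2 = boto3.client("ec2", region_name="us-west-2")

def run_nightly_perf_job():
    # Bring the stopped instance up only for the duration of the run,
    # so the cost stays at roughly one instance-hour per night.
    ec2.start_instances(InstanceIds=[INSTANCE_ID])
    ec2.get_waiter("instance_running").wait(InstanceIds=[INSTANCE_ID])
    try:
        pass  # ssh in and run ./mach test-perf here (elided)
    finally:
        # Always stop the instance again, even if the test run fails.
        ec2.stop_instances(InstanceIds=[INSTANCE_ID])

if __name__ == "__main__":
    run_nightly_perf_job()
```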

@shinglyu
Contributor Author

Looks like the test has been busted on my server for a few days. Since the ./mach test-perf command is ready, I might take some time to move that to CI.
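
A nightly CI entry point could wrap that command so a wedged run cannot hold the slot indefinitely; a minimal sketch (the two-hour timeout is an assumption, not a measured bound):

```python
import subprocess
import sys

# Run the in-tree harness via the command referenced above.
try:
    completed = subprocess.run(["./mach", "test-perf"], timeout=2 * 60 * 60)
except subprocess.TimeoutExpired:
    print("test-perf timed out", file=sys.stderr)
    sys.exit(1)

# Propagate the harness result so the CI job turns red on failure.
sys.exit(completed.returncode)
```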

@edunham
Contributor

edunham commented Oct 13, 2016

  1. The performance test needs a dedicated environment; if other tests are running at the same time, the measurements won't be accurate.

My opinion is that maximum consistency between builds would be achieved by a box sitting under someone's desk in a Moz office. I wouldn't mind sticking a desktop on my desk in PDX, as long as the occasional downtime is tolerable (which it should be for this use case). A dedicated AWS instance might be just as good, but I'd like to verify that before committing to it (see below).

  2. We probably don't need to run the performance test on every commit; a nightly build will be sufficient.

SGTM; the backend behind perf.rust-lang.org builds results every 4 hours or so on a cron job and seems to serve its purpose fine.

  3. We also need to run Firefox for comparison, which might require a one-time manual setup.

As long as we document the manual setup sufficiently that people without permissions on our systems could duplicate it if they want to, that's fine by me. If the setup is super fussy, I'll need either direct assistance or good docs to get it sorted out in a timely manner.

  4. Is it possible for people to submit performance test requests to our CI and compare the results using Perfherder's compare view? I think I can handle the Treeherder API, but can our CI handle the workload?

We could abuse Homu's queueing and prioritizing skills for this by PR-ing the changes people want tested to a special repo. As long as we serialize the jobs on a single host, any congestion will hurt the users instead of hurting test accuracy.
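
That serialization can also be enforced at the harness level; a minimal sketch using an advisory file lock (Linux-only, which matches the current constraint; the lock path is a placeholder), so overlapping requests queue up instead of running concurrently:

```python
import fcntl
import subprocess

LOCK_PATH = "/var/lock/servo-perf.lock"  # hypothetical lock file location

def run_serialized(cmd):
    # flock blocks until the previous perf run releases the lock, so congestion
    # shows up as queueing delay for users rather than as noisy measurements.
    with open(LOCK_PATH, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)
        return subprocess.run(cmd).returncode

if __name__ == "__main__":
    raise SystemExit(run_serialized(["./mach", "test-perf"]))
```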

  This sounds perfect for a nightly job that spins up a dedicated EC2 instance and shuts it down afterward. If the test takes less than an hour, this will cost us about $1-2/day and should have predictable performance.

@metajack, do you know of any sources about build time reproducibility on dedicated AWS instances? If we go this route, I'd like to run the exact same build over and over on the dedicated instance for a day or two, then check that the times were within an acceptable range of each other.
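
That check could itself be scripted; a rough sketch (the repetition count and the 5% spread threshold are assumptions, and the same approach works for timing `./mach build -d` instead):

```python
import statistics
import subprocess
import time

RUNS = 20           # assumed number of repeat runs spread over a day or two
MAX_SPREAD = 0.05   # assumed acceptable relative standard deviation (5%)

durations = []
for _ in range(RUNS):
    start = time.monotonic()
    subprocess.run(["./mach", "test-perf"], check=True)
    durations.append(time.monotonic() - start)

spread = statistics.stdev(durations) / statistics.mean(durations)
print(f"mean={statistics.mean(durations):.1f}s relative stdev={spread:.1%}")
if spread > MAX_SPREAD:
    raise SystemExit("timing spread too large to trust this host for perf runs")
```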

@shinglyu
Contributor Author

shinglyu commented Nov 9, 2016

With #14147 I'm close to fixing this, but Gecko testing is not working; I'll file a separate bug for that.

@edunham
Contributor

edunham commented Jul 21, 2017

@shinglyu when you say "close to fixing this" do you mean that this entire issue will be resolved once #13430 is completed?

@shinglyu
Contributor Author

I don't think people are focusing on this right now. Enabling this will require extra manpower to monitor and maintain the CI (and an AWS budget), so I'd suggest we delay it until we really need to focus on performance tuning for Servo itself.

@nox
Contributor

nox commented Oct 4, 2017

How far are we from getting this?

bors-servo pushed a commit that referenced this issue Nov 27, 2017
Run test-perf on linux-nightly.

Run test-perf on linux-nightly.

---
- [X] `./mach build -d` does not report any errors
- [X] `./mach test-tidy` does not report any errors
- [X] These changes fix #12988
- [X] These changes do not require tests because this is test infrastructure