This repository has been archived by the owner. It is now read-only.

Get rid of get_mozharness job #745

Closed
whimboo opened this Issue Feb 9, 2016 · 16 comments

@whimboo
Contributor

whimboo commented Feb 9, 2016

The past has shown that we might not want to keep this job, given the constant updates of the mozharness revision. When we run our jobs in TaskCluster, they will always use the latest mozharness version of the given branch, which is something we currently do not do. By removing the mozharness job and making the necessary calls in each job instead, we would benefit from the following:

  • no more manual updates of the mozharness revision - we are always using the latest version
  • Jobs can use mozharness code from the appropriate branch and not only from mozilla-central

@mjzffr and @sydvicious what do you think?

@mjzffr


mjzffr commented Feb 9, 2016

I agree. Respecting branches is good. Matching the behaviour in buildbot/taskcluster is good.

@sydvicious


Contributor

sydvicious commented Feb 9, 2016

The revision used for mozharness needs to match the revision used for Firefox and the revision of the tests.zip package. On pf-jenkins, I added another job that gets mozharness for each branch, and then set up a dependency chain between the job that downloads Firefox, the job that downloads the tests.zip file, the job that downloads mozharness, and the jobs that run the tests.

@whimboo


Contributor

whimboo commented Feb 10, 2016

This is not something we have to worry about. The job in mozmill-ci will always have a branch and a revision parameter, so those would be used to determine the version of mozharness. There might be a problem with Tinderbox builds (revisions), which archiver might not support, but the package lies beside the build in the same folder:

http://ftp.mozilla.org/pub/firefox/tinderbox-builds/mozilla-central-linux64/1454814652/mozharness.zip

Maybe it would be good to check how Taskcluster and Buildbot are handling this before changing anything in mozmill-ci.
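Since the package sits right beside the build, a tinderbox fallback only needs the build folder URL. A minimal sketch of that idea (the helper names and layout here are hypothetical, not mozmill-ci code):

```python
import io
import zipfile
from urllib.request import urlopen


def mozharness_url(build_folder_url):
    # The mozharness.zip package is published beside the build in the
    # same tinderbox folder, so only the folder URL is needed.
    return build_folder_url.rstrip("/") + "/mozharness.zip"


def fetch_mozharness(build_folder_url, dest="mozharness"):
    # Hypothetical helper: download and unpack the archive for this build.
    with urlopen(mozharness_url(build_folder_url)) as resp:
        zipfile.ZipFile(io.BytesIO(resp.read())).extractall(dest)
    return dest
```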

@sydvicious


Contributor

sydvicious commented Feb 10, 2016

If you have the revision, you can always just use hg to check out the mozmill directory.

@whimboo


Contributor

whimboo commented Feb 10, 2016

No, we don't want to get mozmill! ;) Anyway, there is no way to do a partial checkout of the appropriate branch via hg. We might want to enhance the archiver to also make packages available for tinderbox builds.

@mjzffr


mjzffr commented Feb 10, 2016

Re taskcluster: my understanding is that after the source tree is checked out on the builder, mozharness gets packaged (https://dxr.mozilla.org/mozilla-central/source/toolkit/mozapps/installer/packager.mk#59) and made available as a build-task artifact "mozharness.zip" (https://dxr.mozilla.org/mozilla-central/source/testing/taskcluster/scripts/builder/build-mulet-linux.sh#39). Then test tasks can specify where to obtain the mozharness package from the build task via MOZHARNESS_URL (https://dxr.mozilla.org/mozilla-central/source/testing/taskcluster/tasks/test.yml#22).
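Following those links, the consuming side of a test task could be sketched roughly as below. Only the MOZHARNESS_URL variable name comes from test.yml; the helper functions are assumptions of mine, not the real task scripts:

```python
import io
import os
import zipfile
from urllib.request import urlopen


def mozharness_artifact_url(env):
    # test.yml passes the build task's artifact location through the
    # MOZHARNESS_URL environment variable; fail loudly if it is missing.
    url = env.get("MOZHARNESS_URL")
    if not url:
        raise RuntimeError("MOZHARNESS_URL not provided by the build task")
    return url


def install_mozharness(dest="mozharness"):
    # Hypothetical helper: fetch and unpack the build task's mozharness.zip.
    url = mozharness_artifact_url(os.environ)
    with urlopen(url) as resp:
        zipfile.ZipFile(io.BytesIO(resp.read())).extractall(dest)
```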

@whimboo


Contributor

whimboo commented Feb 10, 2016

Ok, so for mozmill-ci this might be a missing feature of archiver then. Thanks Maja! I will have a look at this in the next weeks, though I might not want to do it before all of our jobs make use of test packages.

@whimboo


Contributor

whimboo commented Mar 14, 2016

We had total breakage of our tests again in the last days because of changes I made on mozilla-central. I did not bump the revision of mozharness in mozmill-ci, so no tests were found for our test runs against Nightly builds.

On the other hand, I'm happy that I did not push the mozharness revision update, because those changes are backward incompatible and would have caused bustage on all other branches as well. That means we definitely need a branch-specific mozharness checkout!

@whimboo whimboo added the must-have label Mar 14, 2016

@whimboo whimboo self-assigned this Mar 14, 2016

@whimboo


Contributor

whimboo commented Mar 14, 2016

So we cannot have the fetch of the archiver client as part of each individual job, given that wget is not available on Windows. On the other hand, we also do not have to download the archiver client each time, given that it changes rarely. So I would suggest we make it a step in the scripts job, which will update the archiver client whenever it runs. The individual test jobs can then simply call the archiver Python script, which will fetch the corresponding mozharness archive.

It would be nice if there were a way to cache artifacts for a parameterized job so that other jobs use those artifacts until one parameter value changes. Then we would only have to download the mozharness archive once for builds of the same revision and repository. Sadly I can't find a way to do that, so I will implement the first part of this comment.
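The per-job step described above could look roughly like this. A sketch only: the client path and flag names below are assumptions, not the actual archiver client interface:

```python
import subprocess


def archiver_command(branch, revision, destination="mozharness"):
    # Build the command line for the archiver client that the scripts job
    # keeps up to date; the script path and flags are hypothetical.
    return [
        "python", "external/archiver_client.py", "mozharness",
        "--repo", branch,
        "--rev", revision,
        "--destination", destination,
    ]


def get_mozharness(branch, revision, destination="mozharness"):
    # Each test job would run this step itself instead of depending on a
    # separate get_mozharness job, so the checkout always matches the
    # job's branch and revision parameters.
    subprocess.check_call(archiver_command(branch, revision, destination))
```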

@sydvicious


Contributor

sydvicious commented Mar 14, 2016

Couldn't we either make get_mozharness parameterized, or have a get_mozharness job for nightly, aurora, etc? That's what I would do in pf-jenkins.

@whimboo


Contributor

whimboo commented Mar 14, 2016

I don't want to have this job run only on the master because it would be a bottleneck. Also, all slaves would have to copy the artifacts, which takes a couple of seconds due to re-compressing mozharness. Fetching the zip archive and unpacking it on the slaves is way faster.

@sydvicious


Contributor

sydvicious commented Mar 14, 2016

Makes sense. I developed my methodology remotely over a slow pipe, so having the master fetch everything and the jobs copy from the master was much faster. mozmill-ci has a fast pipe for all builders, so this should be fine.

@whimboo


Contributor

whimboo commented Mar 14, 2016

I would do the same if the mozharness archive were larger. But it's really only a few kB, so it's not worth any kind of caching on the master. Also keep in mind that we run a much higher number of jobs due to releases and all the locales.

whimboo added a commit to whimboo/mozmill-ci that referenced this issue Mar 14, 2016

@whimboo


Contributor

whimboo commented Mar 14, 2016

PR #768 landed on master and I will push it to staging and production ASAP.

@whimboo


Contributor

whimboo commented Mar 14, 2016

Somehow this merge is marked as broken by Travis:
https://travis-ci.org/mozilla/mozmill-ci/builds/115903987

Not sure how this change came in:

-      <colorMapName>xterm</colorMapName>
+      <colorMapName></colorMapName>

I definitely didn't change it, and I also cannot reproduce this failure locally. Retriggering the Travis build fixed it.

@whimboo


Contributor

whimboo commented Mar 14, 2016

This change is now active on staging and production.
