This repository has been archived by the owner on Jan 19, 2022. It is now read-only.

Bug 1154795 - add an endpoint to relengapi that creates and returns a Mozharness archive via S3 #279

Merged
10 commits merged on Jul 1, 2015

Conversation

lundjordan (Contributor)

This consolidates my relengapi repo into the now-unified build-relengapi and addresses the feedback comments regarding:

  • hardcoded repo and subdir: now supports any hg repo, branch, and subdir configuration
  • hardcoded regions: regions are in the cfg
  • blueprints no longer follow the skeleton repo style of blueprints/name_of_blueprint.py and instead use blueprints/name_of_blueprint/{__init__.py, etc.}
  • my settings.py cfg uses https, not http
  • I still end up downloading the src archive to disk, but into a tempfile

From the feedback, two things are not implemented:

  • I did not end up using celery subtasks, as I felt they may be overkill for this situation
  • I also did not use celery's retry: in the event of a FAILURE, the current logic means that if the requested s3 key does not exist, the task to download and upload it will simply be re-created anyway, so supporting retry didn't seem overly important
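The dispatch behavior described in that last bullet can be sketched roughly as follows. This is an illustrative sketch only: `handle_archive_request`, `key_exists`, `create_archive_task`, and `signed_url_for` are hypothetical names, not relengapi's actual API.

```python
# Hypothetical sketch of the archiver endpoint's dispatch logic: if the
# requested S3 key exists, redirect to it; if it is missing, (re)create
# the download-and-upload task and return 202 so the client can poll.

def handle_archive_request(key, key_exists, create_archive_task, signed_url_for):
    """Return (status_code, body) for an archive request."""
    if key_exists(key):
        # Key already cached in S3: send the client straight to it.
        return 302, signed_url_for(key)
    # Key missing: kick off (or re-kick) the task that builds and
    # uploads the archive, then tell the client to check back later.
    task_id = create_archive_task(key)
    return 202, {"task_id": task_id}
```

Because a missing key always re-creates the task, a failed task is effectively retried on the client's next request, which is why Celery-level retry adds little here.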

@lundjordan (Contributor, Author)

hrm, guess I need to write some docs and tests first :)

will squash and update again

@lundjordan lundjordan closed this Jun 9, 2015
@lundjordan lundjordan reopened this Jun 16, 2015
@lundjordan (Contributor, Author)

hrm, validate (which runs coverage) passes for me locally. I'm not sure a -0.01% decrease in coverage should be held against me too harshly :)

The latest pull request contains the docs, tests, and pep8 fixes that were failing in the previous PR

@djmitche (Contributor)

True, but we do want to make sure that new code is 100% covered, among the other requirements in https://github.com/mozilla/build-relengapi/blob/master/relengapi/docs/development/contributing.rst

The rationale is that such testing is basically the only thing that could prevent a deploy of relengapi from breaking this service in production -- with continuous deployment, we won't have the luxury of manually clicking around on things that we feel our push may have broken. So if there's code not covered by tests, then that code could be silently broken by some unrelated refactor, and we wouldn't know until the trees closed.

In this case, it looks like task_status is untested, as is the retry handling for the celery task.
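As a hedged illustration of how a `task_status`-style helper could be covered without a live Celery broker, the result lookup can be faked. `task_status`, `FakeResult`, and the state strings below are stand-ins for this sketch, not the PR's actual code.

```python
# Sketch: fake the Celery AsyncResult lookup so the status helper can be
# unit-tested without a broker or back-end.

class FakeResult:
    """Minimal stand-in for a Celery result object."""
    def __init__(self, state, info=None):
        self.state = state
        self.info = info

def task_status(get_result, task_id):
    """Map a Celery-style result object to a JSON-ish status dict."""
    result = get_result(task_id)
    status = {"state": result.state}
    if result.state == "FAILURE":
        # Celery stores the exception in .info on failure.
        status["error"] = str(result.info)
    return status

def test_task_status_failure():
    status = task_status(lambda tid: FakeResult("FAILURE", "boom"), "abc")
    assert status == {"state": "FAILURE", "error": "boom"}
```

The same pattern extends to exercising a retry path: the fake can return "FAILURE" on the first call and "SUCCESS" on the next.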

@djmitche (Contributor)

Taking a deeper review look at this now..

For AWS credentials, each bucket should be limited to the AWS IAM role corresponding to the AWS credentials. Buckets in
the configuration are required to be pre-existing.

Finally, Archiver avails of Celery. You will need to provide a broker and back-end.

s/avails of/uses/

```python
def test_accepted_response_when_missing_s3_key(app, client):
    setup_buckets(app, cfg['SUBREPO_MOZHARNESS_CFG'])
    resp = client.get('/archiver/mozharness/9ebd530c5843?repo=mozilla-central&region=us-east-1')
    eq_(resp.status_code, 202, resp.status)
```

This hits hg too, doesn't it.. it just doesn't fail because the test doesn't poll the task status..
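One way to keep such a test from touching hg.mozilla.org is to patch the download helper. The names `fetch_hg_archive` and `build_archive` below are hypothetical; this is only a sketch of the mocking pattern, not the PR's code.

```python
from unittest import mock

def fetch_hg_archive(url):
    # Stand-in for the real network call; tests should never reach this.
    raise RuntimeError("test unexpectedly hit the network")

def build_archive(url):
    # Hypothetical task body: fetch the archive from hg and return it.
    return fetch_hg_archive(url)

def test_does_not_hit_hg():
    # Patch the module-level fetch so no HTTP request is made.
    with mock.patch(__name__ + ".fetch_hg_archive",
                    return_value=b"tarball-bytes") as fetched:
        assert build_archive("https://hg.mozilla.org/mozilla-central") == b"tarball-bytes"
    fetched.assert_called_once()
```

Patching at the module boundary also lets a test poll the task to completion without the task ever reaching the network.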

@djmitche (Contributor)

Sorry, that got really long. This is in good shape, and most of these comments are relatively minor changes. I think the only controversial idea is to just accept any old path, tack it onto https://hg.mozilla.org/, download it, cache it, and 302 to it.

I don't remember if we talked about how these files will be deleted from S3 -- a lifetime configuration on the S3 bucket? If that's the case, are you worried about the race condition where S3 deletes a key in between its existence being checked and the client following the 302 redirect?
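One hedged way to cope with that race: since a missing key just re-creates the task, a client can treat a 404 on the redirect target as a signal to re-request from the start. `fetch_with_retry`, `request_archive`, and `download` are illustrative names for this sketch, not anything in the PR.

```python
# Client-side sketch of the 202/302 protocol with a retry when S3's
# lifecycle rule deletes the key between the existence check and the GET.

def fetch_with_retry(request_archive, download, max_attempts=3):
    """Follow the 202/302 protocol, retrying if the redirect target vanishes."""
    for _ in range(max_attempts):
        status, location = request_archive()
        if status == 302:
            body = download(location)
            if body is not None:      # None models a 404 from S3
                return body
        # 202 (still building) or a vanished key: ask the API again,
        # which re-creates the task if the key is gone.
    raise RuntimeError("archive never became available")
```

The server side cannot fully close the race on its own, because the deletion can happen after the redirect is already sent; the client-side retry is what makes the window harmless.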

Commit messages from the subsequent updates:

  • supports subdir query arg and is more generic than mozharness. copying raw gzip response not working
  • fixes gzip response, rewrites tests, docs, and addresses pep errors
  • force response decode, update doc usage to reflect changes
```python
}

ARCHIVER_S3_BUCKETS = {
    'us-east-1', 'archiver-us-east-1',
```

Change , to :
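For clarity, the corrected snippet (with `:` instead of `,`, so the braces build a region-to-bucket dict rather than a set) would read:

```python
# Region name mapped to the pre-existing S3 bucket for that region.
ARCHIVER_S3_BUCKETS = {
    'us-east-1': 'archiver-us-east-1',
}
```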

@djmitche (Contributor)

Two one-character changes in the docs and this is good to go :)

lundjordan added a commit that referenced this pull request Jul 1, 2015
Bug 1154795 - add an endpoint to relengapi that creates and returns a Mozharness archive via S3
@lundjordan lundjordan merged commit 77beb1e into mozilla:master Jul 1, 2015