Publishing Public Suffix List as a DAFSA binary #373
Conversation
@leplatrem I think I have settled on a general flow of how the script will execute. There's more left to handle, but please comment if this execution pattern is fine:

```python
def publish_dafsa():  # Script entry point
    client = Client(server_url=SERVER, auth=CREDENTIALS)  # Create a new client
    latest_hash = get_latest_hash(COMMIT_HASH_URL)  # Fetch the latest hash of the PSL repository
    record = client.get_record(id=RECORD_ID, bucket=BUCKET_ID, collection=COLLECTION_ID)  # Fetch the record from the Kinto server
    if record_exists(record):  # Check if a valid record already exists
        if record["data"]["latest-commit-hash"] == latest_hash:  # Compare the stored and latest hashes
            return 0  # Nothing to do; the rest of the script need not execute
        else:
            make_dafsa_and_publish(client, latest_hash)  # Update the existing record: build the DAFSA and publish
    else:
        make_dafsa_and_publish(client, latest_hash)  # Create a new record: build the DAFSA and publish
```
Yes, it's good! It could even be simplified. The two
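If the simplification hinted at here refers to the two identical publish branches, they can be folded into a single condition. A minimal sketch, where `needs_update` is a hypothetical helper name not taken from the PR:

```python
def needs_update(record, latest_hash):
    """Return True when a new DAFSA should be built and published.

    Hypothetical helper: it collapses the two identical branches of
    the flow sketch above into one check -- publish when no valid
    record exists yet, or when the stored hash differs from the
    latest PSL commit hash.
    """
    if not record:  # no valid record yet
        return True
    return record.get("data", {}).get("latest-commit-hash") != latest_hash
```

The entry point then reduces to a single call site: `if needs_update(record, latest_hash): make_dafsa_and_publish(client, latest_hash)`.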
leplatrem
left a comment
It starts to look a lot better!
Some new comments :)
I ran the whole script for the first time, and it's working!!!
For now I'm struggling with the tests for the Kinto client. Since most of this code is not split into separate functions with explicit exports, I'm having to rewrite most of this logic again and test incrementally. Is that fine?
I don't think so. Based on what I see, you should not have to rewrite anything of what you have here; I believe you didn't start it on the right foot. The test code should not duplicate anything from the feature code, or as little as possible. You could have 2 main test suites:
You should probably take some time to understand how tests work in general. You can also read http://blog.mathieu-leplatre.info/your-tests-as-your-specs.html
Here is an example to test `prepare_dafsa()`:

```python
import tempfile
import unittest
from unittest import mock

from commands.publish_dafsa import prepare_dafsa


class TestPrepareDafsa(unittest.TestCase):
    def test_exception_is_raised_when_process_returns_non_zero(self):
        with tempfile.TemporaryDirectory() as tmp:
            with mock.patch("subprocess.Popen") as mocked:
                mocked.return_value.returncode = 1
                with self.assertRaises(Exception) as cm:
                    prepare_dafsa(tmp)
                self.assertIn("DAFSA Build Failed", str(cm.exception))
```
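For context, here is a minimal sketch of a `prepare_dafsa` that such a test could exercise. The command line and output filename are placeholders, not the PR's actual build invocation:

```python
import os
import subprocess


def prepare_dafsa(directory):
    """Run the DAFSA build step and fail loudly on a non-zero exit code.

    Placeholder sketch: "make_dafsa.py", "psl.dat" and "dafsa.bin"
    are assumed names, not taken from the real script.
    """
    output_path = os.path.join(directory, "dafsa.bin")
    process = subprocess.Popen(["make_dafsa.py", "psl.dat", output_path])
    process.communicate()  # wait for the build to finish
    if process.returncode != 0:
        raise Exception(f"DAFSA Build Failed: exit code {process.returncode}")
    return output_path
```

Raising on a non-zero return code is what lets the test above assert on the "DAFSA Build Failed" message instead of silently continuing with a missing file.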
And here is a more complex one that leverages `responses` to simulate server responses:

```python
import unittest
from unittest import mock

import responses

from commands.publish_dafsa import publish_dafsa, COMMIT_HASH_URL


class TestPublishDafsa(unittest.TestCase):
    def setUp(self):
        mocked = mock.patch("commands.publish_dafsa.prepare_dafsa")
        self.addCleanup(mocked.stop)
        self.mocked_prepare = mocked.start()

        mocked = mock.patch("commands.publish_dafsa.remote_settings_publish")
        self.addCleanup(mocked.stop)
        self.mocked_publish = mocked.start()

    @responses.activate
    def test_prepare_and_publish_are_not_called_when_hash_matches(self):
        event = {
            "server": "http://fakeserver/v1",
        }
        record_url = "http://fakeserver/v1/buckets/main-workspace/collections/public-suffix-list/records/tld-dafsa"
        responses.add(responses.GET, record_url, json={
            "data": {
                "commit-hash": "abc",
            }
        })
        responses.add(responses.GET, COMMIT_HASH_URL, json=[{
            "sha": "abc"
        }])

        publish_dafsa(event, context=None)

        self.assertFalse(self.mocked_prepare.called)
        self.assertFalse(self.mocked_publish.called)
```
I ran the lambda manually. There was no error, but the attachment was published with a size of 0. Something has to be improved here ;)
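A size-zero attachment slipping through silently is exactly the kind of failure a small guard can catch before publishing. A minimal sketch, where `check_attachment_size` is a hypothetical helper not present in the PR:

```python
import os


def check_attachment_size(binary_path):
    """Refuse to publish an empty attachment (hypothetical guard).

    Turns a silent zero-byte upload into a loud failure by checking
    the file size before the publish step runs.
    """
    size = os.path.getsize(binary_path)
    if size == 0:
        raise Exception(f"DAFSA binary is empty: {binary_path}")
    return size
```

Called at the top of a publish step, this would have made the empty-attachment run fail in the lambda logs instead of succeeding with a 0-byte record.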
Can you run it once again? The scripts are being pulled from a temporary GitHub repo that I just updated with the latest code.
Ok, I re-ran it and Ethan approved it on stage https://settings.stage.mozaws.net/v1/buckets/main/collections/public-suffix-list/records/tld-dafsa |
leplatrem
left a comment
That is kind of heavy going... I'm not sure how to help :/
Please ask questions when something is not clear! Experimenting is a great way to learn, and that's awesome! But before submitting your code, you should also take some time to give it a minimum of polish, in order to preserve the reviewer's patience ;)
I have refactored record fetching into a separate function and created tests for all of them, which are all passing currently!!!
leplatrem
left a comment
Great! 🌟
A last round of polishing and we're good to merge :)
I've addressed all the recent concerns you raised. I suppose we are almost ready to merge now?
leplatrem
left a comment
👏 👏
Well done! We're good to ship!
This pull request will track the progress of the `publish_dafsa.py` script, which uploads a binary file containing the data of the Public Suffix List to the Remote Settings server. More details about the project can be found here.
Tasks
Testing Strategy
- `def get_latest_hash(url):` — `HTTPError` is raised when the response status is a 404 code
- `def download_resources(directory, *urls):` — `HTTPError` is raised when the response status is a 404 code
- `def get_stored_hash(clients):` — `KintoException` is raised when record fetching fails
- `def prepare_dafsa(directory):`
- `def remote_settings_publish(client, latest_hash, binary_path):`
- `def publish_dafsa(event, context):` — `prepare_dafsa()` and `remote_settings_publish()` are not called when record hashes do match; `prepare_dafsa()` and `remote_settings_publish()` are called when record-hash and latest-hash do not match
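As a sketch of the first item: assuming `COMMIT_HASH_URL` points at a GitHub commits endpoint returning a JSON list whose first element carries the newest commit's `"sha"` (the shape used in the test fixtures above), the hash fetch can be as small as this. The real script may well use `requests` instead; stdlib `urllib` is used here only to keep the sketch dependency-free:

```python
import json
import urllib.request


def get_latest_hash(url):
    """Return the newest commit hash from a commits API endpoint.

    Sketch under assumptions: the endpoint returns a JSON list ordered
    newest-first, e.g. [{"sha": "abc"}, ...]. urlopen raises
    urllib.error.HTTPError by itself on a 404 response.
    """
    with urllib.request.urlopen(url) as response:
        commits = json.load(response)
    return commits[0]["sha"]
```

Because `urlopen` (like `requests`' `raise_for_status()`) raises `HTTPError` on a 404, the "HTTPError is raised" test case needs no extra error handling in the function itself.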