
Optimize collections sync #172

Merged
merged 1 commit into from
Aug 16, 2019

Conversation

@fao89 (Member) commented Aug 15, 2019

closes #5236

@fao89 fao89 requested a review from a team August 15, 2019 22:40
-expected = ANSIBLE_COLLECTION_FIXTURE_SUMMARY
-expected["ansible.collection_version"] = 2
-self.assertDictEqual(get_added_content_summary(repo), expected)
+self.assertDictEqual(get_added_content_summary(repo), ANSIBLE_COLLECTION_FIXTURE_SUMMARY)
Member

What about updating the usage of assertDictEqual? We could use another, slightly stricter assertion that removes the need for mock.ANY, for instance assertGreaterEqual(a, b). The end goal is to ensure that content was synced properly, but we do not control the URL being synced.
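A subset-style check along these lines could implement that suggestion. A minimal sketch, using hypothetical stand-in dicts for get_added_content_summary(repo) and ANSIBLE_COLLECTION_FIXTURE_SUMMARY:

```python
# Hypothetical stand-ins for get_added_content_summary(repo) and
# ANSIBLE_COLLECTION_FIXTURE_SUMMARY; the real values come from the sync.
actual = {"ansible.collection_version": 2, "ansible.role": 5}
expected = {"ansible.collection_version": 2}

# dict.items() views are set-like in Python 3, so "<=" checks that every
# expected key/value pair appears in the actual summary, with no mock.ANY.
is_superset = expected.items() <= actual.items()
```

Inside a TestCase this could be written as self.assertGreaterEqual(actual.items(), expected.items()), which passes the >= comparison through to the set-like views.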

@fao89 (Member Author)

sounds good to me

@bmbouter (Member)

@fabricio-aguiar I found the issue associated with this. Can you associate it with the commit and add a bugfix release note? https://pulp.plan.io/issues/5236

 # Concurrent downloads are limited by aiohttp...
 not_done = set(
-    remote.get_downloader(url=_get_url(page)).run() for page in range(2, page_count + 1)
+    remote.get_downloader(url=_get_url(page)).run() for page in range(1, page_count + 1)
 )
Member

This line could be broken up for readability.

@fao89 (Member Author)

Black requires it to be on the same line:

--- pulp_ansible/app/tasks/collections.py       2019-08-16 15:21:52.437710 +0000
+++ pulp_ansible/app/tasks/collections.py       2019-08-16 15:22:02.078186 +0000
@@ -258,12 +258,11 @@
             progress_bar.total = count
             progress_bar.save()
 
             # Concurrent downloads are limited by aiohttp...
             not_done = set(
-                remote.get_downloader(url=_get_url(page)).run()
-                for page in range(1, page_count + 1)
+                remote.get_downloader(url=_get_url(page)).run() for page in range(1, page_count + 1)
             )
 
             while not_done:
                 done, not_done = await asyncio.wait(not_done, return_when=asyncio.FIRST_COMPLETED)
                 for item in done:
would reformat pulp_ansible/app/tasks/collections.py
All done! 💥 💔 💥

Member

I agree. I was thinking of something like:

not_done = set()
for page in range(1, page_count + 1):
    downloader = remote.get_downloader(url=_get_url(page))
    not_done.add(downloader.run())
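The loop suggested above feeds the same asyncio.wait consumption pattern shown in the diff. A minimal runnable sketch, with fake_download as a hypothetical stand-in for remote.get_downloader(url=_get_url(page)).run():

```python
import asyncio


async def fake_download(page):
    # Hypothetical stand-in for remote.get_downloader(url=_get_url(page)).run()
    await asyncio.sleep(0)
    return page


async def sync_pages(page_count):
    # Build the set of in-flight downloads, one task per page
    not_done = set()
    for page in range(1, page_count + 1):
        not_done.add(asyncio.ensure_future(fake_download(page)))

    results = []
    # Process downloads as they complete, mirroring the loop in collections.py
    while not_done:
        done, not_done = await asyncio.wait(not_done, return_when=asyncio.FIRST_COMPLETED)
        for item in done:
            results.append(item.result())
    return sorted(results)


pages = asyncio.run(sync_pages(3))
```

Scheduling all pages up front and then draining with FIRST_COMPLETED lets each finished page be processed while the remaining downloads stay in flight.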

@fao89 (Member Author)

this is great, thanks @bmbouter

@bmbouter (Member) left a comment

This looks great! Thanks @fabricio-aguiar !

@bmbouter bmbouter merged commit 5f7b5f7 into pulp:master Aug 16, 2019