This repository was archived by the owner on Oct 21, 2022. It is now read-only.

update whitelist with opencensus extension packages #329

Merged
ylil93 merged 6 commits into GoogleCloudPlatform:master from ylil93:master on Jun 10, 2019

Conversation

@ylil93 (Contributor) commented Jun 7, 2019

No description provided.

@ylil93 ylil93 requested a review from liyanhui1228 June 7, 2019 23:16
'grpc-google-iam-v1',
'grpcio',
'opencensus',
'opencensus-correlation',
Member

This change adds 23 more packages to the whitelist, on top of the 59 we previously had, which looks like it will have a large impact on the total time it takes to complete a run. I'm not worried about the PyPI check jobs, as they take <10 mins to complete. But the GitHub check jobs will likely exceed the 30-min timeout if we do nothing but add those packages to the whitelist, as a run currently takes 15-20 mins.

One approach I can think of to reduce the total time added by the new packages is similar to this. Since all the newly added packages are opencensus extensions, which all live under the opencensus GitHub repo, it would save us a lot of time to git clone the opencensus repo just once: git won't actually clone only the sub-directory we specify, it clones the entire repo with all its revisions every time. It would also make sense to monitor the kokoro job status for some time to make sure this change works as expected.
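The clone-once idea above could be sketched like this (a hypothetical sketch: `CheckerSketch`, `_get_repo_directory`, and the cache layout are assumptions, not code from this repo):

```python
class CheckerSketch:
    """Hypothetical sketch of cloning each GitHub repo at most once.

    Twenty-four opencensus-ext-* packages share one upstream repo, so
    caching the checkout by URL turns 24 clones into 1 per run.
    """

    # Assumed URL; the real constant name in the repo may differ.
    OPENCENSUS_GITHUB_URL = (
        'https://github.com/census-instrumentation/opencensus-python.git')

    def __init__(self):
        self._repo_cache = {}  # repo URL -> local checkout directory

    def _clone_repo(self, url):
        # Stand-in for the real clone logic (subprocess + `git clone`).
        return '/tmp/checkout-' + url.rsplit('/', 1)[-1]

    def _get_repo_directory(self, url):
        # Clone each repository at most once per run.
        if url not in self._repo_cache:
            self._repo_cache[url] = self._clone_repo(url)
        return self._repo_cache[url]
```

With this shape, checking every opencensus extension package triggers a single clone instead of one clone per package.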

Contributor Author

That's a good point - is the bottleneck just the cloning, or is the actual testing resource-heavy as well?

Member

The testing resource (I assume you mean the GitHub repo?) will grow heavier over time, as git by default clones all the revisions of the repo, and that cost is multiplied by 24 since we have 24 packages living under the same opencensus repo. It's not that heavy for now, as the opencensus extension packages were created recently, but they will ultimately grow heavy once they accumulate a lot of revisions.
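One way to keep that growth bounded (a sketch, assuming the checker shells out to git; `shallow_clone_cmd` is a hypothetical helper, not part of this repo) is a shallow, single-branch clone:

```python
def shallow_clone_cmd(url, dest):
    # --depth 1 fetches only the tip commit of one branch, so the
    # checkout stays roughly constant in size as upstream history
    # grows; a plain `git clone` downloads every revision, which is
    # the cost multiplied across the 24 opencensus packages.
    return ['git', 'clone', '--depth', '1', '--single-branch', url, dest]
```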

Member

It would make sense to run a load test by deploying this to our test cloud project before actually deploying to the prod server. The get_compatibility_data.py script that the kokoro jobs run can also be run locally to record the total time it takes to complete a run.
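Recording the local run time could look like this (a sketch; `timed_run` is a hypothetical wrapper, not part of the repo):

```python
import subprocess
import sys
import time


def timed_run(cmd):
    """Run a command and return its wall-clock duration in seconds,
    e.g. timed_run([sys.executable, 'get_compatibility_data.py'])."""
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start
```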

Contributor Author

I will do a compat lib release before merging this PR, so that it will be easier to roll back if any resource issues come up.

'opencensus-ext-django',
'opencensus-ext-flask',
'opencensus-ext-gevent',
'opencensus-ext-google',
Member

I didn't find any package named this on PyPI.

@ylil93 (Contributor Author) commented Jun 10, 2019

This was a typo - it should have been opencensus-ext-google-cloud-clientlibs, which was already correct in the GitHub list. Fixed.

@ylil93 ylil93 merged commit 0de6ad1 into GoogleCloudPlatform:master Jun 10, 2019
client_repo_directory = self._clone_repo(
container, CLIENTLIBS_GITHUB_URL)
elif any(GITHUB_OPENCENSUS_PREFIX in pkg for pkg in self._packages):
client_repo_directory = self._clone_repo(
Member

This is problematic when self._packages contains both opencensus and google-cloud-python packages, as only a single client_repo_directory variable is used here. I'm not sure whether fixing this will fix the kokoro timeout issue, but it's definitely something we need to fix.
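A possible shape for the fix is one checkout directory per repository instead of a single shared variable (a hedged sketch: the prefix/URL values, OPENCENSUS_GITHUB_URL, and the Checker scaffolding are assumptions; only GITHUB_OPENCENSUS_PREFIX, CLIENTLIBS_GITHUB_URL, client_repo_directory, and self._packages appear in the diff):

```python
# Assumed values; the real constants live in the checker module.
GITHUB_CLIENTLIBS_PREFIX = 'GoogleCloudPlatform/google-cloud-python/'
GITHUB_OPENCENSUS_PREFIX = 'census-instrumentation/opencensus-python/'
CLIENTLIBS_GITHUB_URL = (
    'https://github.com/GoogleCloudPlatform/google-cloud-python.git')
OPENCENSUS_GITHUB_URL = (
    'https://github.com/census-instrumentation/opencensus-python.git')


class Checker:
    def __init__(self, packages):
        self._packages = packages

    def _clone_repo(self, container, url):
        # Stand-in for the real git clone; returns a fake checkout path.
        return '/tmp/' + url.rsplit('/', 1)[-1]

    def clone_needed_repos(self, container):
        # One checkout directory per repository, so a mixed package
        # list clones both repos instead of overwriting a single
        # client_repo_directory.
        repo_dirs = {}
        if any(GITHUB_CLIENTLIBS_PREFIX in p for p in self._packages):
            repo_dirs[GITHUB_CLIENTLIBS_PREFIX] = self._clone_repo(
                container, CLIENTLIBS_GITHUB_URL)
        if any(GITHUB_OPENCENSUS_PREFIX in p for p in self._packages):
            repo_dirs[GITHUB_OPENCENSUS_PREFIX] = self._clone_repo(
                container, OPENCENSUS_GITHUB_URL)
        return repo_dirs
```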

elif GITHUB_OPENCENSUS_PREFIX in pkg:
install_subdirectory = pkg.split(GITHUB_OPENCENSUS_PREFIX)[1]
install_names.append(
os.path.join(client_repo_directory, install_subdirectory))
Member

The same per-repository fix is needed here.
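The lookup here would then select the matching checkout per package (a hypothetical sketch; `install_name_for` and `repo_directories` are assumed names, and only GITHUB_OPENCENSUS_PREFIX and the subdirectory split come from the diff):

```python
import os

# Assumed prefix value; the real constant lives in the checker module.
GITHUB_OPENCENSUS_PREFIX = 'census-instrumentation/opencensus-python/'


def install_name_for(pkg, repo_directories):
    # Pick the checkout that matches this package's repository, rather
    # than a single shared client_repo_directory.
    if GITHUB_OPENCENSUS_PREFIX in pkg:
        install_subdirectory = pkg.split(GITHUB_OPENCENSUS_PREFIX)[1]
        return os.path.join(
            repo_directories[GITHUB_OPENCENSUS_PREFIX], install_subdirectory)
    return pkg  # plain PyPI packages install by name
```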

3 participants