update whitelist with opencensus extension packages #329
ylil93 merged 6 commits into GoogleCloudPlatform:master from
Conversation
'grpc-google-iam-v1',
'grpcio',
'opencensus',
'opencensus-correlation',
This change adds 23 more packages to the whitelist, on top of the 59 we had previously, so it will have a big impact on the total time a run takes. I'm not worried about the PyPI check jobs, as they take under 10 minutes to complete. But the GitHub check jobs will likely exceed the 30-minute timeout if we do nothing but add those packages to the whitelist, since a run currently takes 15-20 minutes.
One approach I can think of to reduce the time added by the new packages is similar to this. Since all the newly added packages are opencensus extensions that live under the opencensus GitHub repo, it would save us a lot of time to git clone the opencensus repo just once: GitHub won't clone only the sub-directory we specify, it clones the entire repo with all its revisions every time. It would also make sense to monitor the kokoro job status for a while to make sure this change works as expected.
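The clone-once idea above can be sketched roughly as follows. This is an illustrative sketch, not the repo's actual code: the repo URL, the `clone_repo` callback, and the assumption that each extension lives in a sub-directory named after its package are all mine.

```python
import os

# Assumed URL of the shared monorepo (for illustration only).
OPENCENSUS_REPO = "https://github.com/census-instrumentation/opencensus-python"

def plan_installs(packages, clone_repo):
    """Return local install paths, invoking clone_repo() at most once.

    clone_repo(url) is assumed to clone `url` and return the checkout
    directory; every extension package is then installed from its
    sub-directory of that single shared checkout, instead of each package
    triggering its own full clone of the same repo.
    """
    checkout = None
    paths = []
    for pkg in packages:
        if checkout is None:
            checkout = clone_repo(OPENCENSUS_REPO)  # happens once per run
        paths.append(os.path.join(checkout, pkg))
    return paths
```

With 24 extension packages in the whitelist, this would replace 24 full clones of the same repo with a single one.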
That's a good point - is the bottleneck just in the cloning, or is the actual testing resource-heavy as well?
The testing resource (I assume you mean the GitHub repo?) will grow heavier over time, as git by default clones all the revisions of the repo, and that cost is multiplied by 24 since we have 24 packages living in that same opencensus repo. It's not that heavy for now, because the opencensus extension packages were created recently, but they will ultimately grow heavy as revisions accumulate.
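Independent of cloning once, a shallow clone would keep the per-clone cost flat as the repo accumulates history. A minimal sketch (the helper name is mine, not from this PR):

```python
def shallow_clone_cmd(url, dest):
    """Build a `git clone` command that fetches only the latest revision.

    `--depth 1` tells git to create a shallow clone truncated to the most
    recent commit, so clone time no longer grows with the repo's history.
    """
    return ["git", "clone", "--depth", "1", url, dest]
```

The resulting command list can be passed to `subprocess.run` wherever the clone is currently executed.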
It would make sense to run a load test by deploying this to our cloud project before actually deploying to the prod server. The get_compatibility_data.py script that the kokoro jobs run can also be run locally to record the total time it takes to complete a run.
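Recording the wall-clock time of such a local run could look like this. This is a generic timing harness, not code from this repo:

```python
import subprocess
import time

def timed_run(cmd):
    """Run a command and report its exit code and wall-clock seconds."""
    start = time.monotonic()
    completed = subprocess.run(cmd)
    return completed.returncode, time.monotonic() - start
```

For example, `timed_run(["python", "get_compatibility_data.py"])` would give a baseline to compare against after the whitelist change lands.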
I will do a compat lib release before merging this PR so that it will be easier to roll back any resource issues.
'opencensus-ext-django',
'opencensus-ext-flask',
'opencensus-ext-gevent',
'opencensus-ext-google',
I didn't find any package named this on PyPI.
This was a typo; it should have been opencensus-ext-google-cloud-clientlibs, which was already correct for the GitHub entries - fixed this.
    client_repo_directory = self._clone_repo(
        container, CLIENTLIBS_GITHUB_URL)
elif any(GITHUB_OPENCENSUS_PREFIX in pkg for pkg in self._packages):
    client_repo_directory = self._clone_repo(
This is problematic when self._packages contains both opencensus and google_cloud_python packages: since a single client_repo_directory variable is used here and the branches are an if/elif, only one of the two repos will ever be cloned. I'm not sure whether fixing this will fix the kokoro timeout issue, but it's definitely something we need to fix.
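One possible shape for a fix is to keep a checkout per source repo, keyed by URL, instead of a single variable. This is only a sketch; the URLs, the prefix test, and the `clone_repo` callback signature are assumptions, not the repo's real code.

```python
def collect_checkouts(packages, clone_repo):
    """Return {repo_url: checkout_dir}, cloning each needed repo once.

    A run that mixes google-cloud-python and opencensus packages then gets
    both checkouts, rather than one overwriting (or shadowing) the other.
    """
    # Assumed repo URLs, for illustration only.
    CLIENTLIBS_GITHUB_URL = (
        "https://github.com/GoogleCloudPlatform/google-cloud-python")
    OPENCENSUS_GITHUB_URL = (
        "https://github.com/census-instrumentation/opencensus-python")
    checkouts = {}
    for pkg in packages:
        url = (OPENCENSUS_GITHUB_URL if pkg.startswith("opencensus")
               else CLIENTLIBS_GITHUB_URL)
        if url not in checkouts:
            checkouts[url] = clone_repo(url)
    return checkouts
```

The install step would then look up each package's repo in the dict instead of reading one shared client_repo_directory.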
elif GITHUB_OPENCENSUS_PREFIX in pkg:
    install_subdirectory = pkg.split(GITHUB_OPENCENSUS_PREFIX)[1]
    install_names.append(
        os.path.join(client_repo_directory, install_subdirectory))