
Add Python 3.9 to supported versions #11950

Closed
r-richmond wants to merge 8 commits from test-3.9 into master

Conversation

r-richmond
Contributor


Add python 3.9 to the test matrix and supported version list

Read the Pull Request Guidelines for more information.
In case of fundamental code change, Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in UPDATING.md.

@boring-cyborg bot added the area:dev-tools, provider:Apache and provider:google (Google, including GCP, related issues) labels on Oct 29, 2020
@github-actions

The Workflow run is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.

@mik-laj
Member

mik-laj commented Oct 29, 2020

We have limited resources on CI and I am concerned that adding a new version may make this problem even worse. If we want to support a new version of Python, we should drop another version from CI. On the other hand, versions 3.6, 3.7 and 3.8 are the most used versions and we should keep them well tested.

@r-richmond
Contributor Author

So currently this PR adds 3.9 to the test matrix, which increases it by 33%. If CI resources are a major concern, what if instead I modified this PR so that the test matrix only includes the earliest and latest versions (3.6 & 3.9)?

That would result in a 33% reduction of CI resources instead of a 33% increase, letting tests run faster for everyone.

Given Python's deprecation schedule, I believe it is a reasonable assumption that anything that runs on the min & max versions will run on the versions in between as well.

@potiuk
Member

potiuk commented Oct 30, 2020

Given Python's deprecation schedule, I believe it is a reasonable assumption that anything that runs on the min & max versions will run on the versions in between as well.

Unfortunately, this does not work. We've had a number of Python 3.6-only problems in the past where things were working on 3.5 and 3.7, mainly because of assumptions about dictionary ordering and similar (implicit) behaviours that developers assumed worked on all Python versions.
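
For illustration (a hypothetical snippet, not taken from the Airflow code base), dict ordering is exactly that kind of implicit behaviour - insertion order is only guaranteed by the language from Python 3.7 onwards, kept as an implementation detail in CPython 3.6, and not preserved on 3.5:

# Prints ['b', 'a', 'c'] on any 3.7+ interpreter; on 3.5 the order was arbitrary
python3 -c 'd = {"b": 2, "a": 1, "c": 3}; print(list(d))'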

However, this is not really needed. We've recently introduced a change where all PRs, by default, run only on the default values of the matrix - unless they change the core of Airflow, in which case they are automatically marked as "full tests needed" when approved by a committer, and only then is the full matrix of tests run. We are testing it right now (and it seems to work pretty well).

After the merge, the full set of tests will be run, so we are going to catch any incompatibilities "outside" the core as well. See #11828 (comment)

@potiuk
Member

potiuk commented Oct 30, 2020

We have limited resources on CI and I am concerned that adding a new version may make this problem even worse. If we want to support a new version of Python, we should drop another version from CI. On the other hand, versions 3.6, 3.7 and 3.8 are the most used versions and we should keep them well tested.

@mik-laj -> see the comment above. The change I added actually makes it perfectly possible to add a new version of Python here.

@kaxil
Member

kaxil commented Nov 4, 2020

Can you please rebase your PR on the latest master, since we have applied Black and PyUpgrade on master?

It will help if you squash your commits into a single commit first so that there are fewer conflicts.

@r-richmond
Contributor Author

Can you please rebase your PR on the latest master, since we have applied Black and PyUpgrade on master?
It will help if you squash your commits into a single commit first so that there are fewer conflicts.

Ok, rebased. Last time I believe it failed on the mermaid checks and I got lost going down the rabbit hole.

What is the easiest way to ensure that my PR passes all the static checks (black, flake8, mermaid, etc.)?

@github-actions

github-actions bot commented Nov 5, 2020

The Workflow run is cancelling this PR. It has some failed jobs matching ^Pylint$,^Static checks,^Build docs$,^Spell check docs$,^Backport packages$,^Provider packages,^Checks: Helm tests$,^Test OpenAPI*.

@potiuk
Member

potiuk commented Nov 5, 2020

Can you please rebase your PR on the latest master, since we have applied Black and PyUpgrade on master?
It will help if you squash your commits into a single commit first so that there are fewer conflicts.

Ok, rebased. Last time I believe it failed on the mermaid checks and I got lost going down the rabbit hole.

What is the easiest way to ensure that my PR passes all the static checks (black, flake8, mermaid, etc.)?

Just install pre-commit and run it with --all-files. See https://github.com/apache/airflow/blob/master/STATIC_CODE_CHECKS.rst

  • You can run all checks with pre-commit run --all-files -> this will take about 10-15 minutes, depending on the speed of your computer and network and on whether you have a recently built breeze image
  • You can run individual pre-commits with pre-commit run <pre-commit> --all-files
  • Once you install it with pre-commit install, the checks will run automatically at every commit, but only for the subset of files you changed
  • You can also use breeze. Breeze has the benefit of auto-complete, so you do not have to look up the ids of the pre-commit checks. You can run it as ./breeze static-checks <TAB> and you will see the ids of the pre-commit checks. You can also add extra options, for example breeze static-checks mypy -- --all-files will run the mypy checks for all files.
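
Put together, the workflow above is roughly (assuming pre-commit was installed via pip; the mypy hook id is just the example used above):

pip install pre-commit             # install the tool
pre-commit install                 # register the git hook so checks run at each commit
pre-commit run --all-files         # run every configured check against the whole repo
pre-commit run mypy --all-files    # run a single check against all files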

@r-richmond
Contributor Author

Ty for the tips. Looks like pre-commit got me through the mermaid errors however the last run failed with the following and I'm not sure where to go from there.

Run actions/upload-artifact@v2
With the provided path, there will be 1 file(s) uploaded
Total size of all the files uploaded is 55330 bytes
Error: read ECONNRESET

p.s. I added 3.9 to the supported versions for 2+ but did not add it to the 1.10 series. Was that correct, or should I add it as supported for 1.10.13?

@potiuk
Member

potiuk commented Nov 6, 2020

Ty for the tips. Looks like pre-commit got me through the mermaid errors however the last run failed with the following and I'm not sure where to go from there.

That looks like a transient error. I think you can mark it as "ready for review" and rebase to latest master

@r-richmond marked this pull request as ready for review on November 7, 2020 19:35
@r-richmond
Contributor Author

That looks like a transient error. I think you can mark it as "ready for review" and rebase to latest master

Done

@kaxil changed the title from "Add python 3.9 to supported versions" to "Add Python 3.9 to supported versions" on Nov 7, 2020
@potiuk
Member

potiuk commented Nov 17, 2020

Looks good to me. I will just approve it and add the "full tests needed" label.

@potiuk added the "full tests needed" label (we need to run the full set of tests for this PR to merge) on Nov 17, 2020
@r-richmond force-pushed the test-3.9 branch 2 times, most recently from 3acc6d3 to 840e0d7, on November 19, 2020 20:24
@r-richmond
Contributor Author

Looks good to me. I will just approve it and add the "full tests needed" label.

Assuming the tests pass, am I OK to merge?

@potiuk
Member

potiuk commented Nov 19, 2020

Assuming the tests pass, am I OK to merge?

🤞

@potiuk
Member

potiuk commented Nov 19, 2020

I also prepared the DockerHub configuration in the meantime to support "dockerhub" image builds so yeah - I hope it will be ok :)

@potiuk
Member

potiuk commented Feb 23, 2021

Unfortunately, I tried to get the 3.9 build working but to no avail.

We are using the snowflake connector, which has very narrow requirements for pyarrow -> 0.17.1 is the latest version supported. And that version requires numpy == 1.16.0, which has no Python 3.9 release and does not even compile for Python 3.9, because it uses the tp_print slot which has been removed in Python 3.9 (https://bugs.python.org/issue39361)

error: ‘PyTypeObject’ {aka ‘struct _typeobject’} has no member named ‘tp_print’; did you mean ‘tp_dict’?

So unless we can upgrade the snowflake connector to use a newer pyarrow version, we cannot add Python 3.9.

The good news is that I am a Snowflake employee now and will try to make it happen :)

@r-richmond
Contributor Author

First off, thank you for taking another stab at this. It turns out it's a ton of work to get the next version working...

We are using the snowflake connector, which has very narrow requirements for pyarrow -> 0.17.1 is the latest version supported.

Pardon my ignorance here, but is there no way to specify Python versions for certain providers?
I.e. it seems suboptimal to require that every provider support the new Python version before Airflow can be updated, as that could lead to less popular providers blocking the rest of the community.

One historical example of a single package/provider causing long-term Python version issues was the snakebite package, which caused all sorts of Python 3 headaches.

The good news is that I am a Snowflake employee now and will try to make it happen :)

Congrats! That also definitely gives some hope for updating the snowflake connector :)

@potiuk
Member

potiuk commented Feb 23, 2021

I.e. it seems suboptimal to require that every provider support the new Python version before Airflow can be updated, as that could lead to less popular providers blocking the rest of the community.

Yeah. It is suboptimal. And we already had a number of problems because of that. But the way it works currently is that - unfortunately - all providers and core run in one Python env, which means they have to share dependencies. Not much we can do now, and while we have some ideas on how to improve that, this is - in most cases - not as big a problem as it seems. In most cases we have lower-bound dependencies (rather than upper-bound) and different providers have only a small subset of overlapping dependencies.

So there are just a few providers that are problematic - and snowflake is one of them. The benefits of not having to introduce isolation between providers - simplicity, debuggability, easy deployment, easy management of envs, upgrades, etc. - are important, and we developed the whole CI process to keep the "golden" dependencies up-to-date.

There are really two problematic providers/dependencies:

  • apache-beam
  • snowflake

If we can help to sort them out, we should be really good - without having to introduce another approach. Taking into account that 3 of the Airflow committers/PMC members are now @ Snowflake - I think we can improve the latter. And 3 committers of Beam are also with us, so.......

@ashb
Member

ashb commented Feb 23, 2021

I.e. it seems suboptimal to require that every provider support the new Python version before Airflow can be updated, as that could lead to less popular providers blocking the rest of the community.

Yeah. It is suboptimal. And we already had a number of problems because of that.

The point about all operators running in one env is not @r-richmond's point -- it is more that none of the other providers can be marked as supporting Py3.9 until every provider does.

If this isn't fixed soon, we should have a way of excluding certain providers from running tests on some Python versions, so that we could release 3.9 support for everything else.

@potiuk
Member

potiuk commented Feb 23, 2021

Yep. I think the same. If we cannot make snowflake and beam Py3.9 compliant, we should exclude them from the community-managed providers. I hope it will not come to that, and this thread is an important input to at least one of the discussions I am going to have about it. SOON

@potiuk
Member

potiuk commented Feb 24, 2021

I wish I could say that I helped, but I did not; however, the problem will soon be solved: snowflakedb/snowflake-connector-python@ef70176#diff-60f61ab7a8d1910d86d9fda2261620314edcae5894d5aaa236b821c7256badd7

I will double check what the plans are for releasing this version, but once it is out, I think we will be able to support 3.9 :)

@potiuk
Member

potiuk commented Feb 24, 2021

The library is planned to be released next week (I know it from a good source) :). So it looks like Python 3.9 support is coming soon :)

@kaxil
Member

kaxil commented Feb 24, 2021

+1 -- We can skip the tests for the entire provider for that specific Python version and add some sort of meta information to setup.py for providers -- since I can see this causing issues in the future too.

(or remove it from the Airflow repo entirely)

@potiuk
Member

potiuk commented Feb 24, 2021

This is not only about tests - those are dependencies, which would mean that if we wanted to skip a particular provider, we would have to skip it entirely.
Right now we have a 'blessed' set of dependencies for our CI build. We should exclude all such 'bad operators', when it happens, even from that. We already do that for Apache Beam. But I think for Snowflake it will be solved next week, so issue solved.

@kaxil
Member

kaxil commented Feb 24, 2021

+1

@r-richmond
Contributor Author

Not so subtle bump... Any updates here? Looks like snowflake has released a couple of versions, so that blocker would be gone now if I've followed everything correctly.

@potiuk
Member

potiuk commented Mar 14, 2021

Indeed snowflake is now bumped and I believe it is not a blocker any more.

I built the image recently and unfortunately I think we have another dependency problem - pyarrow/numpy, which we are using in a version that has no binary packages built for Python 3.9. Possibly that can be fixed by building the older versions of those libraries, but ultimately I think it means that we need to either get rid of those requirements from Airflow core (which I believe are still there) or upgrade to newer versions of them. I think @ashb was looking at that at some point in time. Maybe it's even done already.

To be honest (and to set the expectations right) - Python 3.9 support is rather low priority; I usually look at it somewhere between other tasks, when I am waiting for CI etc. So maybe you would like to continue taking a look at it yourself, @r-richmond? We are dealing now with a much higher priority task - speeding up the tests for everyone by 2x - 6x.

I am happy to help if you just try to run it and do some initial investigation. What we really need to achieve is to have a CI image built with this command (--build-cache-local is there so that you can iterate on the image locally without pulling/using caches from remote):

./breeze build-image --upgrade-to-newer-dependencies --python 3.9 --build-cache-local

So if you would like to try to build it, maybe you can see whether the problems I mentioned can be fixed.

WDYT @r-richmond ?

@ashb
Member

ashb commented Mar 14, 2021

I think @ashb was looking at that at some point in time. Maybe it's even done already.

Not yet, I got as far as creating #12500 about it but haven't started working on it. The only "difficult" bit is working out what this means for the DbApiHook method that returns a dataframe.

@r-richmond
Contributor Author

r-richmond commented Mar 14, 2021

So you mentioned this before, but there's a bootstrap issue of some sort here, right?
Running
./breeze build-image --upgrade-to-newer-dependencies --python 3.9 --build-cache-local
results in
Error response from daemon: manifest for apache/airflow:python3.9-master not found: manifest unknown: manifest unknown
Which makes sense, since this is the PR trying to add 3.9...

Side question: has there been any discussion of switching to another package management solution like pipenv or poetry? The reason I ask is that it feels like a lot of errors around package version conflicts have been slipping in lately. By using one of those solutions, Airflow would have better control over this, and it would prevent invalid package management changes from slipping in.

An example I stumbled across while playing with this was
<debug>PackageInfo:</debug> Invalid constraint (googledatastore>=7.0.1,<7.1; python_version < "3.0" ; extra == "gcp") found in apache-beam-2.9.0 dependencies, skipping
another example was #13093

@potiuk
Member

potiuk commented Mar 14, 2021

results in
Error response from daemon: manifest for apache/airflow:python3.9-master not found: manifest unknown: manifest unknown
Which makes sense, since this is the PR trying to add 3.9...

I believe this can be fixed by setting SKIP_CHECK_REMOTE_IMAGE to true - then we skip checking whether the remote image exists.
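
Something along these lines (a rough sketch combining the two hints above; exact behaviour depends on the breeze version you have checked out):

# Skip the check for a remote python3.9 image that does not exist yet,
# then build the CI image locally against the newest dependencies
export SKIP_CHECK_REMOTE_IMAGE="true"
./breeze build-image --upgrade-to-newer-dependencies --python 3.9 --build-cache-local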

Side question: has there been any discussion of switching to another package management solution like pipenv or poetry? The reason I ask is that it feels like a lot of errors around package version conflicts have been slipping in lately. By using one of those solutions, Airflow would have better control over this, and it would prevent invalid package management changes from slipping in.

Oh yeah, we tried all that. I wrote an email to the devlist which was pretty much "Let's use poetry - it will solve all our problems". That was 2 years ago.

And those tools were all discarded as they did not work well in our complex case. We even have an entry specifically about pip-tools and poetry in https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#airflow-dependencies, together with an explanation of why they would not work. Here is more context on why those tools do not work: https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#dependency-management

There is nothing magical in what pipenv or poetry does. Both 'poetry' and 'pip-tools' are rather more opinionated than pip, and that actually makes them less suitable for Airflow - Airflow is both a library and an application, and neither pip-tools nor poetry handles that case well. We had lots of discussions about this on the devlist.

I'd really love to see a solution to the problem with other tools where:

  • you do not limit Airflow with 'install-requires', where you can freely keep a set of a few hundred libraries that you can upgrade at will
  • but you can install the latest set of Airflow + providers and be sure they work together in a consistent way (no matter if some transitive dependencies were released recently).

Happy to brainstorm on that - but rather than 'use XXX', I would love to see someone implement it. I.e. "this is how the tool solves these two problems":

a) a user wants to install a released version of Airflow (say 2.0.2) with a set of providers/extras (for example google/snowflake/amazon) and get it working with non-conflicting requirements
b) a user wants to upgrade most dependencies to a newer version because they need to write a DAG that uses a newer version of a dependency.

So far poetry/pip-tools can work well with a), but they fail miserably when you add b) to the mix. But I am happy to hear how it can be done :)

example of one I stumbled across while trying to fool with this was
<debug>PackageInfo:</debug> Invalid constraint (googledatastore>=7.0.1,<7.1; python_version < "3.0" ; extra == "gcp") found in apache-beam-2.9.0 dependencies, skipping
another example was #13093

@r-richmond
Contributor Author

r-richmond commented Apr 25, 2021

@potiuk

On Python 3.9: I see you've opened #15515. Should I rebase this PR on master, or did you want to take over this PR with #15515?

"Let's use poetry - it will solve all our problems". That was 2 years ago. ...
Happy to brainstorm on that - but rather than 'use XXX', I would love to see someone implement it. I.e. "this is how the tool solves these two problems":

Ty for the background; given that, I understand why Airflow won't switch off of pip. That said, what do you think of instead introducing tests that use poetry or pipenv to ensure that the requirements we have are always valid? I.e. the tests would run a lock step to make sure no conflicting requirements slip into Airflow.

@potiuk
Member

potiuk commented Apr 26, 2021

For Python 3.9 I think I will simply go with #15515, as in order to merge it there is one more "tricky" point, which is to make sure that Python 3.9 is also built in the "build images" workflow. Once that is merged it will include all the changes from this PR, so I think we will be able to close this one (I will be happy to make you a co-author of #15515 :).

Ty for the background; given that, I understand why Airflow won't switch off of pip. That said, what do you think of instead introducing tests that use poetry or pipenv to ensure that the requirements we have are always valid? I.e. the tests would run a lock step to make sure no conflicting requirements slip into Airflow.

pip 21 (PR to be merged in #15513) already has a working resolver that detects conflicts much better, and we will keep using it to prevent conflicts once it is merged. So this problem will be solved. I think once the teething problems of the new resolver have been sorted out, we will be able to solve all the current problems (with #15513). I think the new pip removes any need for extra tools like poetry or pipenv (and is much more versatile IMHO). @uranusjr maybe you can comment on that one.

The problem we have (also described in https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#airflow-dependencies) is that Airflow is both a library and an application. The constraint mechanism allows us to handle that well.

  • On one hand we can keep the dependencies open in setup.py (so that the users of Airflow are not limited and can upgrade dependencies after Airflow is released)
  • At the same time, with the --constraint switch and by publishing a "golden" set of constraints, we are able to make sure that users installing a released version of Airflow can also install it in the future.

AFAIK (and please correct me if I am wrong) neither Poetry nor Pipenv brings anything more; actually, the workflows they support are limiting. They are more opinionated and allow either the library approach (everything as open as possible) or the application one (everything fixed with a poetry/pipenv .lock file). From what I understand, the information about dependency versions from those .lock files is not available on PyPI after packages are released. You either "bake it" into the package (for an application) or strip it (for a library). Unlike pip, as far as I know, neither Poetry nor Pipenv has the remote-URL-for-lock-files capability that the --constraint flag of pip gives us. With pip we can take "open-dependency" Airflow and point to the "fixed" constraints that we store in an orphaned branch in our GitHub repo and voila - open-dependency "Airflow" is installed in a repeatable way with a "golden" set of dependencies.

However, I am really open to adding "support" for people who are using Poetry or Pipenv. I know there are many users of poetry/pipenv out there. If you are one of them and could write a nice HowTo - how to take the Airflow constraints and install Airflow with Poetry or Pipenv - I am all ears.

It would have to somehow use the constraint files we have - otherwise it will not work in the future with the open-ended approach of Airflow dependencies. I looked at it briefly, but I could not find a one-line equivalent of pip install apache-airflow[extras]==2.0.2 --constraint https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.7.txt that we have for a pip installation.
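
Written out as a command (the version numbers and extras are just the examples used above):

pip install "apache-airflow[google,snowflake]==2.0.2" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.7.txt"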

If this means that we will have to store a poetry.lock file in our orphan branch next to the constraints, I am more than happy to help with automating it.

Would you be willing to help with that and prepare/maintain such "instructions"?

@r-richmond
Contributor Author

For Python 3.9 I think I will simply go with #15515, as in order to merge it there is one more "tricky" point, which is to make sure that Python 3.9 is also built in the "build images" workflow. Once that is merged it will include all the changes from this PR, so I think we will be able to close this one (I will be happy to make you a co-author of #15515 :).

In that case I'll close this PR out. Thanks for tackling the extra effort for 3.9. As far as co-authorship goes, I already have my Contributor label from other PRs, so there is no need to add me as a co-author on that one. Besides, you did a lot of the work in this PR as well.

pip 21 (PR to be merged in #15513) already has a working resolver that detects conflicts much better, and we will keep using it to prevent conflicts once it is merged

Oh, I'm not familiar with the new resolver. As long as it can detect conflicts, this addresses my feedback/concern.

AFAIK (and please correct me if I am wrong) neither Poetry nor Pipenv brings anything more; actually, the workflows they support are limiting. They are more opinionated and allow either the library approach (everything as open as possible) or the application one (everything fixed with a poetry/pipenv .lock file)

However, I am really open to adding "support" for people who are using Poetry or Pipenv. I know there are many users of poetry/pipenv out there. If you are one of them and could write a nice HowTo - how to take the Airflow constraints and install Airflow with Poetry or Pipenv - I am all ears.

So I'm no expert, and my point of view is very much from the application side. From that point of view, the big thing pipenv does is take open-ended requirements and spit out an explicit list of what gets installed. The nice thing is that you can pin versions in the Pipfile and leave others open, which is much more flexible than a normal frozen requirements file (and of course there is the big win of reproducible builds). The only fragile part is that pipenv and poetry won't let you lock if an invalid set of requirements slips into Airflow; historically this has happened several times with Airflow (#13093, #14145, #13558), but if, as you said above, the new pip resolver detects these, we should be good.
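
As a minimal sketch of that flow (the pin below is hypothetical, not a recommendation from this thread):

pipenv install "apache-airflow==2.0.2"   # pin Airflow in the Pipfile, leave other deps open
pipenv lock                              # resolve and freeze everything into Pipfile.lock; fails if requirements conflict
pipenv sync                              # recreate the exact locked environment elsewhere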

Anyway, I could write a small guide on using Pipenv with Airflow. What's interesting is that there is nothing special about Airflow + pipenv, so I don't know how much of it would be novel, but if you think it would help I'd be happy to. Where do you think it should go in the repo?

@r-richmond closed this on Apr 26, 2021
@potiuk
Member

potiuk commented Apr 27, 2021

Anyway, I could write a small guide on using Pipenv with Airflow. What's interesting is that there is nothing special about Airflow + pipenv, so I don't know how much of it would be novel, but if you think it would help I'd be happy to. Where do you think it should go in the repo?

Cool. I think the best place is the installation doc: https://github.com/apache/airflow/blob/master/docs/apache-airflow/installation.rst. I believe this is the best place, as we are talking mostly about the "user" side, i.e. not about poetry/pipenv used during development of Airflow, but about installing Airflow from PyPI (via poetry/pipenv). This is the main place where people learn how to install Airflow as users.

There is potentially a second doc: https://github.com/apache/airflow/blob/master/INSTALL, which covers the "low-level" installation from sources. That one always has to be there, because the PyPI release is only the "convenience" package - the only "official" release of Airflow is at https://downloads.apache.org/airflow/ and INSTALL describes how to install Airflow from those sources. But since this is a "power user" doc, I think leaving pip as the only installation method there is quite OK - the less "freedom" we have there, the better.
