RFC for dependencies policy #3
Comments
I support this idea. I think that it is totally fine to have a PR sit around while we wait for upstream to get fixed (I used to not do this). I also think that we should be more aggressive in removing the existing dependencies that pull from GitHub. It's been almost a year since you opened the issue. I think it's fine to do a final push to get people to upload their versions, otherwise we disable the integration until someone picks it up and fixes it.
Actually, I don't think we should ever remove integrations (this is why I would prefer not to add "temporary" solutions). I think for abandoned libs we had better create our own PyPI packages.
I'd be OK with all of this as well. It'll definitely help to have a written policy we can all reference. If there are disagreements, someone can submit a PR against our policy docs rather than having the discussion in each PR. We probably should introduce a PR label for "Waiting for upstream release" or the like. Appreciate the work to write this all up!
As far as policy goes, we have the platform checklists
I think nobody is going to argue against the sensible OP. The sticking point is the last one (number 4) that the OP does not answer. For an upstream project that releases just once or twice per year, do we really want each drive-by contributor to create a new unmaintained PyPI project? Maybe what's missing is just a convincing story about why GitHub dependencies are discouraged in the first place. They are clearly convenient, so why is their cost always bigger than their value?
The cost is that they cannot be cached as we don't know what is inside the zip until it is downloaded. We have added a workaround inside Home Assistant by having people add
Note that number 4 is not theoretical -
I've tweeted at him to see if we can get his attention. Otherwise, let's treat it as an abandoned project.
I've added a new label "Waiting for Upstream".
I agree. There are also people who fork everything they run at home. Instead of helping develop the existing library, they work on their own fork and use it inside home-assistant. After some time they lose focus on the fork while the original library moves forward. We should also add a section saying that we want development to happen on the original library, and that a fork is only allowed if there is no other way.
@balloob if you don't object, I can create PyPI packages under the HA account for abandoned libs.
I think everybody agrees that PyPI is the preferred way to distribute a library. For new libraries, for new integrations, and for abandoned upstream libraries, requiring a new PyPI package is more or less a no-brainer. For existing PyPI packages where bugfixes are published within, say, a month, waiting is reasonable.

The only real question seems to be point 4: what to do when an upstream project is alive but its PyPI release schedule moves much more slowly than HA's. There are valid reasons for the upstream to behave like that. For that case, I don't think waiting is reasonable. We will not get the bugfix, but we will get merge conflicts with changes that do not require the updated package.

I also don't think it is good to create a temporary PyPI package. It litters PyPI and it sets us up for extra work once the temporary package is itself abandoned. I also happen to think that it is impolite to the upstream project.

In home-assistant/core#12126 (review) I proposed that GitHub requirements pointing at an upstream master branch commit could be okay. This gives us early access to approved fixes without a lot of extra work and it avoids one-off PyPI packages.

@balloob That sounds like using manual labor to fix a deficiency in the tools. Sorry, that is not really a convincing story.
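To make the two options being compared concrete, here is a hedged sketch of what the requirement lines might look like (the package name, repository, and commit hash are all hypothetical, not from this discussion):

```text
# Preferred: a pinned release published to PyPI
somelib==1.2.3

# Proposed fallback while waiting for a slow upstream release:
# an archive of one specific approved commit (immutable),
# rather than a branch name like "master" (which changes over time)
https://github.com/upstream/somelib/archive/0123abcd.zip#somelib==1.2.3
```

The point of pinning a commit hash rather than a branch is that the archive URL then always resolves to the same content, which is what makes the branch-based variant risky by comparison.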
It's not a deficiency in the tools. Pip can't know what's in a zip file. It's not aware that a zip file of a commit from GitHub always contains the same content. There are other problems with zip files as well.
I think I've been convinced. Another good reason to wait for upstream is that pointing at random master commits carries a bigger chance of introducing new bugs, since we would not be using a version that upstream intended to release to the public. I'm OK myself with the simple policy that, from this point forward, all requirements must be on PyPI.

In any of these cases the right solution would be to apply pressure upstream. If the project just has a slow release cycle, then so will the hass component. If it's a bug, see if you can get a point release published to fix it.
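A policy like "all requirements must be on PyPI" is easy to check mechanically. Below is a minimal sketch (not the project's actual tooling; the helper name and the classification labels are made up for illustration) of how a CI script could classify requirement lines and flag GitHub dependencies:

```python
import re

# Hypothetical helper: classify a requirements.txt line so a CI check
# could flag anything that is not a pinned PyPI release.
PYPI_PIN = re.compile(r"^[A-Za-z0-9][A-Za-z0-9._-]*==[A-Za-z0-9.]+$")

def classify(line: str) -> str:
    line = line.strip()
    if not line or line.startswith("#"):
        return "ignore"     # blank lines and comments
    if PYPI_PIN.match(line):
        return "pypi-pin"   # e.g. "somelib==1.2.3"
    if "github.com" in line:
        return "github"     # discouraged by the proposed policy
    return "other"          # unpinned or otherwise unusual

print(classify("somelib==1.2.3"))                                # pypi-pin
print(classify("https://github.com/u/somelib/archive/abc.zip"))  # github
```

A real check would be stricter (PEP 508 parsing, extras, environment markers), but even this rough split is enough to emit a "Waiting for Upstream"-style warning in review.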
I think the decision to wait or fork will ultimately be up to the contributor. I'd prefer people wait and try to convince upstream to release sooner, but if a contributor chooses to fork an upstream lib because they're not happy with the governance, that's their prerogative, and it isn't really up to hass to police the entire Python ecosystem. As long as it's on PyPI, that is sufficient for us.
I'm lost. Frankly, I don't see consistent arguments here. So I will pull out, and I will of course follow and encourage the final decision.

That's fine, it's a simple rule, and honestly I don't think it matters much because most projects do behave sensibly.
A year without comments, I think we can close this?

That year had two hotfix releases which added GitHub dependencies. I feel better about this rule now that I know it to be one that can be broken when needed.
Yep. We have a couple of forks published to PyPI, all with the
home-assistant/core#7069 was started long ago as an effort to reduce non-PyPI dependencies.
In home-assistant/core#12126 there was a discussion on (temporarily) adding such a dependency.
I propose the following:
For abandoned libs: