Consider doing release candidates #10882

Open

tometzky opened this issue Feb 4, 2022 · 18 comments
Labels
state: needs discussion (This needs some more discussion), type: maintenance (Related to Development and Maintenance Processes)

Comments

@tometzky

tometzky commented Feb 4, 2022

What's the problem this feature will solve?

This is a very popular project used by millions. Still, it is possible that a major version might break functionality or interoperability with other software. This is normal and expected, but when it happens it is stressful for the project's users and for its developers. It also adds hours of work for the many people analyzing and fixing their broken builds, and for project owners developing fixes at inconvenient times.

Describe the solution you'd like

Please consider releasing a -rc1 version a few days before releasing a new minor or major version.

For example, a few days before releasing 22.1.0, please release 22.1.0rc1, so that "pip install --upgrade --pre pip" will install it. Some users could then use "--pre" for their development builds, notice any errors, report them, and have them fixed in 22.1.0rc2 and so on. By the time 22.1.0 is released, there is a higher probability that it won't contain breaking changes.

Alternative Solutions

Some might prefer to pin pip to a known working version instead. But this is not a good solution: pip interacts with an online service and can break simply because the service changes. It also leaves people on an unmaintained version after months or years, and the longer one waits, the harder it becomes to upgrade across many major releases, further pushing people toward ever more outdated releases.

Additional context

I think that if a pip-22.0.0rc1 had been released a few days before pip-22.0.0, together with a documented recommendation to use "--pre" in development builds, the errors related to #10825 might have been avoided. And it was stressful for many, at least judging by the tone of the comments there.

@tometzky tometzky added the S: needs triage (Issues/PRs that need to be triaged) and type: feature request (Request for a new feature) labels Feb 4, 2022
@notatallshaw
Member

You might be interested in some of the discussion here: https://discuss.python.org/t/community-testing-of-packaging-tools-against-non-warehouse-indexes

@potiuk
Contributor

potiuk commented Feb 4, 2022

I think the success of this approach depends on the group of users who will be willing to test such a released RC.
I am happy to volunteer to be added to the "pre-release" users and test some of the use cases. We have a rather complex dependency setup in Apache Airflow and we "pin" pip there, but I am generally happy to make PRs with a new RC version (as long as it is released in a pip-installable form, which makes it easier to integrate with our CI) and do a number of manual checks around resolver behaviour - also as long as we have a mechanism that allows subscribing to notifications when such an RC is ready.

I am already subscribed to the project notifications, but this might slip through the cracks of the few hundred notifications I get a day. We could, for example, build such a list of users to notify by creating an issue in GitHub and mentioning them, similarly to what we do in Airflow when we prepare releases (and in our case we automate the generation of such a list).

Example here: apache/airflow#20208

Happy to help also with automating such list preparation if needed.

@potiuk
Contributor

potiuk commented Feb 4, 2022

Maybe we could gather here a list of people who would be willing to help? Commenting or a "hands up" could be a statement of intent.

@pfmoore
Member

pfmoore commented Feb 4, 2022

I'm still not clear why anyone interested couldn't simply install from master to do the testing. At least as an initial approach, that would give us a chance to get a sense of how much use this sort of pre-release testing would be. (And frankly, it would help the pip maintainers to convince ourselves that pre-releases aren't simply wasted effort, which is the lesson we learned from our previous ventures down this road...)

@potiuk
Contributor

potiuk commented Feb 4, 2022

I think it's because it's much easier and you want to reduce friction (so that people do it more willingly).

In our case the reason is rather simple. Running a full suite of tests involving many test cases is as simple as changing the version here: https://github.com/apache/airflow/blob/main/Dockerfile#L51 and making a PR.
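
For illustration, a rough sketch of what such a check could look like locally (the build-argument name AIRFLOW_PIP_VERSION and the exact build invocation are assumptions on my part; the Dockerfile linked above is the source of truth):

# Hypothetical smoke test: build the Airflow image against a pip release candidate.
git clone https://github.com/apache/airflow.git
cd airflow
docker build . --build-arg AIRFLOW_PIP_VERSION=22.1.0rc1 -t airflow-pip-rc-test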

For others, there might be other reasons: people in corporate environments might simply have git access blocked or not allowed by their corporate rules/firewalls.

Also, by organising people around a "planned" release, publishing an RC, and reaching out to them with "hey, I need your help", you build stronger social bonding and engagement around doing something "together". This is IMHO the most important reason - you stop treating everyone as an isolated individual and instead organise them around a common goal. This is a basic principle of why communities exist in the first place.

So I think there are very good reasons - both technical and social - why "hey, you random people, test whenever you want from main" is much worse than "hey, we need your help to test this release candidate of pip that we have worked on for the last few months".

@potiuk
Contributor

potiuk commented Feb 4, 2022

BTW, I know it might sound strange, but I am pretty good at this kind of community organisation, and if I can be part of it, I am happy to volunteer some of my time and try to help with engaging people - once you decide to try it.

I am really the last person to just say "do this, because you should".
I am happy to help, if only I am given the space to do so.

@edmorley
Contributor

edmorley commented Feb 4, 2022

Some might prefer to pin pip to a known working version instead. ... It also leaves people on an unmaintained version after months or years.

People shouldn't just pin and leave it at that. They should pin and enable something like Dependabot - which gives the best of both worlds:
https://docs.github.com/en/code-security/supply-chain-security/keeping-your-dependencies-updated-automatically/about-dependabot-version-updates

Perhaps documenting these best practices more clearly would also help?
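
As a minimal, illustrative sketch (the version number and file layout below are just examples, not pip-specific recommendations): pin pip in a requirements file that Dependabot watches, and add a small .github/dependabot.yml so the pin is kept fresh automatically.

# requirements.txt - pin pip to a known-good version (example version)
pip==21.3.1

# .github/dependabot.yml - let Dependabot propose pip upgrades on a schedule
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"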

@tometzky
Author

tometzky commented Feb 4, 2022

I think the willingness to do this might be related to how easy it is to use the preview version.

If there's an rc released:

# Use pip pre-releases in development environments, stable pip everywhere else.
python3 -m venv /tmp/venv
if [ "_$environment_type" = "_dev" ]
then
  /tmp/venv/bin/pip install --upgrade --pre pip
else
  /tmp/venv/bin/pip install --upgrade pip
fi

Versus "I'm developing in Python for 15 years, but I have no idea how to even start trying to install pip from main inside a venv".

Also just using "main" feels too fragile even for development environments. There's no guarantee that even automatic tests were run.

I believe that making an RC release does not have to be time-consuming. I think that, using https://docs.github.com/en/rest/reference/releases, it could be created with a simple script taking just two parameters: version_number and commit_number.
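
For what it's worth, a minimal sketch of such a script against the GitHub Releases REST API (the script name and token handling are hypothetical, and publishing an installable rc to PyPI would of course still be a separate step):

# create-rc.sh VERSION COMMIT_SHA - hypothetical helper that tags a pre-release on GitHub
VERSION="$1"
COMMIT="$2"
curl -s -X POST \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github+json" \
  https://api.github.com/repos/pypa/pip/releases \
  -d "{\"tag_name\":\"$VERSION\",\"target_commitish\":\"$COMMIT\",\"name\":\"$VERSION\",\"prerelease\":true}"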

@pradyunsg
Member

Also just using "main" feels too fragile even for development environments. There's no guarantee that even automatic tests were run.

Really? Our policy is that main should always be in a releasable state, and we keep our CI green basically all the time (barring breakages that we've not had time to investigate).

Anyway, I'm fine with this as an idea. I think there's value here but someone has to do the work of communicating about this and actually setting up the broader culture/group of users who'll actually test things.

@pradyunsg pradyunsg added the state: needs discussion (This needs some more discussion) and type: maintenance (Related to Development and Maintenance Processes) labels and removed the type: feature request (Request for a new feature) and S: needs triage (Issues/PRs that need to be triaged) labels Feb 4, 2022
@pradyunsg pradyunsg changed the title from "Consider releasing rc version a few days before releasing new minor/major version" to "Consider doing release candidates" Feb 4, 2022
@pradyunsg
Member

pradyunsg commented Feb 4, 2022

I'm pretty sure that it's pip install git+https://github.com/pypa/pip for those wondering how to install from GitHub directly.

It's also documented under "VCS support" in our documentation.
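
For example, trying pip's main branch in a throwaway virtual environment (the paths below are just illustrative):

# Create a disposable venv and install pip from the main branch on GitHub.
python3 -m venv /tmp/pip-main-venv
/tmp/pip-main-venv/bin/python -m pip install git+https://github.com/pypa/pip
/tmp/pip-main-venv/bin/pip --version   # should now report a development version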

@potiuk
Contributor

potiuk commented Feb 4, 2022

Anyway, I'm fine with this as an idea. I think there's value here but someone has to do the work of communicating about this and actually setting up the broader culture/group of users who'll actually test things.

If you are OK with that, I am happy to help with the next bigger release you plan. I can also add some automation around it, gather the list of people/issues, and prepare some communication templates.

@pfmoore
Member

pfmoore commented Feb 4, 2022

I can also add some automation around it, gather the list of people/issues, and prepare some communication templates.

Given that we're currently working almost entirely on assumptions and instinct, could you add to that some work to collect metrics, to allow us to evaluate the benefits more objectively? Things like how many people actually tested, what types of users they were (for example, did we get any testers from the sorts of corporate infrastructure that were hit by the HTML5 issue), and whether any bugs were filed/fixed as a result of the RC. Otherwise we could end up with extra process, but no certainty that we're gaining anything from it.

I'm still not convinced RCs will help, but I'm willing to be persuaded.

@potiuk
Contributor

potiuk commented Feb 4, 2022

Given that we're currently working almost entirely on assumptions and instinct, could you add to that some work to collect metrics, to allow us to evaluate the benefits more objectively? Things like how many people actually tested, what types of users they were (for example, did we get any testers from the sorts of corporate infrastructure that were hit by the HTML5 issue), and whether any bugs were filed/fixed as a result of the RC. Otherwise we could end up with extra process, but no certainty that we're gaining anything from it.

I think some of that can be done (e.g. the number of people and the types of issues they tested) - this might let us extrapolate the types of users that did test.

However, if we want to keep this "low effort", some metrics cannot be easily collected. Basically, anything that requires the users doing the tests to provide structured information beyond "Tested, works" defeats the purpose of low friction for users and low effort for maintainers.

I am sure we can run it as "an experiment" and try to come up with more metrics as we go - but treating it as an experiment, we should likely start with something very simple and add more as we progress (possibly working out more metrics as we learn). We have no baseline currently, so we will never know "for sure" whether things improved. But I am sure that if we do a few releases as an experiment we will be able to get some numbers (and better ones every time we do it).

That's how I usually work, in a "scientific" way: an Improve -> Observe -> Measure loop repeated many times is far more efficient than trying to work everything out upfront.

@potiuk
Contributor

potiuk commented Feb 6, 2022

If there is general consensus, I am happy to spend some of my time and produce an example issue/approach, based on the past ones, that could be used for the next release. It could be a good way to see whether this is a workable first step.

@pradyunsg
Member

pradyunsg commented Feb 7, 2022

Spending a few minutes to note stuff/thoughts pertaining to this topic:

Overall, I do feel that the solution here is not adding release-time processes (or automation; we have enough of that), but rather making changes via the more gradual change-management mechanisms we already have. We didn't use them for one change in this release -- and as far as I'm concerned, that was the issue with 22.0.

That isn't to say that I think pre-releases are a bad idea -- there's some version of this where a pre-release would've caught the main issues with 22.0. Rather, I don't think we'd get to that level of testing prevalence without significant effort, and I'm pretty sure we'd be slightly worse off if the effort to set this up were abandoned halfway. If someone's willing to put effort toward that, then they're welcome to! I'm wary of asking someone to pick this up, though. :)

  • we have a release policy that explicitly notes how to handle beta releases and how they affect our release cadence (which is what I'd prefer to call these, instead of rcs).
  • it's not immediately clear what sort of communication is going to be useful for this, so that's likely where help would be most welcome.
  • I'm pretty confident that doing rcs is not going to be worthwhile if it's not coupled with some amount of communication around it; and I'm not sure what the overhead introduced by that will be.

Notably, our experience with pip 10.0 is part of the hesitation here -- we had multiple betas, a bunch of announcements and a decent amount of communication around the release that we knew had a fairly significant change. It still hit a lot of issues because multiple projects did not care about that change, until we cut a stable release. It was still painful overall.

@potiuk
Contributor

potiuk commented Feb 7, 2022

I understand the hesitation and the bad experiences, but I think if we try it as an experiment, as you mentioned, there is not a lot of overhead in preparing betas (I'm happy with that name, no problem).

I think a lot has changed since version 10 (2018):

  • People have become more aware of how important pip is in their pipelines (certainly we have in Airflow)

  • Using GitHub as a communication medium is much more viable now than it was then (even looking at the activity in the Insights of the pip project, GitHub activity after 2018 seems on average way higher than it was before)

[screenshot of the pip project's GitHub Insights activity graph]

  • I also think GitHub itself now has many more people who are users, not only casual contributors - we have a huge base of issues people have raised (and they had to create a GitHub account just to raise an issue), and we might tap into those

I do not know exactly what you did in 2018, but I think there is also a difference in the way I propose to reach out. I do not want to reach out "in general" via mailing lists etc. I want to reach out to specific people - individually: those who were involved in pip issues and PRs - and ask them personally to help test the beta version. In one issue, but individually mentioning every person who was involved.

What works in this case is building "mutual expectations" at the "individual", not "corporate", level.

This is a pure psychological effect. If someone wanted something from pip (raised a concern, raised an issue, complained, etc.), it is only "appropriate" to expect them to help when the committers of pip ask for help. That even justifies reaching out directly, and it is not considered a "cold call" or "spam".

This also has a much more important effect: it reinforces exactly the behaviours you were complaining are lacking. I believe you complained that "users" only complain but do not actually help or pay money to keep things running. By specifically reaching out to those people individually, pip maintainers are very clear and explicit about the expectations they have of their users: you expect them to help. If those individual people do not help, they have far less "right" to "just complain" - because they explicitly failed your expectation, so they cannot expect theirs to be fulfilled. And if they do complain, you have a much stronger "right" to say "you were asked individually to help but you did not - why do you complain now?". I bet next time they will help, just to have the "right" to complain.

And even people who say "I do not want to be on that list any more" - those will have no right to complain at all. They have - voluntarily - declared this way that they have no interest in pip working flawlessly.

This is just psychology: trying to build the right "mutual" expectations and a process with a psychological "self-reinforcing" effect. With every iteration, there will be more people who are not only more engaged but also painfully aware of the consequences of their own inaction.

@potiuk
Contributor

potiuk commented Feb 7, 2022

In the end, it is of course the pip maintainers' decision whether to try it or not.
But if I get an "OK to do the experiment", I am happy to help. Up to you.

@potiuk
Contributor

potiuk commented Feb 14, 2022

I calculated some stats based on what we have done in Airflow:

Issue | Num Providers | Num Issues | Issues Tested | Issues Tested (%) | Involved | Commenting | Involved who commented | Extra people | User response (%)
----- | ------------- | ---------- | ------------- | ----------------- | -------- | ---------- | ---------------------- | ------------ | -----------------
16456 | 67 | 72 | 28 | 38 | 48 | 21 | 19 | 2 | 39
17037 | 14 | 28 | 4 | 14 | 21 | 9 | 6 | 3 | 28
17268 | 31 | 65 | 29 | 44 | 36 | 13 | 12 | 1 | 33
17922 | 67 | 44 | 26 | 59 | 30 | 17 | 16 | 1 | 53
18028 | 3 | 5 | 2 | 40 | 3 | 2 | 1 | 1 | 33
18638 | 17 | 55 | 26 | 47 | 35 | 15 | 14 | 1 | 40
19328 | 56 | 53 | 27 | 50 | 48 | 18 | 16 | 2 | 33
19414 | 2 | 10 | 6 | 60 | 8 | 3 | 1 | 2 | 12
19895 | 14 | 43 | 20 | 46 | 31 | 12 | 11 | 1 | 35
20097 | 1 | 8 | 3 | 37 | 4 | 4 | 1 | 3 | 25
20220 | 1 | 10 | 9 | 90 | 5 | 2 | 1 | 1 | 20
20615 | 23 | 90 | 36 | 40 | 48 | 23 | 22 | 1 | 45
20766 | 2 | 3 | 0 | 0 | 3 | 1 | 1 | 0 | 33
21348 | 25 | 53 | 5 | 9 | 24 | 9 | 6 | 3 | 25
21443 | 27 | 73 | 43 | 58 | 33 | 17 | 14 | 3 | 42
Total | 350 | 612 | 264 | 43 | 196 | 92 | 88 | 4 | 44

A summary:

The stats cover the period June 2021 - Feb 2022.

In total we had 15 releases covering 350 provider packages. Those provider packages resolved 612 issues. Out of those 612 issues, 264 were tested, which means our community tested 43% of all issues - features and bugfixes. The "per-provider" test ratio was much higher (hard to say exactly without some more "complex" logic).

There were 196 people involved, of whom 88 actively took part and commented on their issues (either "yes, it works" or "no, it does not work") - a 44% "response rate".

There were 4 people who commented without being involved (which basically means that across these 15 releases only 4(!) people commented on an issue without being directly mentioned in it).

I leave it here for consideration - these are the stats we have in Airflow, so they might not be directly relevant for the pip community, but I think they are worth looking at.

potiuk added a commit to potiuk/airflow that referenced this issue Jan 4, 2023
We've only supported installing `pip` from released packages with
a version number, but since `pip` does not publish release candidates
(as extensively discussed in pypa/pip#10882)
we cannot use released versions for that.

We still want to help `pip` maintainers and be able to test the versions
they release as early as possible, so we add support for installing
`pip` in our toolchain from a GitHub URL.

That will allow us to test a new `pip` version as soon as a designated
branch of `pip` contains something resembling a release candidate
ready for testing.

potiuk added a commit to apache/airflow that referenced this issue Jan 5, 2023
ephraimbuddy pushed a commit to apache/airflow that referenced this issue Jan 12, 2023
kosteev pushed a commit to GoogleCloudPlatform/composer-airflow that referenced this issue Mar 31, 2023
kosteev pushed a commit to GoogleCloudPlatform/composer-airflow that referenced this issue Apr 4, 2023
ahidalgob pushed a commit to GoogleCloudPlatform/composer-airflow that referenced this issue Nov 7, 2023
kosteev pushed a commit to GoogleCloudPlatform/composer-airflow that referenced this issue Sep 19, 2024
kosteev pushed a commit to GoogleCloudPlatform/composer-airflow that referenced this issue Nov 8, 2024

All of the above carry the same commit message as the potiuk/airflow commit quoted earlier.