
Productionize GitHub Action to update dependencies on schedule #188

Open
peterdemin opened this issue Jun 11, 2020 · 4 comments

peterdemin commented Jun 11, 2020

One of the friction points in the adoption of pip-compile-multi is the lack of support from Dependabot, which automates regular updates of the lock files.

The requirements for an update system are:

  1. Run regularly without an explicit trigger from a developer.
  2. Allow locking process customization to support hashed dependencies.
  3. Open a GitHub pull request if any of the lock files changed.
  4. Be cost-free for open-source projects.

One way of implementing this is by using GitHub Actions.
PoC action definition: https://github.com/peterdemin/pip-compile-multi/blob/master/.github/workflows/pipcompilemulti.yml

Example update PR generated by the GitHub Action: #187

Known problems:

  • The pull request is opened on behalf of the user who provided the personal access token (PAT), which means that person won't receive an email notification. The PR will still show up in their Recent Activity section, though.
  • Old pull requests are not automatically closed when new ones are created, but manually deleting the obsolete PRs is relatively low effort.

CC @davidism


davidism commented Jun 20, 2020

I've been using Dependabot and pip-compile for a month now, and it's starting to show some issues. Some of it is fairly specific to Pallets, so I wouldn't expect it to be fixed by pip-compile-multi.

Dependabot generating one PR per package is really noisy, especially across 8 repos and counting that all have the same dev dependency pins. So I do like the idea of pip-compile-multi making a single PR for all upgrades, and having it run less often.

I don't think I want to add a workflow file to every repository, since I would have to keep any changes synchronized, although that might be the short term solution. In the long run, I'd like a bot that can update or merge an existing PR, it doesn't have to be as complex as Dependabot. I'm especially worried about not getting the notifications, since I will not remember to check every repo.

On the pip-compile side, there are two issues. Specific to Pallets, Jinja and Click have dev dependencies (Sphinx and pip-tools) that depend on them in turn, so released versions get pinned, and pip currently installs those over an editable local install. I have issues open with them, but am not confident in a fast resolution. Maybe pip-compile-multi could have a flag to remove a line after building, but that gets messy and it's not pcm's job.
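The "flag to remove a line after building" idea could be sketched as a small post-build cleanup step. This is only an illustration of the workaround, not a pip-compile-multi feature; the file and package names are hypothetical:

```python
from pathlib import Path

def strip_pin(lock_file: str, package: str) -> None:
    """Remove the pinned line for `package` from a pip-compile lock file,
    along with its indented continuation lines (hashes, '# via' comments)."""
    path = Path(lock_file)
    kept, skipping = [], False
    for line in path.read_text().splitlines(keepends=True):
        if line.lower().startswith(f"{package.lower()}=="):
            skipping = True  # drop the pin itself
            continue
        if skipping and line.startswith((" ", "\t")):
            continue  # drop continuation lines belonging to the dropped pin
        skipping = False
        kept.append(line)
    path.write_text("".join(kept))

# e.g. strip_pin("requirements/dev.txt", "jinja2") after pip-compile-multi runs
```

Running this as an extra workflow step after the compile step would keep the hack out of pcm itself.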

The other issue is just that having separate requirements files for separate envs feels both very verbose and not descriptive enough. Three envs require six files. But we actually have five envs (six if you separate maintainer vs contributor tools), Tox runs pip-compile and mypy as separate envs, and it seems silly to have separate .in files for each of those just to avoid installing dev.txt. I think ultimately I want a tool that can maintain pins in tox, or multiple envs / input templates in a single config file.


All that said, I think the GitHub workflow you came up with is pretty cool, I don't want to discourage you from going forward with it. But I need to think more about what Pallets will do before using it.

peterdemin commented

Thanks for the thorough reply! The lack of notifications can be solved by creating a separate GitHub account; I think it can be shared across multiple repos if each repo creates a unique auth token. I need to look more into it.

I don't feel like having many .in + .txt (+ .hash) files is silly, as long as they are all inside a dedicated requirements/ directory. To combine them, PCM would need to introduce some kind of additional config structure, similar to Pipenv, and that contradicts the KISS principle. I've seen projects with 16+ files in the requirements dir, and it looks okay as long as their management is documented, automated, and verified.
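For illustration, a three-environment layout under requirements/ pairs each source .in file with a compiled .txt lock file (the names here follow a common convention, not a requirement):

```
requirements/
├── base.in   →  base.txt
├── test.in   →  test.txt   (starts with: -r base.in)
└── dev.in    →  dev.txt    (starts with: -r test.in)
```

Each .in file lists loose dependencies; pip-compile-multi resolves them into the matching .txt pins.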

As for keeping changes synchronized across all repositories, I think it's possible to extract the basic case into a marketplace GitHub Action, so it can be referenced from each repo and updated in one place. It might also make sense to have a Pallets-specific shared Action, because of the auth token sharing issue.
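A shared action along these lines could look roughly like the following composite-action sketch. This is not a published action; the repository name, inputs, and steps are assumptions for illustration:

```yaml
# action.yml in a hypothetical shared repo, e.g. <org>/pip-compile-multi-action
name: pip-compile-multi update
inputs:
  token:
    description: Token used to open the pull request
    required: true
runs:
  using: composite
  steps:
    - run: pip install pip-compile-multi
      shell: bash
    - run: pip-compile-multi
      shell: bash
    - uses: peter-evans/create-pull-request@v6
      with:
        token: ${{ inputs.token }}
        title: Update pinned dependencies
```

Each repo would then reference the action by tag, so fixes land in one place and propagate on the next run.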

@peterdemin peterdemin changed the title Automatically update dependencies on a fixed schedule Productionize GitHub Action to update dependencies on schedule Mar 20, 2024
aebrahim commented

We use peter-evans/create-pull-request to make the pull request - I think it nicely abstracts the problems away.

I think the part that would be most helpful for us is the ability to have pip-compile-multi also generate a helpful pull request description (e.g. pull in the changelogs for the versions that were updated).
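A description generator of that kind could start by diffing the old and new lock file contents to list the changed pins; a minimal sketch (package names in the example are illustrative, and fetching actual changelogs is left out):

```python
import re

# Matches "package==version" at the start of a line in a pip-compile lock file.
PIN = re.compile(r"^([A-Za-z0-9_.-]+)==(\S+)", re.MULTILINE)

def pr_body(old_lock: str, new_lock: str) -> str:
    """Build a Markdown PR body listing packages whose pins changed
    between two lock file contents."""
    old = dict(PIN.findall(old_lock))
    new = dict(PIN.findall(new_lock))
    lines = ["Run of `pip-compile-multi` to upgrade all python dependencies.", ""]
    for name in sorted(new):
        if name in old and old[name] != new[name]:
            lines.append(f"- `{name}`: {old[name]} -> {new[name]}")
    return "\n".join(lines)
```

The resulting string could be passed to the `body` input of peter-evans/create-pull-request instead of the static text.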

For reference, here is the workflow we use, released here under the MIT and Apache 2.0 licenses:

on:
  workflow_dispatch: null
  schedule:
    # Monday 10AM UTC
    - cron: "0 10 * * 1"
name: Update Python dependencies
jobs:
  pip_compile_multi:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          cache: "pip"
          python-version: "3.11"
      - run: pip install pip-compile-multi
      - run: pip-compile-multi -d <our_directory> --backtracking --autoresolve --header=<our_header.txt>
      - uses: peter-evans/create-pull-request@v6
        with:
          branch: create-pull-request/pip-compile-multi
          title: Update python dependencies.
          body: Run of `pip-compile-multi` to upgrade all python dependencies.
          delete-branch: true
          labels: |
            Dependencies
          commit-message: |
            Update python dependencies.

            Run of `pip-compile-multi` to upgrade all python dependencies.

peterdemin commented

That's a great suggestion! I'd love to incorporate that in the GitHub action.
It shouldn't be a part of pip-compile-multi Python package, though.
