This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Feature request: pip-compile support #2334

Closed
bullfest opened this issue Aug 1, 2018 · 57 comments · Fixed by #10377

@bullfest

bullfest commented Aug 1, 2018

Cool project!

It would be nice to have support for the pip-tools pip-compile command (basically creating lock files with pinned pip dependencies from specification files).

Proposed solution
If it's possible a solution could be to simply allow defining arbitrary lock-file commands if containerization is good enough so that it doesn't pose a security issue (haven't looked at any of the code, so no idea of how the project currently works).

Otherwise something like

"pipCompile": {
  "enabled": true 
   "inFile": "requirements.in"
   "outFile": "requirements.txt"
}

would probably be a good configuration that runs the command
pip-compile [--output-file <outFile>] [<inFile>].

@bullfest bullfest changed the title pip-compile support Feature request: pip-compile support Aug 1, 2018
@rarkins
Collaborator

rarkins commented Aug 1, 2018

Interesting. This would probably correspond to our concept of “lock file maintenance”, but in this case we'd consider requirements.in to be the “package file” while requirements.txt functions as the lock file. In that situation you'd also want Renovate to avoid updating requirements.txt directly, because updating every dependency to its latest version might not work.

@rarkins rarkins added the type:feature, needs-requirements and priority-4-low labels Aug 1, 2018
@bullfest
Author

bullfest commented Aug 1, 2018

Yes, but as requirements.in is essentially a requirements.txt with a different name, one could simply change the fileMatch field in pip_requirements to something like ["^requirements.in$"], or am I missing something?

Also reasonable default values for the conf-dict would probably be

"pipCompile": {
  "enabled": false 
   "inFile": ""
   "outFile": ""
}

Enabling the tool without specifying in/out files would then result in the tool being run without any arguments/flags.

@rarkins
Collaborator

rarkins commented Aug 1, 2018

I think I’d define this as a new manager called pip_compile that has a default match for requirements.in instead of requirements.txt. Output file can be generated by replacing .in with .txt. More advanced renaming can be deferred.

Majority of the other logic would be calling functions in pip_requirements, which you’d probably leave disabled.

@craigds

craigds commented Jan 19, 2020

We use pip-compile, and currently our only choice in this space (I think?) is Dependabot. We'd like options, though, and Renovate has come highly recommended.

Output file can be generated by replacing .in with .txt. More advanced renaming can be deferred.

It's (much) more complicated than that - .txt is a lockfile in this scenario, it contains the whole dependency tree, not just the direct dependencies.

Essentially, you change the .in file as required, then run pip-compile over it to generate the .txt file. For a usefully limited PR for package X, you'd probably want to then filter the .txt file diff to limit the PR to just the dependencies of package X, because pip-compile will otherwise update everything at once.

A common pip-compile workflow is to have multiple pairs of files:

requirements/
    base.in
    base.txt
    local.in
    local.txt
    production.in
    production.txt
    test.in
    test.txt

... which have to be compiled in a specific order. production.in lists only production dependencies, and includes a line that says -c base.txt to depend on base.txt (not an .in file!)

So when updating a dependency mentioned in base.in, you'd want to first compile base.in to produce base.txt, and then compile production.in to produce production.txt.

pip-compile has recently added direct support for dependent requirements via the -c option:
https://github.com/jazzband/pip-tools#workflow-for-layered-requirements

This workflow is described in the first 1/3 of https://jamescooke.info/a-successful-pip-tools-workflow-for-managing-python-package-requirements.html

Our current addition to this for our tooling is the in.list file which just contains a list of .in files in the right compile order. We made that bit up; not sure what other orgs do.

@rarkins
Collaborator

rarkins commented Jan 20, 2020

@craigds thank you for the detailed description.

For a usefully limited PR for package X, you'd probably want to then filter the .txt file diff to limit the PR to just the dependencies of package X, because pip-compile will otherwise update everything at once.

Could this be achieved using pip-compile --upgrade-package X==2.0.0?

Regarding ordering, I'm thinking that Renovate could determine the required ordering by parsing/understanding the -c lines at the time of extraction. It would then build a directed graph and update the files in the required order.

@rarkins
Collaborator

rarkins commented Jan 20, 2020

It would be great if you or anyone in this thread could build up an example repo along the lines described:

requirements/
    base.in
    base.txt
    local.in
    local.txt
    production.in
    production.txt
    test.in
    test.txt

@craigds

craigds commented Jan 20, 2020

Could this be achieved using pip-compile --upgrade-package X==2.0.0?

Looks like it, yes, though I haven't used that option myself.

Regarding ordering, I'm thinking that Renovate could determine the required ordering by parsing/understanding the -c lines at the time of extraction. It would then build a directed graph and update the files in the required order.

yes, that'd be amazing 👍 note that -r is common instead of -c; I think the difference is that -c allows packages to appear multiple times in different files whereas -r doesn't. But it should probably handle both to build the directed graph.

@twslade

twslade commented Jan 30, 2020

It would be great if you or anyone in this thread could build up an example repo along the lines described:

@rarkins see https://github.com/twslade/renovatebot-pip-tools-example

In the directory structure you listed above, there is a base.txt which is usually not necessary since you can create the *.txt with a command like:

pip-compile --output-file local.txt base.in local.in

@craigds

craigds commented Feb 3, 2020

In the directory structure you listed above, there is a base.txt which is usually not necessary since you can create the *.txt with a command like:
pip-compile --output-file local.txt base.in local.in

I suppose... The problem with that is:

  1. that doesn't allow the bot to easily figure out what the dependencies are. I guess you'd have to parse the comments at the top of the .txt files to figure out how it was invoked.
  2. It's possible to override that comment though, using CUSTOM_COMPILE_COMMAND=update-dependencies in the environment. We have our own wrapper script for dev use so we use this to make sure people do the right thing locally.
  3. Including .in files directly means you're not necessarily freezing the same versions between the various environments. If the index changed between multiple invocations of pip-compile, you'd get different versions. So we deliberately included base.txt rather than base.in as mentioned in the article I linked above

@craigds

craigds commented Feb 3, 2020

But I guess it'd be great if renovatebot could handle both styles :)

@rarkins
Collaborator

rarkins commented Feb 3, 2020

Thanks, I was thinking the same thing. If the ordering of execution matters then it's essential that there is a standardized way for the bot to be able to extract and determine that. The -r and -c options within files seemed to solve that nicely. Is there any reason why people who want this plus Renovate couldn't move to that approach?

@karfau
Contributor

karfau commented Aug 28, 2020

I think this issue should be labeled with python, right?
(So that people can find it when coming from https://docs.renovatebot.com/python/#future-work )

@rarkins rarkins added the priority-3-medium label and removed the priority-4-low label Aug 28, 2020
@rarkins
Collaborator

rarkins commented Aug 28, 2020

Hmm, we took out language labels for now to resize how many labels we had in the repo. Need to think whether to reintroduce them or remove that doc link

@craigds

craigds commented Sep 20, 2020

Thanks, I was thinking the same thing. If the ordering of execution matters then it's essential that there is a standardized way for the bot to be able to extract and determine that. The -r and -c options within files seemed to solve that nicely. Is there any reason why people who want this plus Renovate couldn't move to that approach?

no reason, no. We used a list file just because it was trivial for us to implement, but renovate can be smarter than that.

rarkins added this to Done in Renovate on Jun 18

not sure how to interpret this; this ticket definitely seems not-done :)

@rarkins rarkins added the status:requirements label Jan 12, 2021
@tata9001

pip-compile is more and more popular in the current python world.

We are eager for this in renovate.

@rarkins
Collaborator

rarkins commented May 24, 2021

Renovate has evolved a bit since this was originally requested, so I wanted to recap.

  1. I think by default we can have fileMatch look for requirements/*.in or requirements.in. Anything wider would introduce a lot of false positives
  2. Users can easily customize this by manually configuring fileMatch
  3. We could have our existing pip_requirements manager check whether a matching requirements.in file exists and, if so, skip extracting the requirements.txt. This will reduce the number of "wrong" pip_requirements hits out of the box
  4. For pip_compile, we'll treat the *.txt file as an "artifact" of the corresponding *.in file
  5. Initially at least, we'd assume a 1:1 mapping between file.in and file.txt and only extract files with matching name.txt
  6. We use -c and -r to determine the order of files
  7. If files/dependencies need to be upgraded together then it's up to the user to apply grouping themselves

@MrNaif2018

Hello @rarkins! We also want pip-compile support in renovate to finally move from dependabot.
Just a few additional notes from our somewhat more advanced workflow, and a few questions about the future implementation, to ensure it works well from the beginning:

For pip_compile, we'll treat the *.txt file as an "artifact" of the corresponding *.in file

What if the input files are also named .txt, will it be possible to have both input and output files to be of .txt extension, or should we change that in our project?

Initially at least, we'd assume a 1:1 mapping between file.in and file.txt and only extract files with matching name.txt

Maybe some parameter like output_directory could be added, or some regex to transform input files to output files.
And also, Python dependencies differ by version, so generating output files with a different Python version may break the app or the dependencies, or cause other unexpected effects. I guess it should be made possible to specify which Python version to use.
Also, pip-compile has the --generate-hashes option; maybe it should be possible to pass custom flags to the compile command.
As a complex example, we use pip-compile to generate deterministic requirements for our build images.
We have input files in requirements folder, and the corresponding deterministic files are generated in the requirements/deterministic directory. We specify exact python version series (3.7 for now), and change it once in a while when upgrading the deployment docker images. It is for now generated by this script to avoid the issues with different python versions I mentioned above.

Our workflow is pretty complex, but I wanted to give an additional example for the implementation to be better. Let me know if I can help with moving this forward, I am not so good in JS/TS but can assist with Python-related questions. Thank you

@rarkins
Collaborator

rarkins commented Jun 9, 2021

What if the input files are also named .txt, will it be possible to have both input and output files to be of .txt extension, or should we change that in our project?

How/where is the relationship between input.txt and output.txt defined within the repo? e.g. are these today completely arbitrary and you put them into proprietary build scripts, or is there a convention for how/where to define the mapping?

Update: Seems like you have a proprietary build script which is linked to.

I guess it should be made possible to specify which python version to use.

This is already done in Renovate e.g. for Poetry and Pipenv. Ideally there would be a convention within the repository for defining the required Python version, but failing that we do have the ability to configure it.

Also pip-compile has option --generate-hashes, maybe it should be allowed to provide custom flags to the compile command

Ideally this is also specified within the repository. However wouldn't it be possible to use the logic "if the existing output file has hashes, then use --generate-hashes when generating the updated output file"?

Our workflow is pretty complex, but I wanted to give an additional example for the implementation to be better. Let me know if I can help with moving this forward, I am not so good in JS/TS but can assist with Python-related questions.

The biggest barrier to starting this is the large number of edge cases and advanced uses listed in this issue. This discourages anyone from starting implementation because it gives the impression "don't even try this unless you've got a few weeks to think through all the cases listed here". If someone can define what a minimum viable implementation looks like then it would help overcome that barrier. I think if we had a base implementation which satisfied 80%+ of use cases then it would also make it easier to break down the remaining 10-20% of use cases into more "bite sized" chunks of functionality which can be implemented one by one.

By the way, this also illustrates the downside of package managers which are high on flexibility and/or low on convention: everyone does things in different ways, needs custom shell scripts, etc.

@craigds

craigds commented Jun 24, 2021

It seems to me that pip-compile doesn't supply enough information to infer the relationships between the in and txt files, at least not with CUSTOM_COMPILE_COMMAND being used. The options seem to be:

a. support {x}.in --> {x}.txt only
b. Parse the comment in the .txt file to determine how pip-compile was invoked last time and invoke it similarly this time. Don't support CUSTOM_COMPILE_COMMAND usage
c. Parse the -r and -c options in the .in files to build up a graph. If no .in file is present, just use setup.py [0]
d. (a) but also provide config settings which allow a custom map between in and txt

e. (d), but stick it in the repo as a basic separate file (not in renovate-specific config)

I wonder if (e) might be of interest as a pull request to pip-tools. Then it might be more widely useful and also reduce the pip-compile specific code in renovate. I've commented at jazzband/pip-tools#1435 (comment) to that effect.

[0] (I haven't seen this in the wild, not sure of details here. Could we just require an .in file always?)

@rarkins
Collaborator

rarkins commented Jun 25, 2021

Thank you for the analysis. I'm inexperienced with the majority of package manager tools we support - because we support so many - but have built up some rules of thumb and instincts over time as to what I think is best.

Nearly always it's best for package manager settings to be explicit and committed in the repo. For example with npm, it's better to have settings in .npmrc than flags in a Makefile or package.json script commands.

Complicating Renovate config with manager-specific settings which are duplicated from elsewhere is also undesirable and can lead to a lack of single source of truth.

Final rule of thumb is that some managers allow a lot of flexibility and such managers always have a subset of users who insist on their right to complicated flexibility without acknowledging that it makes automation tools difficult to impossible. In such cases we need to at least find a middle ground.

Here it seems that the embedded pip-compile command in the output file would work very well for us to reverse engineer the input(s), but it's blocked from working when the env variable is in use. Still, I'd like that to be our next implementation approach and work from there.

@ssbarnea

Keep in mind that even setup.py is considered a valid source and that the real input file may be another file.

It would be delusional to believe that renovate can parse these reliably.

For example I use pip-compile to produce constraint.txt files, not real reqs (more limited syntax).

As I stated previously, the only sustainable way to do this is to use input file patterns that trigger a user-configured command which updates a set of (output) files. That logic applies not only to pip-compile but to other deps too. Yes, running generic commands can raise some security concerns, but in the end I am sure most of the tools can easily be hacked to inject deliberate commands anyway. As long as these run in isolated environments (containers) it should not be an issue. Using tox, make, or even a bash script is not uncommon.

I am glad I found this bug before starting to effectively use renovate in production.

@rarkins rarkins added the help wanted label Jun 25, 2021
@rarkins
Collaborator

rarkins commented Jun 25, 2021

The future of dependency management is explicit, declarative configuration files - ideally static. Systems so complex that people feel the need to warn of their complexity or lack of parseability will be marginalized over time by other tools which separate out the static declaration of dependencies from other functionality which requires scripting and interpretation. PEP621 is a good sign for Python in this regard.

I'm confident we can have a good solution for pip-compile in the meantime but it will likely require a certain level of convention and not satisfy every edge case or hardliner who's not willing to accept any changes towards improved declarability in-repo.

@terencehonles

I definitely agree with your sentiments about .npmrc and having pip-tools have configurations live in a file sounds good when someone has a chance to work on it.

To be clear, the reason I want the header is that my environment is packaged in a container. So updating the requirements file is not sufficient. Obviously if push comes to shove I can just let people know X is how you update the file(s), but you still need to run Y in order to rebuild your container to pull in the requirements you just adjusted. Aside from updating the container everything is simple pip commands so if that's what ends up showing in the file that's not that big of a deal.

@karfau
Contributor

karfau commented Jun 26, 2021

@rarkins Is there already some documentation regarding the current level of support?
I expected it to be listed under https://docs.renovatebot.com/python/ but that's not the case.
(I would like to spread/share the good news, but I don't want to point people to this issue that is now marked as open again...)

@rarkins
Collaborator

rarkins commented Jun 26, 2021

@karfau
Contributor

karfau commented Jun 27, 2021

Ah, thank you.

PS: It still looks as if the Python page could need an update (maybe it could just link to the supported managers).

@MrNaif2018

MrNaif2018 commented Jun 29, 2021

Hi @rarkins! pip-tools 6.2.0 has been released; it includes deterministic ordering of packages (to match pip freeze output) and writes the Python version to the header comment. So it now looks like this:

#
# This file is autogenerated by pip-compile with python 3.7
# To update, run:
#
#    pip-compile --allow-unsafe --generate-hashes --output-file=requirements/deterministic/web.txt requirements/web.txt
#

It should be good enough to parse, and to extract the Python version to use.
About the implementation: is someone working on it, or should I try?
I didn't fully get which idea was decided on: my idea with input/output conversion and partial reuse of existing managers (easy to implement, no need to rely on the pip-tools data format, but less "magic" involved and more configuration needed), or your idea of parsing the pip-compile output (more "magic", should work out of the box in most cases, but relies on a format that may change and could break), or some other idea?
My idea once again was approximately:

- pip-compile manager settings could be left the same, just with that ability to configure the input/output mapping in a better way; for now it is fine.
- When enabled, fileMatch matches the package files (.in files), and the output files are derived via a transformation regex or similar (this should be extensively configurable; I'm thinking of sed-like syntax).
- The output files are registered with the pip_compile manager, with Renovate running pip-compile -o output_file input_file --upgrade-package somepackage==someversion.
- Note that registering output files would not be based on the existing pip_requirements manager: no constraints to be matched at all, just a transformation and then applying the pip_compile manager. The input files will be picked up by the pip_requirements manager (it should probably extend the list of already matched files with the pip-compile input ones) and processed just like normal Python requirements files (so files with no version constraints at all are skipped ONLY for pip_requirements operations).
- The only thing is, some flag could be added somewhere, so that when pip_requirements has finished updating an input file that is also a pip-compile input, it would then run pip-compile -o output-file input-file to regenerate the output file based on the new strict constraints.
- Additionally, as pointed out above, I think it should be possible to extend the pip-compile args completely, like passing an array of args.
- Optionally, Renovate could detect some things like --generate-hashes by parsing the files; otherwise the user could add it manually.

@rarkins
Collaborator

rarkins commented Jul 27, 2021

Hi, could you describe more about:

deterministic ordering of packages (to match pip-freeze ones)

Is it contained within the # pip-compile ...... line or something else?

I'd prefer that we avoid unnecessary or duplicated config in Renovate if we can. Hence I was hoping that the fileMatch patterns can be pointed at the output files and then the # pip-compile ... lines can be used to locate the input files, without the need for config. Otherwise I think it's hard for us to capture/duplicate the full pip-compile command and ordering in config, in addition to it being undesirable. Is there anything which would stop this from working?
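Recovering the inputs from the # pip-compile line could look like this sketch (the regexes are assumptions fitted to the pip-tools 6.2.0 header format quoted earlier in the thread):

```python
# Sketch: parse pip-compile's autogenerated header to recover the Python
# version, the full command, and the input files (positional arguments).
import re

def parse_header(text: str) -> dict:
    version = re.search(r"pip-compile with python (\d+\.\d+)", text)
    command = re.search(r"^#\s+(pip-compile .+)$", text, re.MULTILINE)
    inputs = []
    if command:
        # Positional (non-flag) arguments are the input files. This assumes
        # flags use the --flag=value form, as the 6.2.0 header does.
        inputs = [a for a in command.group(1).split()[1:] if not a.startswith("-")]
    return {
        "python": version.group(1) if version else None,
        "command": command.group(1) if command else None,
        "inputs": inputs,
    }
```

As noted above, this breaks when CUSTOM_COMPILE_COMMAND has replaced the header command, so it would only cover the conventional case.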

@MrNaif2018

Hi!

deterministic ordering of packages (to match pip-freeze ones)

I mean the pip-compile output itself is sorted the way pip freeze would output it (a case-insensitive sort on package names, as far as I know).
The ordering of packages of course doesn't need to be configured. What I mean is, let's say there is the following output file:

a==1.0.1
b==1.0.0

then if we upgrade b to 2.0.0, without that fix it would output:

b==2.0.0
a==1.0.1

With this change it will leave the order of dependencies in the output files the same.

@MrNaif2018

Update on this: I plan to work on making this work better in the following weeks (sorry, no exact time range; if someone can pick it up and implement it, I will help during review and testing).
I want to implement it this way; let me know if I'm missing some edge cases:

- I would add a new config option to the pip-compile manager: something like outputPattern. It could be a string in sed-like syntax, i.e. s/in/txt, or a dictionary with two keys, searchPattern and replacePattern, doing the same thing.
- The file pattern should probably be left unconfigured, or configured to find .in files by default (as recommended by the pip-compile docs, but not always used this way). Maybe some logic could be used to hint to the user that we support pip-compile, but I guess searching for the pip-compile header across all files would exhaust API limits faster, so documenting it is fine.
- So the manager finds files by fileMatch just like the others, but it combines two managers in one. The input files found by fileMatch are registered with the pip_requirements manager, with a special hook to call pip-compile each time an input file changes.
- Using the sed expression from config (which can still default to the .in to .txt replacement), we get the output files. So if we need to update a dependency we run:
  pip-compile input-file -o output-file --upgrade-package packagename==x.y.z
  If we do a lockfile refresh, we call:
  pip-compile input-file -o output-file --upgrade

That way we also don't need to change the existing pip_requirements behaviour much: files without any version constraints can be left unregistered with the pip_requirements manager (as it makes no sense to recompile requirements unless we do a lockfile refresh).
But I also think some manager param like extraArgs could be added, either as a string or a list of arguments (not sure about shlex/shell splitting in Node.js), to support arguments often passed to pip-compile.
Or at least we could search the output files by regex for the --hash argument or others, and then enable hash generation (this is also one of the most useful options for security).

Wasn't there a use case where pip-compile can take multiple inputs?

Yes there is, but I think it can be solved without complications.
We can test the replacement regex against all matched files and group input files by output file. With multiple input files we still have one output file, usually uniting all dependencies from all the input files into one. So we take all input files matched by fileMatch, apply their transformation patterns, and get the output files.
In the case of, say, a regex which just changes .in to .txt, for N input files we would have N output groups with one input file each. If people have multiple input files, then their output file pattern is predictable and often constant (i.e. dev.in, test.in and prod.in compiled into requirements.txt); we would then have one output group with all N input files matching, or multiple output groups if the pattern is more involved.
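The grouping step could be sketched like this (outputPattern and the sed-like syntax are part of the proposal above, not an existing Renovate option; the naive split would break on patterns containing escaped slashes):

```python
# Sketch: apply a sed-like s/search/replace/ pattern to each input file
# and group inputs that map to the same output file.
import re
from collections import defaultdict

def apply_pattern(pattern: str, name: str) -> str:
    """Apply a pattern like s/\\.in$/.txt/ to a file name."""
    _, search, replace, *_ = pattern.split("/")
    return re.sub(search, replace, name)

def group_by_output(pattern: str, inputs: list[str]) -> dict[str, list[str]]:
    """Group input files by the output file their pattern produces."""
    groups: dict[str, list[str]] = defaultdict(list)
    for f in inputs:
        groups[apply_pattern(pattern, f)].append(f)
    return dict(groups)
```

A 1:1 pattern yields one group per input, while a pattern that collapses every .in file to requirements.txt yields a single group containing all inputs, matching the two cases described above.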

@m1n9o

m1n9o commented Nov 17, 2021

Any progress on this?

@paveldedik

Maybe someone finds this useful, but the workaround for now we did in our company:

renovate.json

"packageRules": [
        {
            "matchManagers": ["pip_requirements"],
            "postUpgradeTasks": {
                "commands": [
                    "cd $(dirname {{{packageFile}}}) && pip-compile --upgrade-package={{{depName}}}=={{{newVersion}}} $(basename {{{packageFile}}} .txt).in"
                ],
                "fileFilters": ["**/requirements*.txt"]
            }
        }
    ]

config.js

module.exports = {
  allowPostUpgradeCommandTemplating: true,
  allowedPostUpgradeCommands: ['^pip-compile', '^cd'],
}

@m1n9o

m1n9o commented May 9, 2022

Maybe someone finds this useful, but the workaround for now we did in our company:

renovate.json

"packageRules": [
        {
            "matchManagers": ["pip_requirements"],
            "postUpgradeTasks": {
                "commands": [
                    "cd $(dirname {{{packageFile}}}) && pip-compile --upgrade-package={{{depName}}}=={{{newVersion}}} $(basename {{{packageFile}}} .txt).in"
                ],
                "fileFilters": ["**/requirements*.txt"]
            }
        }
    ]

config.js

module.exports = {
  allowPostUpgradeCommandTemplating: true,
  allowedPostUpgradeCommands: ['^pip-compile', '^cd'],
}

Good one. I am using Docker instead, since the version of Python on Renovate is 3.10.
The pain point is that I have to use root privileges to run Renovate for Docker-in-Docker.

@viceice
Member

viceice commented May 9, 2022

Good one, I am using docker instead, since the version of Python on renovate is 3.10. The pain point is that I have to use root privilege to run renovate for docker-in-docker.

You don't need to run renovate as root for DinD, our helm chart works fine without root
https://github.com/renovatebot/helm-charts/tree/main/charts/renovate

@MaxWinterstein

MaxWinterstein commented Sep 8, 2022

Maybe someone finds this useful, but the workaround for now we did in our company:

renovate.json

"packageRules": [
        {
            "matchManagers": ["pip_requirements"],
            "postUpgradeTasks": {
                "commands": [
                    "cd $(dirname {{{packageFile}}}) && pip-compile --upgrade-package={{{depName}}}=={{{newVersion}}} $(basename {{{packageFile}}} .txt).in"
                ],
                "fileFilters": ["**/requirements*.txt"]
            }
        }
    ]

config.js

module.exports = {
  allowPostUpgradeCommandTemplating: true,
  allowedPostUpgradeCommands: ['^pip-compile', '^cd'],
}

This is in fact a nice approach, but it is sadly limited at some point, e.g. when you need to pin a requirement to a specific version.

We use pip-compile-multi and multiple .in files. At some point we pinned versions with known issues or when major refactoring would be needed.

Using the pip_requirements manager does not respect those pinned restrictions at all, as it just reads the requirements.txt files.

As we deal with a lot of dependencies we tend to bulk-upgrade packages, so instead of passing each update individually to pip-compile we can speed things up considerably by running it only once at the end:

                "commands": [
                    "pip install pip-compile-multi && cd backend && bash update_requirements_txt.sh {{#each upgrades}}--upgrade-package={{{depName}}}=={{{newVersion}}} {{/each}}"
                ],

(We use some wrapper script around pip-compile-multi but usage is more or less the same.)

Package upgrades passed to pip-compile-multi are processed even if they are not allowed by a pinned version in the .in file. We just filter them out inside our wrapper script with some grep.

This leads to pull requests that contain package upgrades that either are not allowed (and therefore not in the real commit) or maybe (did not verify yet) produce incompatible package constellations, as they might not be resolved against the pinned versions.

So for the moment, I am pretty stuck.

@HonkingGoose HonkingGoose added the status:requirements label and removed the status:in-progress label Jul 19, 2023
@rarkins rarkins removed the type:feature, help wanted, priority-3-medium and status:requirements labels Oct 1, 2023
@renovatebot renovatebot locked and limited conversation to collaborators Oct 1, 2023
@rarkins rarkins converted this issue into discussion #24725 Oct 1, 2023
