Switch to hatchling & pip-tools #22
Conversation
Codecov Report
@@           Coverage Diff           @@
##              main      #22   +/- ##
=========================================
  Coverage   100.00%  100.00%
=========================================
  Files            1        1
  Lines           61       61
  Branches         5        5
=========================================
  Hits            61       61
A lot of changes for 1 PR 😉
I don't really think locking requirements for libraries is necessarily a good idea. Especially with
Tangential, but since we're discussing CI setups / packaging / releases: I've found my favorite release strategy is to require every PR to make a valid version bump and cut a release for every commit to main. So instead of the flow being
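The "require a version bump per PR" gate described above can be sketched as a small CI step. Everything below is hypothetical (the `VERSION` file layout, the temp-dir stand-ins for the two branches); in a real workflow the main-branch version would come from something like `git show origin/main:VERSION`:

```shell
# Hypothetical CI gate: fail unless the PR bumped the version.
# Temp files stand in for the two branches so the sketch is self-contained.
set -eu
workdir=$(mktemp -d)
echo "0.1.0" > "$workdir/main_version"  # version currently on main
echo "0.2.0" > "$workdir/pr_version"    # version proposed by the PR

if [ "$(cat "$workdir/pr_version")" = "$(cat "$workdir/main_version")" ]; then
  echo "error: this PR does not bump the version" >&2
  exit 1
fi
echo "version bump ok"
```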
Sorry about that, I thought it would be okay since none of the changes relate to the library code itself.
We're not locking requirements for the library, only testing and linting requirements. I'm so bored of CI suddenly failing due to "black vs click" compatibility or "flake8 vs pycodestyle" that I've started doing this. There's not actually much more complexity. @Zac-HD do you have an opinion on this?
Can do if you really want; sounds to me like it would produce too much noise and also make it hard to track down what code was in what release. But I can if you want. Again, probably worth asking @Zac-HD?
Yeah, it's frustrating, but I'd rather that surface in CI than for devs. But that's just me; I don't have as many high-profile projects as you 😅. Not a biggie though, keeping the lockfiles is fine for me.
I wasn't necessarily proposing it for this project, just general discussion. I'm not sure who it would produce too much noise for; I guess just folks that subscribed to GitHub release notifications. Tracking down which version of the code was running is actually easier because there is a 1:1 relationship between library version and git commit. It's also easier to do "last version that didn't break me / have a bug" sorta searches with git bisect.
requirements/linting.txt
Outdated
# This file is autogenerated by pip-compile with python 3.10
# To update, run:
#
#    pip-compile --output-file=requirements/linting.txt requirements/linting.in
@samuelcolvin FYI, I recommend using the --resolver=backtracking feature of pip-compile, because it's much more robust at solving complex dependency graphs (boto3 and tensorflow send their regards) and because it's now the default resolver in pip (so using it keeps the results of pip-compile more in line with what you'd get from pip install).
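For reference, the suggested invocation would look something like this (a sketch, assuming a pip-tools version recent enough to support the flag; the linting filenames are taken from the snippet above):

```shell
# Regenerate the lock file with pip's backtracking resolver.
pip-compile --resolver=backtracking \
    --output-file=requirements/linting.txt requirements/linting.in
```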
I'm strongly in favour of pinning all the CI dependencies. I also like the "autorelease every PR" style - we do that for Hypothesis and it's great. No strong opinion on how to lock, though I personally use pip-compile everywhere because it's dead simple and has been long-term stable. For smaller projects I usually just set them to 'try twine upload --skip-existing' on every PR; the result is that you release whenever you bump the version number but don't have to release every PR.
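That per-merge step might look roughly like this (a sketch, assuming `build` and `twine` are installed and PyPI credentials are configured in CI):

```shell
# Build the distribution, then attempt an upload. --skip-existing makes
# the upload a no-op when that version is already on PyPI, so this is
# safe to run on every commit: it only releases after a version bump.
python -m build
twine upload --skip-existing dist/*
```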
LGTM on a quick skim.
If we're using Black and Isort, want to try shed? It's my zero-config combination of those, pyupgrade, and a few smaller things.
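If we went that route, usage would be deliberately minimal (assuming shed is installed, e.g. via `pip install shed`):

```shell
# shed is zero-config: run with no arguments, it finds the enclosing git
# repo and rewrites the first-party source files in place.
shed
```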
Same with shed. I'll add
Done, ready AFAIK.
poetry was annoying me, also its lock file was out of date.
Changes here:
- `pip-compile` (from pip-tools) to lock linting and testing requirements
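The locking flow this describes: top-level requirements live in `.in` files, and `pip-compile` pins the full dependency tree into checked-in `.txt` files. A sketch, assuming pip-tools is installed; `requirements/linting.txt` matches the file in this PR, while the testing filenames are my guess at the parallel layout:

```shell
# Pin linting and testing requirements (re-run to refresh the pins).
pip-compile --output-file=requirements/linting.txt requirements/linting.in
pip-compile --output-file=requirements/testing.txt requirements/testing.in

# CI then installs the exact pinned set:
pip install -r requirements/linting.txt -r requirements/testing.txt
```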