Cache #1829
Conversation
The two other big slowdowns: (1) switching to pre-commit means that its environment needs to be initialized
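One common way to cut that initialization cost is to cache pre-commit's environment directory, keyed on the config file contents. A hedged sketch of such a workflow step (step name and action version are illustrative, not this project's actual workflow):

```yaml
# Hypothetical step: reuse pre-commit's hook environments across CI runs;
# the key changes only when .pre-commit-config.yaml changes.
- name: Cache pre-commit environments
  uses: actions/cache@v3
  with:
    path: ~/.cache/pre-commit
    key: pre-commit-${{ runner.os }}-${{ hashFiles('.pre-commit-config.yaml') }}
```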
Removing xdist did not make a big difference, because I checked something like 20 runs and we had at most 2 cores. But of course once I get rid of the spontaneous errors I will activate it again. I made another change that speeds things up: nullmodem! The tests (apart from a few) no longer use the network but communicate through a couple of class variables. Pre-commit is actually expensive in CI, but your idea with the env variable could be the solution.
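The null-modem idea can be sketched roughly like this: instead of opening sockets, each test endpoint hands written frames straight to its peer through shared class-level state. All names here are illustrative, not the actual pymodbus implementation:

```python
class NullModem:
    """Loopback transport sketch: data written at one end appears at the other.

    No network is involved; endpoints find each other through a
    class-level registry (the "couple of class variables").
    """

    # class-level registry connecting the endpoints by name
    endpoints: dict[str, "NullModem"] = {}

    def __init__(self, name: str, peer: str) -> None:
        self.peer = peer
        self.received: list[bytes] = []
        NullModem.endpoints[name] = self

    def send(self, data: bytes) -> None:
        # deliver directly into the peer's receive buffer
        NullModem.endpoints[self.peer].received.append(data)


client = NullModem("client", peer="server")
server = NullModem("server", peer="client")
client.send(b"\x01\x03\x00\x00\x00\x0a")
assert server.received == [b"\x01\x03\x00\x00\x00\x0a"]
```

Because delivery is a plain method call, such tests avoid port allocation, timeouts, and the flakiness of real sockets.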
o_0 that's great!
What if we don't use it in CI? We can use something like
That's another possibility; however, I have no experience using that bot... it needs to read the pre-commit YAML and then somehow update pyproject.toml. I hope that soon pre-commit will be able to use TOML so we can get rid of this double configuration.
It's a great bot; if you'd like I can set it up. Here's an example of updating both files at the same time.
Unfortunately, the maintainer is an arrogant jerk that will never do this: pre-commit/pre-commit#1165. :(
Also, we should be able to drop it. In fact, it could be done today, if you are OK with some small formatting changes and using an alpha.
I am all in favor of moving towards ruff, so please do. If I remember right, you were also closing in on pylint? Regarding the bot, I will have to read a bit more; I am by nature sceptical of apps that run outside GitHub, but it does look like a good candidate. An alternative is pre-commit.ci, which seems to do a similar thing. I think GitHub also has a bot that updates dependencies; I remember we use (or used) that in Home Assistant.
OK, I will test the alpha. It has ~80 minor formatting changes compared to
Many of pylint's rules can be done by
Yeah, it's
(force-pushed from a2f5ee6 to a76383e)
@alexrudd2 I think you are trying to give me a heart attack!! I was sitting here testing caching; it worked fine until I suddenly had 20 caches instead of 15... guess what, "someone" merged a PR. Just joking, I think I have solved most of the caching issues, except for pre-commit.
I propose dropping pre-commit from the CI and running the tools manually. The versions can be kept in sync between
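Running the tools directly in CI could look roughly like the following workflow step; the tool names and the `[dev]` extra are illustrative assumptions, not this project's actual setup:

```yaml
# Hypothetical step: invoke the linters directly, with their versions
# pinned in one place (e.g. pyproject.toml) instead of
# .pre-commit-config.yaml.
- name: Lint
  run: |
    pip install ".[dev]"
    ruff check .
```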
I will give it a try, but it probably will be the end of next week... I am off in a couple of hours, and will only be online 1-2 hours each day until next Friday.
@alexrudd2 just thinking out loud! pre-commit really does not bring a lot to this project; another option is to:
And then suddenly we have all dependencies back in pyproject.toml. What is the disadvantage I have overlooked? I mean for this project, with its "huge" developer population.
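Moving the tool versions into pyproject.toml might look like this sketch; the group name and version pins are illustrative assumptions:

```toml
# Hypothetical pyproject.toml fragment: development tools declared as an
# optional dependency group, so everything lives in one file and a single
# "pip install .[dev]" sets up a contributor's environment.
[project.optional-dependencies]
dev = [
    "ruff",
    "pylint",
    "pytest",
]
```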
pre-commit has minor value to me because it's easy to transfer between projects and install/uninstall. I tried using bare git hooks before and they were a bit fussy, but I'm OK using them if you can get them to work. |
(force-pushed from 5d67052 to 542f861)
(force-pushed from 5363b1d to 6ccdd01)
This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.
(force-pushed from 20f428d to 2111822)
With Python in place, we can make our own cache key, and thereby only have:
3 (Windows, Ubuntu, MacOS) x 5 (Python 3.8 - 3.12) = 15 caches.
We need to avoid making new caches for each pull request; that is currently why some jobs are slow to start and why the wall-clock time is too high.
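Such a key can be built from the matrix values alone, so every pull request reuses the same 15 caches instead of creating new ones. A hedged sketch (path, action version, and keyed files are assumptions):

```yaml
# Hypothetical cache step: keyed only on OS, Python version, and the
# dependency spec -- not on the branch or PR -- so the cache count stays
# at 3 OSes x 5 Python versions = 15.
- name: Cache pip downloads
  uses: actions/cache@v3
  with:
    path: ~/.cache/pip
    key: pip-${{ matrix.os }}-${{ matrix.python-version }}-${{ hashFiles('pyproject.toml') }}
```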
Current timing is at 15 minutes 2 seconds.