
Split CI pipelines #23385

Merged · 3 commits merged into go-gitea:main on Mar 10, 2023
Conversation

jolheiser (Member) commented on Mar 9, 2023:

  • This PR attempts to split our various DB tests into separate pipelines (a rough sketch of the new shape follows this list).
  • It splits up some of the extra feature-related tests rather than having most of them in the MySQL test.
  • It disables the race detector for some of the pipelines as well, as it can cause slower runs and is mostly redundant when the pipelines just swap DBs.
  • It builds without SQLite support for any of the non-SQLite pipelines.
  • It moves the e2e test to SQLite rather than PG (partially because I moved the minio tests to PG and that mucked up the test config, and partially because it avoids another running service).
  • It splits the go mod download task in the Makefile from the tool installation, as the tools are only needed in the compliance pipeline. (Arguably even some of the tools aren't needed there, but that could be a follow-up PR.)
  • SQLite is now the only arm64 pipeline, moving PG back to amd64, which can leverage the autoscaler.
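
For a sense of the shape, one of the split database pipelines might look roughly like the sketch below. This is a hedged illustration, not the actual .drone.yml from this PR; the pipeline, image, make target, and environment variable names are assumptions.

kind: pipeline
type: docker
name: test-pgsql            # one pipeline per database instead of one large test job

platform:
  os: linux
  arch: amd64               # PG back on amd64 so it can leverage the autoscaler

services:
  - name: pgsql
    image: postgres:latest  # image tag is a placeholder
    environment:
      POSTGRES_DB: test
      POSTGRES_PASSWORD: postgres

steps:
  - name: test-pgsql
    image: golang:1.20      # placeholder Go image
    commands:
      - go mod download     # module download only; tool installation stays in the compliance pipeline
      - make backend        # built without the SQLite build tags in non-SQLite pipelines
      - make test-pgsql     # hypothetical target name for the PG integration tests
    environment:
      RACE_ENABLED: "false" # hypothetical variable; the race detector is disabled in this pipeline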

Should resolve #22010. One thing that wasn't changed here but is mentioned in that issue: unit tests need to run in the same pipeline as an integration test in order to form a complete coverage report (at least as far as I could tell), so for now the unit tests remain in a pipeline with a DB integration test.

Please let me know if I've inadvertently changed something that was set up that way on purpose.


I will say it's sometimes hard to pin down the average time, since a pipeline can sit waiting for a runner for X minutes and that adds X minutes to the total, but overall this does seem to be faster on average.

Signed-off-by: jolheiser <john.olheiser@gmail.com>
@jolheiser jolheiser added the topic/build (PR changes how Gitea is built, i.e. regarding Docker or the Makefile) and topic/gitea-actions (related to the actions of Gitea) labels on Mar 9, 2023.

delvh (Member) left a comment:
Apart from my comments below, the CI seems to agree with you that you improved its performance.

Review thread on .drone.yml, at this hunk:

@@ -1607,7 +1815,6 @@ platform:

steps:
  - name: manifest-rootless
    pull: always

Member:
?

Member (Author):
This was set twice in the same step, see two lines below.

Member:
pull: always should only be set for the first occurrence of an image in the whole file. @jolheiser can you check that this is still true?
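
A minimal sketch of that convention (the step names and image tag here are made up for illustration):

steps:
  - name: deps
    image: golang:1.20
    pull: always            # first occurrence of this image in the file pulls a fresh copy
    commands:
      - go mod download

  - name: build
    image: golang:1.20      # later occurrences reuse the already-pulled image; no pull: always
    commands:
      - make build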

Member (Author):
I can double-check, sure.
Since some of these can now run in parallel, it might be a bit off. I'll see what comes up.

Member:
Yeah, I guess for the sake of performance we can skip the double pull in parallel runs. The CI runs often enough to keep images up to date. Even better would be an outside job on the server that updates images once a day or so.

silverwind (Member) commented Mar 9, 2023:

I think this whole image pulling business is something that drone or each server could do itself. Even if you make a cron pipeline that does the pulling, it's not guaranteed to execute on all daemons in a distributed setup. What I do on my drone machines is this in a system cron once per day:

docker images --format "{{.Repository}}:{{.Tag}}" | grep -v '<none>' | sort | xargs -L1 docker pull
docker system prune -f

jolheiser (Member, Author) commented Mar 9, 2023:

I'm not sure if this current PR would hit dockerhub rate limits as-is, but in general I think image pulling should be cleaned up.

I'm just not sure if it should be done before or after this PR.

Another possibility is to more fully qualify the tag used, so that we pin the version more specifically.
A pro would be needing to pull less frequently; a con is that we would need to manually update the CI more often to bump image versions, and I'm not sure all images would play nice with that.
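
Purely as a hypothetical illustration of that trade-off (the version numbers are made up):

services:
  - name: pgsql
    image: postgres:15.2-alpine   # fully pinned tag: rarely needs re-pulling, but must be bumped by hand
    # image: postgres:latest      # floating tag: stays current only via pull: always or periodic pulls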

Member:
I don't think exact tags are really maintainable without automation that also continuously updates the versions, as some images publish very frequently.

Member:
We use a combination of a static runner and an autoscaler that will create/destroy runners as needed.

Member:
> We use a combination of a static runner and an autoscaler that will create/destroy runners as needed.

Ok, sounds like we are not really in control of the Docker image storage in such a setup, right?

@GiteaBot GiteaBot added the lgtm/need 2 label (this PR needs two approvals by maintainers to be considered for merging) on Mar 9, 2023.

delvh (Member) left a comment:
The other images will probably be outdated too, but I don't think that should be in the scope of this PR (including my earlier comment regarding the postgres version 😅).

@GiteaBot GiteaBot added the lgtm/need 1 label (this PR needs approval from one additional maintainer to be merged) and removed the lgtm/need 2 label on Mar 9, 2023.
@GiteaBot GiteaBot added the lgtm/done label (this PR has enough approvals to get merged; there are no important open reservations anymore) and removed the lgtm/need 1 label on Mar 10, 2023.
@techknowlogick techknowlogick merged commit f92e0a4 into go-gitea:main Mar 10, 2023
zjjhot added a commit to zjjhot/gitea that referenced this pull request Mar 13, 2023
* giteaofficial/main:
  Scoped label display and documentation tweaks (go-gitea#23430)
  Deduplicate template code for label selection menu (go-gitea#23431)
  Show edit/close/delete button on organization wide repositories (go-gitea#23388)
  Sync the class change of Edit Column Button to JS code (go-gitea#23400)
  Preserve file size when creating attachments (go-gitea#23406)
  [skip ci] Updated translations via Crowdin
  Use buildkit for docker builds (go-gitea#23415)
  Refactor branch/tag selector dropdown (first step) (go-gitea#23394)
  [skip ci] Updated translations via Crowdin
  Hide target selector if tag exists when creating new release (go-gitea#23171)
  Parse external request id from request headers, and print it in access log (go-gitea#22906)
  Add missing tabs to org projects page (go-gitea#22705)
  Add user webhooks (go-gitea#21563)
  Handle OpenID discovery URL errors a little nicer when creating/editing sources (go-gitea#23397)
  Split CI pipelines (go-gitea#23385)
@jolheiser jolheiser deleted the pipeline-split branch March 16, 2023 21:47
@delvh delvh added the skip-changelog label (this PR is irrelevant for the (next) changelog, for example bug fixes for unreleased features) on Apr 3, 2023.
@delvh delvh added this to the 1.20.0 milestone Apr 3, 2023
@go-gitea go-gitea locked and limited conversation to collaborators May 3, 2023
@lunny lunny removed the topic/gitea-actions label (related to the actions of Gitea) on May 17, 2023.
Labels
lgtm/done, skip-changelog, topic/build
Development

Successfully merging this pull request may close these issues.

Split up testing pipelines
6 participants