
Seems that some tests for FS7 are failing because they hit the wall time? #78

Closed
Remi-Gau opened this issue Sep 1, 2023 · 14 comments · Fixed by #79

Comments

@Remi-Gau (Contributor) commented Sep 1, 2023

https://app.circleci.com/pipelines/github/bids-apps/freesurfer/111/workflows/00c61725-eefe-4eb6-a748-c4b3586ce2fe/jobs/445

@Shotgunosine (Contributor)

It looks like persisting the image is taking too long. We might need to see if we can streamline the Dockerfile.

@PeerHerholz (Collaborator)

It looks like the image builds, but the workspace for the tests runs out. CircleCI suggests using parallelism for the tests. I have never done that; does anyone have an idea how to implement it?
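
A minimal sketch of what CircleCI test parallelism could look like here, assuming pytest-style test files split across containers (the job name, machine image, and test paths are assumptions, not this repository's actual config):

  jobs:
    test:
      machine:
        image: ubuntu-2204:current
      parallelism: 4  # run this job on 4 containers at once
      steps:
        - checkout
        - run:
            name: Run the split test files
            command: |
              # split the test files across containers, by timing data when available
              TESTFILES=$(circleci tests glob "tests/test_*.py" | circleci tests split --split-by=timings)
              python -m pytest $TESTFILES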

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

Other option: we do not persist anything to the workspace and only ever get things from the cache in the follow-up steps?
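
A rough sketch of that idea, using CircleCI's cache instead of the workspace (the cache key and tarball path are assumptions):

  # in the build job, instead of persist_to_workspace:
  - save_cache:
      key: docker-image-{{ .Revision }}
      paths:
        - /tmp/docker_image.tar
  # in the test jobs, instead of attach_workspace:
  - restore_cache:
      keys:
        - docker-image-{{ .Revision }}
  - run: docker load -i /tmp/docker_image.tar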

@Shotgunosine (Contributor)

Sorry, it's building the image that's taking 42 minutes. I'll try a large or x-large machine to see if that speeds things up. If that doesn't work, we could download FreeSurfer in a previous step and persist it, then copy it into the container instead of downloading it. That might speed up the build, but it would require some editing of the Dockerfiles.
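
A possible sketch of that pre-download step as CircleCI config, assuming the FreeSurfer tarball URL lives in an environment variable (FS_DOWNLOAD_URL, the cache key, and the file name are all assumptions). The Dockerfile would then COPY the tarball from the build context instead of downloading it during the build:

  - restore_cache:
      keys:
        - freesurfer-tarball-v1
  - run:
      name: Download FreeSurfer only if not already cached
      command: |
        if [ ! -f freesurfer.tar.gz ]; then
          curl -sSL -o freesurfer.tar.gz "$FS_DOWNLOAD_URL"
        fi
  - save_cache:
      key: freesurfer-tarball-v1
      paths:
        - freesurfer.tar.gz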

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

We may want to check whether switching to GitHub Actions should be considered.

@Shotgunosine (Contributor) commented Sep 7, 2023

It's not clear to me why the FreeSurfer 7 build is now taking half an hour. It was taking ~16 minutes on previous commits, and I don't see any changes to the Dockerfile in the meantime. @Remi-Gau switching over to GitHub Actions would be one option, but I don't have time to do an overhaul like that right now.

@PeerHerholz, I don't think the length of time it's taking to build is something that parallelism is likely to solve.

It looks like the large machine might just barely sneak through. If that fails I'll try an x-large machine and see if it goes any faster, but I think we might need to cache the freesurfer tar if we want to get a more reliable build time.
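
For reference, bumping the executor size is a one-line change in the CircleCI config; a minimal sketch, assuming a machine executor (the job name and image are assumptions):

  jobs:
    build:
      machine:
        image: ubuntu-2204:current
      resource_class: large  # or "xlarge" for more CPU and RAM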

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

Something is indeed strange.

Two CircleCI runs on the same commit:

@Shotgunosine (Contributor) commented Sep 7, 2023 via email

@Shotgunosine (Contributor)

I've fixed the build timeouts for now by skipping the caching step in #77.

The test timeouts are going to be a bit trickier and should be saved for a separate PR.

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

@Shotgunosine, if you are not going to work on it right now, I may open a PR to make a push to main trigger a push of an "unstable"-tagged version to Docker Hub.

See an example of another app here:
https://hub.docker.com/r/bids/antscorticalthickness/tags

This allows users to try the bleeding edge of the app.

@Shotgunosine (Contributor)

Does this not already do that?

: "Pushing to DockerHub ${user_name}/${repo_name}:unstable"
docker tag "${user_name}/${repo_name}" "${user_name}/${repo_name}:${FS_VERSION}-unstable"
docker push "${user_name}/${repo_name}:${FS_VERSION}-unstable"

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

Ah yes, sorry, I missed that.

But it seems that there is no deploy planned for this push to master:

https://app.circleci.com/pipelines/github/bids-apps/freesurfer/118/workflows/3bda51f6-b0c5-4a55-b054-0fa773c0d33e

@Remi-Gau (Contributor, Author) commented Sep 7, 2023

I suspect it is because we are ignoring deploys on all branches:

ignore: /.*/

and we only deploy on tags.
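
For context, that kind of filter in the CircleCI workflow config typically looks like the sketch below; adding the default branch to the filter would be one way to also push an unstable tag on merges to master (the workflow and job names here are assumptions):

  workflows:
    build-test-deploy:
      jobs:
        - deploy:
            filters:
              branches:
                ignore: /.*/   # never deploy from a branch push
              tags:
                only: /.*/     # deploy only when a tag is pushed
            # alternative: also allow the default branch
            # filters:
            #   branches:
            #     only: master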

@Shotgunosine (Contributor) commented Sep 7, 2023 via email
