Inquiry about increasing the job time limit #16

Closed
yuzc19 opened this issue Feb 13, 2024 · 2 comments

yuzc19 commented Feb 13, 2024

Hi! Thanks for providing the HPC resources!

I would like to know if you can increase the job time limit from 24 to 96 hours, because when I pre-train a model it often takes more than a day to finish the whole experiment. Right now, I have to resume my previous training run periodically, which is really inconvenient. I would really appreciate it if you could increase the SLURM job time limit. Thank you!

[screenshot attached]

koomie (Collaborator) commented Feb 13, 2024

Hi there. Can you expand on your restart process? Are you able to pick back up where the previous training stopped without incident? Is the inconvenience just related to submitting an additional job? To help with that, SLURM supports job dependencies, so you can submit multiple jobs up front and they will be scheduled to run sequentially, one after another.
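
For reference, here is a minimal sketch of chaining submissions with sbatch --dependency. The script name pretrain.sbatch and the assumption that it resumes from the latest checkpoint on startup are placeholders, not something the cluster provides:

```bash
# Chain three 24-hour jobs so each starts only after the previous one
# completes successfully. pretrain.sbatch is a placeholder batch script
# that is assumed to resume from the latest checkpoint when it starts.
jid1=$(sbatch --parsable pretrain.sbatch)
jid2=$(sbatch --parsable --dependency=afterok:$jid1 pretrain.sbatch)
jid3=$(sbatch --parsable --dependency=afterok:$jid2 pretrain.sbatch)

# Later jobs will show a (Dependency) reason until their predecessor finishes.
squeue -u "$USER"
```

With afterok, a follow-up job runs only if the previous one exits successfully; afterany can be used instead if the follow-up should run even when the previous job hits its time limit.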

yuzc19 (Author) commented Feb 13, 2024

Thank you! I think job dependencies will fix the issue for me.

yuzc19 closed this as completed Feb 13, 2024