
Discussion: Alternatives for ASHAscheduler in Ray #30

Open
chrstnkgn opened this issue Dec 21, 2022 · 2 comments

chrstnkgn (Collaborator) commented Dec 21, 2022

What

  • Find alternatives for the ASHA scheduler
    • Possible integrations that I can think of now are:
    1. Apply early stopping with a fixed patience (see the sketch after this list)
    2. Find another scheduler that Ray provides that better fits our needs
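
For option 1, a minimal sketch of what patience-based early stopping could look like using Ray Tune's documented `Stopper` interface. The `PatienceStopper` name, the `val_loss` metric key, and the default patience are illustrative assumptions, not decided settings:

```python
from ray.tune import Stopper


class PatienceStopper(Stopper):
    """Stop a trial once its metric has not improved for `patience` results.

    Hypothetical sketch; metric name and defaults are placeholders.
    """

    def __init__(self, metric="val_loss", mode="min", patience=5):
        self.metric = metric
        self.mode = mode
        self.patience = patience
        self._best = {}   # trial_id -> best metric value seen so far
        self._stale = {}  # trial_id -> results since the last improvement

    def __call__(self, trial_id, result):
        value = result[self.metric]
        best = self._best.get(trial_id)
        improved = best is None or (
            value < best if self.mode == "min" else value > best
        )
        if improved:
            self._best[trial_id] = value
            self._stale[trial_id] = 0
        else:
            self._stale[trial_id] += 1
        return self._stale[trial_id] >= self.patience

    def stop_all(self):
        return False  # only stop individual trials, never the whole experiment
```

A stopper like this would be passed as `tune.run(trainable, stop=PatienceStopper(...))`, so it works independently of whichever trial scheduler we end up choosing.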

Why

  • While developing and enhancing this project, we have discussed the scheduler that terminates training runs several times, and the main issue was that the ASHA scheduler does not fit our purpose: it is an algorithm designed to work well in a massively parallel (multi-processing) environment.
  • If we have firmly decided to stick with Ray Tune, then we need to look for algorithms that better fulfill our needs and terminate training at the appropriate time.

How

  • According to the Ray docs (https://docs.ray.io/en/latest/tune/api_docs/schedulers.html), there are several scheduler options we can consider (one candidate is sketched after this list), but if none of them seems appropriate, it might be better to just go with early stopping.
  • I don't think this is something I can decide alone, so please feel free to discuss and share your opinions here! 🙏
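
For reference, one of the simpler schedulers in those docs is the median stopping rule. A minimal sketch of how it could be wired in; the metric key, `grace_period`, and `num_samples` values are placeholder assumptions:

```python
from ray import tune
from ray.tune.schedulers import MedianStoppingRule


def trainable(config):
    # Placeholder training loop; a real one would train a model here.
    for step in range(20):
        tune.report(val_loss=1.0 / (step + 1))


# Prune a trial once its running-average result falls below the median
# of the other trials at the same point in training.
scheduler = MedianStoppingRule(
    time_attr="training_iteration",
    grace_period=5,           # results to wait before a trial may be pruned
    min_samples_required=3,   # trials needed before the median is trusted
)

analysis = tune.run(
    trainable,
    metric="val_loss",
    mode="min",
    scheduler=scheduler,
    num_samples=8,
)
```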
@chrstnkgn chrstnkgn added Features New feature or request Discussion Extra attention is needed labels Dec 21, 2022
@chrstnkgn chrstnkgn changed the title Enhancements: Alternatives for ASHAscheduler in Ray Discussion: Alternatives for ASHAscheduler in Ray Dec 21, 2022
kdg1993 (Collaborator) commented Dec 22, 2022

Thanks for the clear statement of this important issue that we need to solve, @chrstnkgn!
I also take this issue seriously, as it directly affects running time.

After trying several schedulers other than ASHA and reading the docs, I concluded that it is very difficult to find an optimal grace period and reduction factor.

From my personal point of view, understandability and simplicity are sometimes more important than complex SOTA methods.

If we cannot automatically provide a persuasive grace period and reduction factor for ASHA or the other complex schedulers, then simple early stopping is much better in this case.
In other words, since simple early stopping and the complex schedulers are alike in requiring the user to find optimal settings, the simpler the better.

Additionally, depending on the choices for the grace period and reduction factor, ASHA cannot guarantee a significant reduction in running time.
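
For context, these are the two knobs in question on Ray's `ASHAScheduler`; the concrete values below are arbitrary examples, not recommendations:

```python
from ray.tune.schedulers import ASHAScheduler

scheduler = ASHAScheduler(
    time_attr="training_iteration",
    max_t=100,            # longest a trial may run, in iterations
    grace_period=10,      # minimum iterations before a trial can be halted
    reduction_factor=4,   # keep roughly the top 1/4 of trials at each rung
)
```

Too small a grace period risks killing slow starters; too large a one erodes the time savings, which is exactly the tradeoff that makes these hard to set.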

@seoulsky-field seoulsky-field added the RayTune About RayTune (Hyperparameter Tuning) label Dec 30, 2022
@seoulsky-field seoulsky-field added this to the Dataset: CheXpert milestone Dec 30, 2022
chrstnkgn (Collaborator, Author) commented

Thank you for your valuable opinion @kdg1993! 🌟💯

I agree that simpler settings can sometimes be better in terms of both performance and user understanding. It also does no harm to add a simple early-stopping path for any of us, or for users who want the feature, since it is a fundamental way to reduce computational cost. I will try implementing early stopping ASAP, as we are about to conduct our first experiment and need some kind of process terminator. We can start with early stopping and apply other algorithms later if needed. Thank you again for your opinion! It helped me a lot to think about the issue from different directions.
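
As a starting point, Ray Tune already ships a plateau-based stopper, so a first version might not need custom code at all. A minimal sketch; the metric key and window sizes are assumptions to be tuned:

```python
from ray import tune
from ray.tune.stopper import TrialPlateauStopper


def trainable(config):
    # Placeholder training loop; a real one would train a model here.
    for step in range(100):
        tune.report(val_loss=1.0 / (step + 1))


# Stop a trial once its recent val_loss values have flattened out, i.e.
# the std of the last `num_results` values drops below `std`.
stopper = TrialPlateauStopper(
    metric="val_loss",
    std=0.001,        # "no improvement" threshold on the metric's std
    num_results=5,    # window of most recent results to inspect
    grace_period=5,   # minimum results before stopping is considered
)

analysis = tune.run(
    trainable,
    stop=stopper,
    metric="val_loss",
    mode="min",
)
```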
