
migrate to ray tune #35

Closed
aniketmaurya opened this issue Aug 28, 2021 · 0 comments · Fixed by #36
aniketmaurya commented Aug 28, 2021

Why Ray over Optuna?

Initially, I started with Optuna for HPO because of its Pythonic APIs and powerful search algorithms. Later I realized that Gradsflow would have to implement its own logic for distributed training, which Ray already provides out of the box.

Ray provides a simple, universal API for building distributed applications, and it already supports multiple hyperparameter tuning libraries, including Optuna.

  • With Ray, we get distributed training and hyperparameter search out of the box.
  • We can use search algorithms not only from Ray itself but also from Optuna and other libraries.
  • Easy process and GPU management. Ray even supports fractional-GPU allocation per trial.

NOTE: The user API will remain the same; you won't notice any difference apart from Ray's cool distributed-training features. 🔥

@aniketmaurya aniketmaurya self-assigned this Aug 28, 2021
@aniketmaurya aniketmaurya mentioned this issue Aug 29, 2021
@aniketmaurya aniketmaurya changed the title from "integrate ray tune" to "migrate to ray tune" Aug 29, 2021
@aniketmaurya aniketmaurya added this to In progress in Kanban Aug 29, 2021
@aniketmaurya aniketmaurya added this to the 0.0.3 milestone Aug 29, 2021
Kanban automation moved this from In progress to Done Aug 29, 2021