Expose schedulers #80
Conversation
@anton-l - could you take a look here? :-)

The documentation is not available anymore as the PR was closed or merged.
anton-l left a comment
I think it's ok to expose them for now, thanks for the fix! Although I'd like to make the naming less confusing between these and the noise schedulers.
Or maybe just replace them with https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.OneCycleLR.html and others, since they've already caught up with the ones we have in `transformers` and here in terms of flexibility.
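For reference, a minimal sketch of what leaning on PyTorch's built-in LR scheduler could look like; the model, optimizer, and step counts below are placeholders for illustration, not part of this PR:

```python
import torch

# Placeholder model and optimizer purely for illustration.
model = torch.nn.Linear(16, 16)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# OneCycleLR ramps the learning rate up and back down over total_steps,
# covering much of what custom warmup/decay schedules are used for.
lr_scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, total_steps=1_000
)

for step in range(1_000):
    # ... forward/backward pass would go here ...
    optimizer.step()
    lr_scheduler.step()
    optimizer.zero_grad()
```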
Very much agree here with @anton-l, let's please make sure from the very beginning that learning rate schedulers are never confused with noise schedulers.
* Expose schedulers
* Update __init__.py

Co-authored-by: Anton Lozhkov <anton@huggingface.co>
Right now this gives a `ModuleNotFoundError: No module named 'diffusers.optimization'` error. Do we want to expose all schedulers or do we want to force people to use `get_scheduler` to get them?
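To make the two options concrete, here is a rough sketch of the usage patterns being weighed, assuming `get_scheduler` keeps the transformers-style signature (scheduler name, optimizer, warmup and training step counts); the model and step counts are illustrative only:

```python
import torch
from diffusers.optimization import get_scheduler

model = torch.nn.Linear(16, 16)  # placeholder model for illustration
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Option 1: go through the factory helper and select the schedule by name.
lr_scheduler = get_scheduler(
    "cosine",
    optimizer=optimizer,
    num_warmup_steps=100,
    num_training_steps=1_000,
)

# Option 2 (what this PR enables): import a specific schedule function
# directly, e.g. a cosine schedule with warmup, if it is exposed:
# from diffusers.optimization import get_cosine_schedule_with_warmup
# lr_scheduler = get_cosine_schedule_with_warmup(
#     optimizer, num_warmup_steps=100, num_training_steps=1_000
# )
```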