Proposing a means to embed learning rate finder (lr_find) before fitting multivariate models#1263

Closed
carusyte wants to merge 1 commit into Nixtla:main from carusyte:lr_find

Conversation

@carusyte

@carusyte carusyte commented Feb 4, 2025

idea: #1262

@elephaint
Contributor

Thanks for your contribution!

Please follow our contributing guide; we use nbdev, so direct changes to .py files are generally not allowed.

I like the idea, as it seems simple and straightforward. Can you provide a piece of code that showcases the functionality?

@marcopeix
Contributor

The changes needed to implement this are a bit more involved than just this piece of code. I also think it doesn't make sense to include it in trainer_kwargs, since it is technically not a trainer kwarg. I have a working implementation, but it adds another parameter to BaseModel, so we would also need to add it to all models in neuralforecast.

I'll create another PR using this as a starting point.

@marcopeix
Contributor

No need for this PR; we can do:

from pytorch_lightning.callbacks import LearningRateFinder

models = [NHITS(h=12, input_size=24, max_steps=10, learning_rate=1e-3, callbacks=[LearningRateFinder()])]

@marcopeix marcopeix closed this Apr 24, 2025

3 participants