[ENH] Add optimizer param for neuralforecast models #6235
Conversation
Hey, it'll ignore them. We can add a warning for that case.
@fkiraly the tests are failing due to the addition of a new dependency, torch. NeuralForecast has added a new feature to support custom optimizers for neuralforecast models, which have to be subclasses of `torch.optim.Optimizer`. I thought that neuralforecast has a dependency on torch and torch would be installed along the way, but the tests fail with a module-not-found error. Should I add torch to the `python_dependencies` list in the forecaster's tags to pass the tests?
To fix, you should move the import elsewhere and remove the top-level type hint. Please take a look at how I handled loss specification. Also, you must raise the upper bound on the `neuralforecast` version.
So basically we need to handle the model parameters for the latest version as well as older versions.
Please take a look at #5112 and related changes. In `_GeneralisedStatsForecastAdapter`, we have a method to check user-provided arguments through sktime and a method to identify supported arguments based on the installed statsforecast version. Only the supported ones are used, and the rest are ignored with a warning. We need to do a similar thing here as well.
Furthermore, your current changes are not tested in CI. You need to update `pyproject.toml`. Currently it uses `'neuralforecast<1.7.0,>=1.6.4; python_version < "3.11"'`, and we should modify it to support the latest release. Assuming `neuralforecast` follows semantic versioning, we can update it to `'neuralforecast<1.8.0,>=1.6.4; python_version < "3.11"'`. The python version bound will remain as it is, as `neuralforecast` does not officially support >3.10 yet.
Please let me know if you have any doubts/questions.
A small comment: you don't need to add a test for every single new argument. Of course it adds more coverage, but it's not scalable in the long run with more and more arguments supported by underlying packages. We trust they have their own unit tests, so we will rely on that. This is my personal opinion, by the way; maybe @fkiraly or @benHeid can comment on whether they agree. Either way, you do not have to remove your existing code; it's just a comment for future additions.
In principle yes - we should make sure though that new parameters are covered, so we notice when they are removed. My method of choice would be via
Fixed the changes suggested and replaced `logging.warn` with the sktime util. Will wait for #6237 to get merged, fix the merge conflicts, and get this merged.
@pranavvp16 #6237 is merged now, let's resume this PR once again. After this is merged as well, I'll resume #6124.
Force-pushed from b33e822 to fe8147d.
Force-pushed from be3ccd3 to 1e558e9.
I see a lot of jobs were cancelled; do you know why? Can you please rebase once from main (if not done already) and trigger the tests again?
That might have been me; I cancelled the PRs prior to the fix of the failures on main.
I added a few minor comments; please address those and we can merge this PR today, before tomorrow's scheduled full run (UTC 12 on Sunday) on main.
Other than the points below, I have no more crucial feedback, except a few very minor changes. Those can be done in a separate PR afterwards. Unless @fkiraly or @jmoralez have any other concerns, I'll be happy to merge after the following changes.
Force-pushed from 74e3ff8 to 3240d0f.
As discussed on Discord, I have updated the neuralforecast models to support the optimizer parameter. Just a question to @jmoralez: if I don't pass the `optimizer` but do pass the `optimizer_kwargs`, will it override the kwargs of the default optimizer or just ignore the kwargs? It doesn't raise an error when `optimizer_kwargs` are passed without `optimizer`.

---------

Co-authored-by: Franz Király <f.kiraly@ucl.ac.uk>