
[ENH] Add optimizer param for neuralforecast models #6235

Merged
13 commits merged into sktime:main on Apr 14, 2024

Conversation

pranavvp16 (Contributor)

As discussed on Discord, I have updated the neuralforecast models to support the `optimizer` parameter. A question for @jmoralez: if I don't pass `optimizer` but do pass `optimizer_kwargs`, will the kwargs override those of the default optimizer, or will they simply be ignored? No error is raised when `optimizer_kwargs` are passed without `optimizer`.
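For readers following along, here is a rough usage sketch of what the new parameters look like from the sktime side. It assumes the "dl" soft dependencies (torch and neuralforecast) are installed; the particular values shown are illustrative, not the PR's defaults.

```python
import torch

from sktime.datasets import load_airline
from sktime.forecasting.neuralforecast import NeuralForecastRNN

y = load_airline()

# pass a torch.optim.Optimizer class plus its keyword arguments through sktime
forecaster = NeuralForecastRNN(
    freq="M",
    max_steps=5,
    optimizer=torch.optim.AdamW,
    optimizer_kwargs={"lr": 1e-3},
)
forecaster.fit(y, fh=[1, 2, 3])
y_pred = forecaster.predict()
```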

jmoralez

Hey, it'll ignore them. We can add a warning for that case.
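A minimal sketch of such a warning, using a hypothetical helper name rather than the actual adapter code:

```python
from warnings import warn


def _warn_ignored_optimizer_kwargs(optimizer, optimizer_kwargs):
    """Warn when optimizer_kwargs would be silently ignored by neuralforecast."""
    if optimizer is None and optimizer_kwargs:
        warn(
            "`optimizer_kwargs` were passed without a custom `optimizer`; "
            "neuralforecast will ignore them and use its default optimizer.",
            UserWarning,
            stacklevel=2,
        )


# example: this call triggers the warning
_warn_ignored_optimizer_kwargs(None, {"lr": 1e-3})
```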

pranavvp16 (Contributor, Author)

@fkiraly the tests are failing due to the addition of a new dependency, torch. neuralforecast has added a feature that lets its models use a custom optimizer, which must be a subclass of `torch.optim.Optimizer`. I assumed that neuralforecast depends on torch and that torch would be installed along with it, but the tests fail with a module-not-found error. Should I add torch to the `python_dependencies` list in the forecaster's tags to make the tests pass?

yarnabrina (Collaborator)

torch is already part of the `dl` dependency set, the same one that contains neuralforecast. The tests are failing now because you are importing it at module level, which requires it to always be present; that is neither the case nor desirable.

To fix this, move the import out of module scope and remove the top-level type hint. Please take a look at how I handled the loss specification.
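For illustration, a sketch of the soft-dependency pattern being described here, with a hypothetical function name; torch is imported inside the function rather than at module level, so sktime still imports cleanly when torch is absent:

```python
def _resolve_optimizer(optimizer):
    """Validate a user-supplied optimizer class, importing torch only when needed."""
    if optimizer is None:
        return None

    import torch  # local import: torch is an optional ("dl") dependency of sktime

    if not (
        isinstance(optimizer, type) and issubclass(optimizer, torch.optim.Optimizer)
    ):
        raise TypeError("`optimizer` must be a subclass of `torch.optim.Optimizer`")
    return optimizer
```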

Also, you need to raise the upper bound on neuralforecast for CI to pass; otherwise I think it will fail. We should not raise the lower bound, though, so you need to handle the case where users pass arguments that only exist in newer versions. Please take a look at the statsforecast adapter, where @arnaujc91 added such adaptations.

pranavvp16 (Contributor, Author)

pranavvp16 commented Mar 30, 2024

So basically we need to handle the model parameters for the latest version as well as for version 1.6.4, right, @yarnabrina? Something like `neuralforecast>=1.6.4,<=1.7.0`?

fkiraly added the labels module:forecasting (forecasting module, incl probabilistic and hierarchical forecasting) and enhancement (adding new functionality) on Mar 30, 2024
yarnabrina (Collaborator) left a comment

Please take a look at #5112 and the related changes. In `_GeneralisedStatsForecastAdapter`, we have a method to check the arguments users provide through sktime and a method to identify the arguments supported by the installed statsforecast version. Only the supported ones are used, and the rest are ignored with a warning. We need to do something similar here as well.
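A rough sketch of that pattern, with hypothetical names and a deliberately simplified version check (the discussion above suggests the optimizer support arrived with the neuralforecast 1.7 line):

```python
from importlib.metadata import version
from warnings import warn


def _filter_supported_kwargs(user_kwargs):
    """Keep only kwargs supported by the installed neuralforecast version."""
    # naive parse of the installed version into a (major, minor) tuple
    installed = tuple(int(p) for p in version("neuralforecast").split(".")[:2])
    # assumed minimum versions per argument; illustrative values only
    minimum_versions = {"optimizer": (1, 7), "optimizer_kwargs": (1, 7)}

    supported = {}
    for name, value in user_kwargs.items():
        required = minimum_versions.get(name, (0, 0))
        if installed < required:
            warn(
                f"`{name}` requires neuralforecast>={required[0]}.{required[1]}; "
                "the installed version does not support it, so it is ignored.",
                UserWarning,
                stacklevel=2,
            )
        else:
            supported[name] = value
    return supported
```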

Furthermore, your current changes are not tested in CI. You need to update pyproject.toml. It currently uses 'neuralforecast<1.7.0,>=1.6.4; python_version < "3.11"', and we should modify it to support the latest release. Assuming neuralforecast follows semantic versioning, we can update it to 'neuralforecast<1.8.0,>=1.6.4; python_version < "3.11"'. The Python version bound remains as it is, since neuralforecast does not officially support Python >3.10 yet.

Please let me know if you have any doubts/questions.


A small comment: you don't need to add a test for every newly added argument. Of course it adds more coverage, but it is not scalable in the long run as the underlying packages support more and more arguments. We trust that they have their own unit tests, so we can rely on those. This is my personal opinion, by the way; maybe @fkiraly or @benHeid can comment on whether they agree. Either way, you do not have to remove your existing code, it's just a comment for future additions.

[4 review threads on sktime/forecasting/neuralforecast.py, resolved]
fkiraly changed the title from "[ENH]Add optimizer param for neuralforecast models" to "[ENH] Add optimizer param for neuralforecast models" on Mar 31, 2024
fkiraly (Collaborator)

fkiraly commented Mar 31, 2024

This is my personal opinion by the way, may be @fkiraly, @benHeid can comment if they agree or not.

In principle yes, although we should make sure that new parameters are covered, so that we notice when they are removed. My method of choice would be via `get_test_params`, but I don't know enough of the details to judge how robust that is for this specific case.
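For concreteness, a sketch of the kind of parameter sets `get_test_params` could return to cover the new argument; the values, and the `_check_soft_dependencies` import path, are assumptions rather than the exact code merged here:

```python
def _example_test_params():
    """Sketch of parameter sets covering the new `optimizer` argument."""
    from sktime.utils.validation._dependencies import _check_soft_dependencies

    params = [{"freq": "M", "max_steps": 4}]

    # add an optimizer-bearing parameter set only if torch is available
    if _check_soft_dependencies("torch", severity="none"):
        import torch

        params.append({"freq": "M", "max_steps": 4, "optimizer": torch.optim.AdamW})
    return params
```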

yarnabrina (Collaborator) left a comment

I made a few comments, please take a look.

I also consider this PR to be dependent on PR #6237 by @geetu040, as that one fixes a bug in the adapter introduced by the new release, and then this PR adds the new enhancements.

[2 review threads on sktime/forecasting/base/adapters/_neuralforecast.py, resolved]
pranavvp16 (Contributor, Author)

Applied the suggested changes and replaced `logging.warn` with the sktime warning util. I'll wait for #6237 to be merged, then fix the merge conflicts and get this merged.

yarnabrina (Collaborator)

@pranavvp16 #6237 is merged now, let's resume this PR once again.

After this is merged as well, I'll resume #6124 .

[6 review threads on sktime/forecasting/base/adapters/_neuralforecast.py and sktime/forecasting/neuralforecast.py, resolved]
yarnabrina (Collaborator)

I see a lot of jobs were cancelled, do you know why? Can you please rebase once from main (if not done already), and trigger the tests again?

fkiraly (Collaborator)

fkiraly commented Apr 11, 2024

I see a lot of jobs were cancelled, do you know why? Can you please rebase once from main (if not done already), and trigger the tests again?

That might have been me, I cancelled the PRs prior to the fix of the failures on main in #6280, and probably forgot to restart this one.

yarnabrina (Collaborator) left a comment

I added a few minor comments; please address those and we can merge this PR today, before tomorrow's scheduled full run on main (12:00 UTC on Sunday).

Other than the comments below, I have no more crucial feedback, except for a few very minor changes that can be done in a separate PR afterwards. Unless @fkiraly or @jmoralez have any other concerns, I'll be happy to merge after the following changes.

[3 review threads on sktime/forecasting/tests/test_neuralforecast.py and sktime/forecasting/base/adapters/_neuralforecast.py, resolved]
yarnabrina merged commit 4018337 into sktime:main on Apr 14, 2024
54 checks passed
toandaominh1997 pushed a commit to toandaominh1997/sktime that referenced this pull request Apr 18, 2024
(The commit message repeats the PR description above. Co-authored-by: Franz Király <f.kiraly@ucl.ac.uk>)