How to Use Different Optimizers with NeuralForecast Models #852

Closed
Amirh63 opened this issue Jan 8, 2024 · 13 comments
@Amirh63

Amirh63 commented Jan 8, 2024

Description

Greetings Gentlemen,

I am currently working with the NeuralForecast library and specifically interested in training models using the Adam optimizer. I've gone through the documentation but haven't found clear guidance on how to integrate a custom optimizer, like Adam, into the training process.

Could you please provide some insights or examples on how to set up the Adam optimizer for training NeuralForecast models? I'm particularly interested in any necessary configurations or modifications needed within the library's framework to achieve this.

Additionally, if there are any best practices or recommendations for using custom optimizers with NeuralForecast, that information would be greatly appreciated.

Thank you for your time and assistance.

@Amirh63 Amirh63 changed the title [<Library component: Models|Core|etc...>] How to Use Adam Optimizer with NeuralForecast Models How to Use Adam Optimizer with NeuralForecast Models Jan 8, 2024
@candalfigomoro
Contributor

Adam is the optimizer that is already being used

@Amirh63
Author

Amirh63 commented Jan 8, 2024

How do I use a different optimizer other than Adam?

@Amirh63 Amirh63 changed the title How to Use Adam Optimizer with NeuralForecast Models How to Use Different Optimizers with NeuralForecast Models Jan 8, 2024
@cchallu
Contributor

cchallu commented Jan 12, 2024

Hi @Amirh63. There is currently no option to change the optimizer through the hyperparameters. We fixed the optimizer to Adam, given that all papers proposing these models use it. But this would be a great addition, so we will add it to our future improvements. In the meantime, you can change it manually by cloning the repo and editing the following:

At line 181 of _base_windows.py:

def configure_optimizers(self):
    optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
    scheduler = {
        "scheduler": torch.optim.lr_scheduler.StepLR(
            optimizer=optimizer, step_size=self.lr_decay_steps, gamma=0.5
        ),
        "frequency": 1,
        "interval": "step",
    }
    return {"optimizer": optimizer, "lr_scheduler": scheduler}

@cchallu
Contributor

cchallu commented Feb 14, 2024

@quest-bot stash 200

quest-bot bot added the ⚔️ Quest label (Tracks quest-bot quests) Feb 14, 2024

quest-bot bot commented Feb 14, 2024

New Quest!

A new Quest has been launched in @Nixtla’s repo.
Merge a PR that solves this issue to loot the Quest and earn your reward.


Loot of 200 USD has been stashed in this issue to reward the solver!

🗡 Comment @quest-bot embark to check-in for this Quest and start solving the issue. Other solvers will be notified!

⚔️ When you submit a PR, comment @quest-bot loot #852 to link your PR to this Quest.

Questions? Check out the docs.

@cchallu
Contributor

cchallu commented Feb 14, 2024

Solution:

  • Add an optimizer hyperparameter to all models. The default should be None, and it receives a custom torch.optim object (see the usage sketch after this list).
  • If the user passes an optimizer, forward it to the configure_optimizers function (see above) and use it instead of the default. Repeat for all base classes.
  • Add a test in core.ipynb checking that when the user passes a different optimizer, the results are different.
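
A rough sketch of what this could look like from the user's side once the hyperparameter exists (the optimizer argument below is the proposed addition, not the current API; the model, its other arguments, and the DataFrame name are only illustrative):

import torch
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS

# Hypothetical usage of the proposed `optimizer` hyperparameter:
# pass a custom torch.optim optimizer; None keeps the current Adam default.
model = NHITS(
    h=12,
    input_size=24,
    max_steps=100,
    optimizer=torch.optim.AdamW,  # proposed argument, assumed here
)
nf = NeuralForecast(models=[model], freq="M")
# nf.fit(df=Y_df)  # Y_df: long-format DataFrame with unique_id, ds, y columns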

@helenehanyu

Hi @cchallu, can you make the scheduler an option too?

@JQGoh
Contributor

JQGoh commented Feb 25, 2024

@quest-bot embark


quest-bot bot commented Feb 25, 2024

@JQGoh has embarked on their Quest. 🗡

  • @JQGoh has been on GitHub since 2014.
  • They have merged 0 public PRs in that time.
  • Their swords are blessed with Python and Shell magic ✨
  • They haven't contributed to this repo before.

Questions? Check out the docs.


quest-bot bot commented Feb 25, 2024

🧚 @JQGoh has submitted PR #901 and is claiming the loot.

Keep up the pace, or you'll be left in the shadows.

Questions? Check out the docs.

@JQGoh
Contributor

JQGoh commented Mar 1, 2024

Hi @cchallu, if you agree that the scheduler could also be made configurable and that the implementation can follow #901, I could create a separate issue and try to implement it. Let me know what you think.
cc: @helenehanyu
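
For reference, a rough sketch of how configure_optimizers might honor such an option, mirroring the optimizer change (the lr_scheduler and lr_scheduler_kwargs attributes here are hypothetical names, not the library's API):

def configure_optimizers(self):
    optimizer = torch.optim.Adam(self.parameters(), lr=self.learning_rate)
    # Hypothetical: use a user-supplied scheduler class if one was provided,
    # otherwise keep the current StepLR default.
    if getattr(self, "lr_scheduler", None) is not None:
        lr_scheduler = self.lr_scheduler(optimizer, **self.lr_scheduler_kwargs)
    else:
        lr_scheduler = torch.optim.lr_scheduler.StepLR(
            optimizer=optimizer, step_size=self.lr_decay_steps, gamma=0.5
        )
    scheduler = {"scheduler": lr_scheduler, "frequency": 1, "interval": "step"}
    return {"optimizer": optimizer, "lr_scheduler": scheduler}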

@cchallu
Contributor

cchallu commented Mar 8, 2024

@quest-bot reward @JQGoh

@cchallu cchallu closed this as completed Mar 8, 2024

quest-bot bot commented Mar 8, 2024

Quest solved!

Congratulations! Your efforts have paid off. A PR that solves this Quest has been merged.


@JQGoh, you have been victorious in Quest #852 🗡

💰 To claim your $200 reward follow instructions here

🧚 Thanks to all bold adventurers for attempting this quest! Sign-up to Quine.sh to access more exciting Quests like this one ⚔️

This Quest is now closed ✨

Questions? Check out the docs.
