

Error with using Pytorch Lr Scheduler #133

Closed
tanulsingh opened this issue Jun 6, 2020 · 4 comments
Labels
Abhishek-eBook: Label your issue with this and try to win a free copy of Abhishek's eBook

Comments

@tanulsingh

I am trying to use the ReduceLROnPlateau LR scheduler with TabNetRegressor and I am getting the following error:
`step() missing 1 required positional argument: 'metrics'`

I can't find any argument to pass the metrics in, even after going through the TabNet code. Help would be appreciated.
Thanks in advance.
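The error comes from the scheduler interface itself: `ReduceLROnPlateau.step()` requires the monitored metric, while epoch-based schedulers take no arguments. A minimal sketch (the loop shape below is an assumption for illustration, not pytorch_tabnet's actual code):

```python
# Sketch of why ReduceLROnPlateau breaks while epoch-based schedulers work.
# (Hypothetical stand-in classes, not the real torch.optim.lr_scheduler ones.)

class StepLRLike:
    """Epoch-based scheduler: step() takes no arguments."""
    def step(self):
        pass

class ReduceLROnPlateauLike:
    """Metric-based scheduler: step() requires the monitored metric."""
    def step(self, metrics):
        pass

def end_of_epoch(scheduler):
    # If the training loop calls step() with no arguments, any scheduler
    # whose step() needs a metric raises a TypeError.
    scheduler.step()

end_of_epoch(StepLRLike())  # fine

try:
    end_of_epoch(ReduceLROnPlateauLike())
    error = None
except TypeError as exc:
    error = str(exc)

print(error)  # ...missing 1 required positional argument: 'metrics'
```

This matches the message in the traceback: the library never passes a metric to `step()`, so metric-driven schedulers cannot work without changes to the training loop.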

@Optimox
Collaborator

Optimox commented Jun 6, 2020

Hello @tanulsingh,

Thanks for opening this issue.
At the moment the only LR schedulers you can use are those that don't depend on the current epoch's results (so ReduceLROnPlateau won't work). You can use any scheduler that decays in a pre-defined manner.

I know this is something we should improve; it should come with the callbacks mechanism, I think. See #123.

So for now I suggest switching to torch.optim.lr_scheduler.StepLR, torch.optim.lr_scheduler.CosineAnnealingLR, or any other scheduler that does not need information about current metrics.
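To see what "decays in a pre-defined manner" means, StepLR's learning rate is a pure function of the epoch counter: `lr(epoch) = initial_lr * gamma ** (epoch // step_size)`. A small sketch (the hyperparameter values below are illustrative, not from the thread):

```python
# StepLR's schedule depends only on the epoch counter, never on metrics:
#   lr(epoch) = initial_lr * gamma ** (epoch // step_size)
# Values here are illustrative, not from the thread.

def step_lr(initial_lr, gamma, step_size, epoch):
    """Learning rate a StepLR-style schedule would use at a given epoch."""
    return initial_lr * gamma ** (epoch // step_size)

lrs = [step_lr(0.02, gamma=0.1, step_size=10, epoch=e) for e in (0, 9, 10, 25)]
print([round(lr, 6) for lr in lrs])  # [0.02, 0.02, 0.002, 0.0002]
```

Because the schedule needs nothing but the epoch index, calling `step()` with no arguments at the end of each epoch is enough, which is exactly why these schedulers work with the current training loop and ReduceLROnPlateau does not.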

@tanulsingh
Author

Hey @Optimox, I believe I can make those LR schedulers work by inheriting from the TabNet model base class and building on that. BTW, I was able to successfully use a custom loss with TabNet. Next I plan on using LR schedulers and TPUs with TabNet; if I am successful, I will post the results here. I was also able to achieve 0.1620 on the public LB in the TReNDS competition, which is state of the art, without any fine-tuning. I will be releasing a public kernel today, and I plan to fine-tune and squeeze every ounce of juice out of TabNet on TReNDS, as I believe it has a lot of potential. I also wanted to nominate myself for Abhishek's book; I don't know how to do that, though.

@Optimox
Collaborator

Optimox commented Jun 7, 2020

@tanulsingh there is no fundamental reason you can't use a custom loss or LR schedulers; it's just that the implementation does not give easy access to this yet.

If you spend some time changing the code to enable some functionality, don't hesitate to create a PR so that I can review and potentially merge it.

Very happy to see that you are able to achieve good results with TabNet.

To be eligible for Abhishek's eBook, the only thing you need to do is open an issue and put the Abhishek-eBook label on it. I'll add the label to this issue, so you don't have anything to do.

Optimox added the Abhishek-eBook label Jun 7, 2020
@neomatrix369

> Now I plan on using LR schedulers and TPUs with TabNet.

@tanulsingh I don't think you can use TabNet with TPUs yet, unless you have already found a way to do it; see my discussion in #129 ;)

Optimox closed this as completed Jul 3, 2020