
Add SOTA optimisers #61

Open
GilesStrong opened this issue Jun 6, 2020 · 7 comments
Labels
enhancement New feature or request good first issue Good for newcomers low priority Not urgent and won't degrade with time

Comments

@GilesStrong
Owner

There was a big kerfuffle in 2019 about some new optimisers: Rectified Adam (RAdam; Liu et al., 2019), Lookahead (Zhang, Lucas, Hinton, & Ba, 2019), and a combination of the two, Ranger (which now also includes Gradient Centralization (Yong, Huang, Hua, & Zhang, 2020)).

Having tried these (except the latest version of Ranger), I've not found much improvement over Adam, but this was only on one dataset. Ranger's reported performance on other datasets looks quite good, though, so it may still be useful.

User-defined optimisers can easily be used in LUMIN by passing the partial optimiser to the `opt_args` argument of `ModelBuilder`, e.g. `opt_args = {'eps':1e-08, 'opt':partial(RAdam)}`. It could still be useful to include these optimisers in LUMIN, so that users can access them without having to copy code into their own projects.
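To make the pattern concrete, here is a minimal, self-contained sketch of how a partial optimiser travels through an `opt_args`-style dict. The `RAdam` stand-in class and the `build_optimiser` helper are illustrative assumptions, not LUMIN's actual internals; in practice `RAdam` would come from an external implementation and `ModelBuilder` would do the construction.

```python
from functools import partial

class RAdam:
    """Stand-in for a real RAdam implementation (e.g. from pytorch-optimizer)."""
    def __init__(self, params, lr=1e-3, eps=1e-8):
        self.params, self.lr, self.eps = list(params), lr, eps

# As in the issue: the partial optimiser plus keyword arguments
# that would be forwarded to it on construction.
opt_args = {'eps': 1e-08, 'opt': partial(RAdam)}

def build_optimiser(opt_args, params):
    """Hypothetical helper mimicking how ModelBuilder might use opt_args."""
    kwargs = {k: v for k, v in opt_args.items() if k != 'opt'}
    return opt_args['opt'](params, **kwargs)

opt = build_optimiser(opt_args, params=[0.0, 1.0])
```

The point of the `partial` is that the user supplies an unconstructed optimiser; the builder later calls it with the model parameters plus the remaining keyword arguments.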

These git repos include Apache 2.0-licensed implementations of RAdam and Ranger, so inclusion should be straightforward.

@GilesStrong GilesStrong added enhancement New feature or request good first issue Good for newcomers low priority Not urgent and won't degrade with time labels Jun 6, 2020
@kiryteo
Contributor

kiryteo commented Aug 2, 2021

@GilesStrong Can we try https://github.com/jettify/pytorch-optimizer? It provides both RAdam and Ranger, along with a few more optimisers. Since LUMIN uses torch.optim to load optimisers, adding pytorch-optimizer as an installation requirement and using it in ModelBuilder could be a quick solution.

@GilesStrong
Owner Author

Hi @kiryteo, thanks for looking into this! Since writing this issue, I've tried the new optimisers on various tasks, but so far haven't found them to be significantly beneficial. Based on this, I would be hesitant to include them in LUMIN or to add an extra dependency.
As you say, they can easily be included by the user, so I think for now we can leave things as they are and add a demonstration of how custom optimisers can be used to one of the examples. What do you think?

@kiryteo
Contributor

kiryteo commented Aug 4, 2021

I agree: it would be an additional dependency, and an example of adding a custom optimiser would be useful instead. Are you considering a Jupyter notebook, or simply a section in the README?
Let me know!

@GilesStrong
Owner Author

I think in the "Single_Target_Regression_Di-Higgs_mass_prediction.ipynb" example, Ranger could be used as a custom optimiser, with some text emphasising that a custom optimiser is being used. (Either pip-install pytorch-optimizer within the notebook, or copy the Ranger source code and its licence header into the notebook.)
In the README, perhaps we could have a line stating that custom optimisers can be included without much hassle.

@kiryteo
Contributor

kiryteo commented Aug 6, 2021

Alright, thanks! I'm trying to add this now, but it seems the notebook has an issue with uproot: I get an error when loading the ROOT TTree as a DataFrame:
`df = uproot.open(PATH/'signal.root')['tree'].pandas.df()`

@GilesStrong
Owner Author

Thanks for looking into this. Uproot 4 was released with breaking API changes. If you change the pip-install line in the first code cell to `!pip install lumin uproot~=3.0`, it should install the last 3.x version of uproot and work.

@kiryteo
Contributor

kiryteo commented Aug 10, 2021

Thanks, I'll proceed with this!
