
Universal Multiple Optimizer Wrapper with Layer Assignment #42070

Open
hyang0129 opened this issue Aug 5, 2020 · 0 comments
Labels
comp:keras Keras related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower type:feature Feature requests

hyang0129 commented Aug 5, 2020

I am proposing a universal wrapper that composes a set of distinct optimizers, each assigned to a subset of a model's layers. Each optimizer applies gradients only to the variables of its assigned layers. This enables discriminative layer training, and more generally any training scheme that applies any combination of optimizers, with any hyperparameters, to any combination of layers.

The optimizer wrapper will consume a list of pairs of instantiated optimizers and layers, each referred to as an optimizer spec. For each optimizer spec, it will route the matching gradients and variables to that optimizer's apply_gradients method. This lets the wrapper reuse every optimizer-specific implementation, notably the resource apply methods.
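To make the routing concrete, here is a minimal, framework-agnostic sketch of the partitioning logic described above. It is not the prototype linked below: the `MultiOptimizer` and `DummySGD` names, and the use of plain Python dicts as stand-in variables, are illustrative assumptions; a real implementation would subclass the Keras optimizer base class and delegate to each wrapped optimizer's own apply_gradients.

```python
class DummySGD:
    """Stand-in optimizer (hypothetical): plain SGD on dict-wrapped floats."""
    def __init__(self, lr):
        self.lr = lr

    def apply_gradients(self, grads_and_vars):
        for grad, var in grads_and_vars:
            var["value"] -= self.lr * grad


class MultiOptimizer:
    """Sketch of the proposed wrapper: routes each variable's gradient
    to the optimizer whose spec claims that variable."""
    def __init__(self, optimizer_specs):
        # optimizer_specs: list of (optimizer, list_of_variables) pairs
        self.optimizer_specs = optimizer_specs

    def apply_gradients(self, grads_and_vars):
        # Index gradients by variable identity, then hand each optimizer
        # only the (grad, var) pairs for the variables assigned to it.
        grad_of = {id(var): grad for grad, var in grads_and_vars}
        for optimizer, variables in self.optimizer_specs:
            optimizer.apply_gradients(
                [(grad_of[id(v)], v) for v in variables if id(v) in grad_of]
            )


# Usage: two "layers" trained at different learning rates,
# as in discriminative layer training.
w_early = {"value": 1.0}   # e.g. a pretrained early layer, small lr
w_head = {"value": 1.0}    # e.g. a new head, larger lr
wrapper = MultiOptimizer([
    (DummySGD(lr=0.01), [w_early]),
    (DummySGD(lr=0.1), [w_head]),
])
wrapper.apply_gradients([(1.0, w_early), (1.0, w_head)])
# w_early steps by ~0.01, w_head by ~0.1
```

The key design point is that the wrapper never reimplements an update rule; it only partitions the gradient/variable pairs, so each wrapped optimizer's own machinery (momentum slots, resource apply methods, etc., in the real Keras case) is reused unchanged.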

A prototype of the optimizer wrapper is available at this link.

The prototype works on both TPU and CPU.

I am willing to contribute this code.

This will not change any existing classes. Instead, it will act as a wrapper that allows a model to use multiple optimizers.

Discriminative layer training is most beneficial for fine-tuning pretrained models. Users applying existing transfer-learning techniques will benefit from reduced training time and improved transfer performance.

@hyang0129 hyang0129 added the type:feature Feature requests label Aug 5, 2020
@hyang0129 hyang0129 changed the title Universal Multiple, Layer-wise Optimizer Wrapper Universal Multiple Optimizer Wrapper with Layer Assignment Aug 5, 2020
@Saduf2019 Saduf2019 added the comp:keras Keras related issues label Aug 6, 2020
@Saduf2019 Saduf2019 assigned jvishnuvardhan and unassigned Saduf2019 Aug 6, 2020
@jvishnuvardhan jvishnuvardhan added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Aug 7, 2020