Error when some layers are frozen #46

Closed

arunpatala opened this issue Jul 10, 2017 · 5 comments

@arunpatala

When fine-tuning a pretrained network, I freeze some layers' parameters by setting requires_grad=False. The optimizer still tries to optimize all parameters, which causes:
ValueError: optimizing a parameter that doesn't require gradients. Is there a way to pass only the parameters that have requires_grad set to True?
Thanks
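
For reference, the standard workaround in plain PyTorch is to filter the parameter iterator before handing it to the optimizer. A minimal sketch (the pretrained ResNet and its model.fc head are assumptions for illustration):

    import torch
    import torchvision.models as models

    # Hypothetical setup: a pretrained backbone, frozen, with a trainable head.
    model = models.resnet18(pretrained=True)
    for param in model.parameters():
        param.requires_grad = False      # freeze everything
    for param in model.fc.parameters():
        param.requires_grad = True       # keep the classifier head trainable

    # Hand the optimizer only the trainable parameters, so it never sees frozen ones.
    trainable = filter(lambda p: p.requires_grad, model.parameters())
    optimizer = torch.optim.SGD(trainable, lr=0.01, momentum=0.9)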

@arunpatala
Author

I set the arg optimizer_parameters=model.fc.parameters() in trainer.compile,
but I guess it is being passed through to the optimizer initialization:

    line 133, in set_optimizer
    TypeError: __init__() got an unexpected keyword argument 'parameters'
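
The same TypeError can be reproduced with any torch.optim optimizer, since unrecognized keyword arguments go straight into its constructor; a minimal illustration:

    import torch

    layer = torch.nn.Linear(2, 1)
    # Optimizer constructors accept only their own hyperparameters (lr, momentum, ...),
    # so an unknown keyword such as 'parameters' fails immediately:
    opt = torch.optim.SGD(layer.parameters(), lr=0.1, parameters=layer.parameters())
    # TypeError: __init__() got an unexpected keyword argument 'parameters'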

@arunpatala
Author

arunpatala commented Jul 10, 2017

I added kwargs.pop('parameters', None) after line 128 and it works as expected.

@ncullen93
Member

Oh I see. Wait, can you tell me where in this function you added that line? (I'd appreciate it):

    def set_optimizer(self, optimizer, **kwargs):
        if type(optimizer) is type or isinstance(optimizer, str):
            if 'parameters' in kwargs:
                parameters = kwargs['parameters']
            else:
                parameters = self.model.parameters()

            optimizer = _validate_optimizer_input(optimizer)
            self._optimizer = optimizer(parameters, **kwargs)
        else:
            self._optimizer = optimizer
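
The problem is visible in that first branch: 'parameters' is read out of kwargs but never removed, so optimizer(parameters, **kwargs) forwards it a second time, into the optimizer's constructor, which raises the TypeError quoted earlier. Popping the key out of kwargs (e.g. parameters = kwargs.pop('parameters')) is the fix arunpatala describes.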

@recastrodiaz
Contributor

I guess there is a better way, but the following works for me:

    # 1. Set some layers of the model to be non-trainable (param.requires_grad = False)
    # 2. Monkey-patch the instance's parameters() method to return trainable weights only
    import types
    import torch.nn as nn

    def parameters(self):
        return filter(lambda p: p.requires_grad, nn.Module.parameters(self))

    model.parameters = types.MethodType(parameters, model)

    # 3. Profit
    trainer.fit_loader(train_loader, val_loader=val_loader, nb_epoch=1)
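
One trade-off with the monkey patch: it changes what model.parameters() returns for every later caller (e.g. code that saves weights or applies weight decay across all parameters), not just for the optimizer. A less invasive alternative, sketched here as a hypothetical helper, filters without touching the model:

    def trainable_parameters(module):
        # Yield only parameters with requires_grad=True; the module itself is untouched.
        return (p for p in module.parameters() if p.requires_grad)

    # e.g. optimizer = torch.optim.Adam(trainable_parameters(model), lr=1e-3)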

@arunpatala
Author

def set_optimizer(self, optimizer, **kwargs):
    if type(optimizer) is type or isinstance(optimizer, str):
        if 'parameters' in kwargs:
            parameters = kwargs['parameters']
            kwargs.pop('parameters', None)  # drop the key so it isn't forwarded to the optimizer constructor
        else:
            parameters = self.model.parameters()
        optimizer = _validate_optimizer_input(optimizer)
        self._optimizer = optimizer(parameters, **kwargs)
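
With that change in place, an end-to-end call might look like the following sketch (compile and fit_loader appear earlier in this thread; the exact compile signature, and the assumption that it forwards a 'parameters' keyword through to set_optimizer, are guesses for illustration):

    # Freeze everything except the classifier head, then train only the head.
    for param in model.parameters():
        param.requires_grad = False
    for param in model.fc.parameters():
        param.requires_grad = True

    trainer.compile(optimizer='adam',
                    loss='nll_loss',                    # assumed compile signature
                    parameters=model.fc.parameters())   # assumed to reach set_optimizer
    trainer.fit_loader(train_loader, val_loader=val_loader, nb_epoch=1)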
