
Update the pretrained VGG16 or not? #9

Closed
o0t1ng0o opened this issue Sep 29, 2021 · 2 comments

Comments

@o0t1ng0o

Hi @lyndonzheng,
I have a question about the optimization of the pretrained VGG16 when training the Learned SeSim model.
As shown in the following code, the learning rate is multiplied by zero.
I am wondering whether the pretrained VGG16 updates its parameters during training.

```python
self.optimizer_F = torch.optim.Adam([{'params': list(filter(lambda p: p.requires_grad, self.netPre.parameters())), 'lr': self.opt.lr * 0.0},
```

Thank you in advance.

@lyndonzheng
Owner

Hi @o0t1ng0o, we did not update the weights of the pre-trained VGG16. Instead, we only use contrastive learning to select suitable features from the pre-trained VGG model. We did try updating all the weights with contrastive learning, but the performance was unstable under the current design.
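The freezing effect of the zero learning rate can be checked with a minimal sketch (hypothetical layer names, not the repository's code; assumes PyTorch is installed). An Adam parameter group with `lr=0.0` takes a zero-length step, so its weights never change even though gradients flow through it:

```python
import torch

# Stand-ins for the real modules: "frozen" plays the role of the
# pre-trained VGG16 features, "trained" the rest of the model.
frozen = torch.nn.Linear(4, 4)
trained = torch.nn.Linear(4, 4)

# Mirrors the quoted snippet: the frozen group's lr is self.opt.lr * 0.0.
opt = torch.optim.Adam([
    {'params': frozen.parameters(), 'lr': 0.0},
    {'params': trained.parameters(), 'lr': 1e-3},
])

before = frozen.weight.clone()
loss = (frozen(torch.randn(2, 4)) + trained(torch.randn(2, 4))).sum()
loss.backward()
opt.step()

# The lr=0 group still receives gradients but is never updated.
print(torch.equal(before, frozen.weight))
```

Note that gradients are still computed and stored for the frozen group, so this differs from setting `requires_grad=False` only in memory/compute cost, not in the resulting weights.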

@o0t1ng0o
Author

Thank you for your prompt reply.
