using transfer learning model on resnet18 #13

Open
xhm1014 opened this issue Feb 4, 2020 · 3 comments

xhm1014 commented Feb 4, 2020

I built a ResNet18 transfer learning model following the example in this tutorial: https://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html

My program raises an error when it reaches:
meta_model.update_params(hyperparameters['lr'], source_params=grads)
The error is: TypeError: cannot assign 'torch.cuda.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)

Does anyone have an idea about this error? Thanks.
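For reference, the TypeError itself can be reproduced with a plain nn layer, independent of the repo: PyTorch refuses to overwrite an attribute registered as an nn.Parameter with an ordinary tensor, which is what update_params ends up doing on layers that have not been converted to the Meta style. A minimal sketch (the layer and numbers here are only illustrative):

```python
import torch
import torch.nn as nn

# A standard layer stores its weight as an nn.Parameter.
layer = nn.Linear(4, 2)

# A meta-style update writes the new weight back as a plain tensor
# (keeping it in the computation graph). On an ordinary nn.Module this
# raises the same TypeError reported above.
new_weight = layer.weight - 0.1 * torch.randn_like(layer.weight)

try:
    layer.weight = new_weight  # plain tensor, not nn.Parameter
except TypeError as e:
    print(e)  # cannot assign 'torch.FloatTensor' as parameter 'weight' ...
```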

Could you also give the exact link for class MetaModule(nn.Module)? I could not find it in Adrien Ecoffet's repositories: https://github.com/AdrienLE

@souxun2015

You need to change the code. You can find it here.

@Yurushia1998

> You need to change the code. You can find it here.

I changed nn.Module to MetaModule in the PreActResNetMeta class, and also the linear layer in that class, according to the instructions, but I still get a similar error. Did you change anything else?

zlzenglu commented Aug 9, 2021

I have encountered the same error, and I have figured out why: every nn module with trainable parameters inside the ResNet layers has to be rewritten in the "Meta" style. Just like MetaConv2d and MetaLinear, you need to write a new MetaBatchNorm module and use it to replace the original nn.BatchNorm layers.
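As a starting point, here is a minimal sketch of what such a MetaBatchNorm2d could look like, following the same register_buffer / named_leaves pattern as MetaConv2d and MetaLinear. It assumes the MetaModule base class from this repo's meta layers (the import path below is only illustrative), and it only covers the default affine, running-stats configuration used by ResNet:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

from meta_layers import MetaModule  # assumed: the repo's MetaModule base class


class MetaBatchNorm2d(MetaModule):
    """BatchNorm2d whose affine weights are registered as buffers (plain
    tensors), so update_params() can overwrite them without hitting the
    nn.Parameter TypeError."""

    def __init__(self, *args, **kwargs):
        super().__init__()
        ignore = nn.BatchNorm2d(*args, **kwargs)

        self.num_features = ignore.num_features
        self.eps = ignore.eps
        self.momentum = ignore.momentum

        # Affine parameters as buffers, kept differentiable for the meta step.
        self.register_buffer('weight', ignore.weight.data.clone().requires_grad_(True))
        self.register_buffer('bias', ignore.bias.data.clone().requires_grad_(True))

        # Running statistics stay ordinary (non-differentiable) buffers.
        self.register_buffer('running_mean', torch.zeros(self.num_features))
        self.register_buffer('running_var', torch.ones(self.num_features))

    def forward(self, x):
        return F.batch_norm(x, self.running_mean, self.running_var,
                            self.weight, self.bias,
                            self.training, self.momentum, self.eps)

    def named_leaves(self):
        return [('weight', self.weight), ('bias', self.bias)]
```

Every nn.BatchNorm2d inside the ResNet blocks (as well as every nn.Conv2d and nn.Linear) then has to be swapped for its Meta counterpart, otherwise update_params() still runs into a plain nn.Parameter somewhere in the model.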
