from utils import load_pretrain #4

Closed · ZhouHongLiang6 opened this issue Jan 21, 2023 · 3 comments
ZhouHongLiang6 commented Jan 21, 2023

I can't find the utils file. When I run the code, it tells me it can't find load_pretrain. Can you help me?

ZhouHongLiang6 changed the title from "utils and your network" to "from utils import load_pretrain" on Jan 21, 2023
TWENTY-FOU commented

Same question here.
Could you @ me once it's solved?

TWENTY-FOU commented

Email: 1002367554@qq.com


Kahsolt commented Feb 21, 2023

That load_pretrain import seems redundant and can be safely removed; see lines 104–106 for where the pretrained weights are actually loaded:

if int(config['pretrained']):
    Net.load_state_dict(torch.load(config['saved_model'], map_location='cpu')['model_weights'])
    best_val_loss = torch.load(config['saved_model'], map_location='cpu')['val_loss']
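
Side note: that snippet calls torch.load twice on the same file. A slightly cleaner variant (just a sketch, reusing the repo's config and Net names) loads the checkpoint once:

import torch

if int(config['pretrained']):
    # Load the checkpoint once, then restore both the weights and the best loss
    checkpoint = torch.load(config['saved_model'], map_location='cpu')
    Net.load_state_dict(checkpoint['model_weights'])
    best_val_loss = checkpoint['val_loss']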

Also, when you hit the error about the missing xcit_tiny_12_p16, just comment out those two lines, as sketched below; they are not even used.
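
For reference, the workaround looks something like this (a sketch; the exact import path and usage of xcit_tiny_12_p16 in this repo are assumptions):

# Comment out the unused XCiT references:
# from timm.models.xcit import xcit_tiny_12_p16   # hypothetical import path
# backbone = xcit_tiny_12_p16(pretrained=True)    # never used afterwards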
Then, when trainer turns out to be a module and cannot be called, change the import clause to from trainer import trainer.
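
In other words (assuming the callable inside trainer.py is also named trainer):

# Before: this binds the module object, which is not callable
# import trainer
# After: bind the trainer callable defined inside the module
from trainer import trainer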
And finally, I got AssertionError: Input image size (256*256) doesn't match model (224*224)., only to find that the pretrained model expects 224×224 inputs and therefore cannot be applied to ANY of the dataset presets, which README.md declares are sized 256!
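
One possible workaround, assuming the data pipeline accepts torchvision transforms (this is a sketch, not the repo's actual code), is to resize inputs down to the 224×224 the pretrained model expects:

from torchvision import transforms

# Hypothetical preprocessing: force inputs to the model's expected size
train_transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])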

After all this, I really do not know why this repo is so broken...
Is this really the official implementation of a published or ready-to-publish work?
It upsets me a bit :(
