
Can a finetune feature be added? #325

Closed

404289680 opened this issue Oct 26, 2020 · 2 comments

Comments

@404289680

Traceback (most recent call last):
  File "./fastreid/engine/train_loop.py", line 125, in train
    self.run_step()
  File "./fastreid/engine/train_loop.py", line 244, in run_step
    self.optimizer.step()
  File "/home/zuwei/anaconda3/lib/python3.8/site-packages/torch/optim/lr_scheduler.py", line 67, in wrapper
    return wrapped(*args, **kwargs)
  File "/home/zuwei/anaconda3/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 15, in decorate_context
    return func(*args, **kwargs)
  File "./fastreid/solver/optim/adam.py", line 102, in step
    exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1)
RuntimeError: The size of tensor a (50000) must match the size of tensor b (25754) at non-singleton dimension 0
The number of classes differs, so fine-tuning is not possible.
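For context, a common PyTorch-level workaround for this kind of shape clash is to filter the checkpoint before loading (a minimal sketch, assuming a plain state_dict-style checkpoint; the "model" key and the helper name are hypothetical, and this is not necessarily how fastreid fixed the issue):

import torch

# Sketch of shape-aware partial loading: keep only checkpoint tensors whose
# name and shape both match the current model, so a classifier head trained
# with 25754 classes does not collide with a new 50000-class head.
def load_matching_weights(model, ckpt_path):
    ckpt = torch.load(ckpt_path, map_location="cpu")
    state = ckpt.get("model", ckpt)  # weights may be nested under a "model" key
    own = model.state_dict()
    filtered = {k: v for k, v in state.items()
                if k in own and v.shape == own[k].shape}
    model.load_state_dict(filtered, strict=False)
    # Anything not loaded keeps its fresh initialization.
    return sorted(set(own) - set(filtered))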

@L1aoXingyu
Member

Fixed. You can fine-tune from another trained model by running:

python3 tools/train_net.py --config-file CONFIG_FILE_PATH --num-gpus 2 MODEL.WEIGHTS PRETRAIN_MODEL_PATH
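In case it helps, you can inspect which tensors the pretrained checkpoint contains before launching that command (a sketch; the PRETRAIN_MODEL_PATH placeholder and the nested "model" key are assumptions):

import torch

# List every tensor in the pretrained checkpoint so shape clashes
# (e.g. the classifier head) are visible before training starts.
ckpt = torch.load("PRETRAIN_MODEL_PATH", map_location="cpu")
state = ckpt.get("model", ckpt)  # weights may be nested under a "model" key
for name, tensor in state.items():
    print(name, tuple(tensor.shape))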

@DeepAlchemist

If the current model is partially different from the trained model (e.g., some parameters have been added), loading raises errors.

[Screenshot 2020-11-23, 10:09:34 PM]

BTW, I cannot figure out the differences between --finetune, resume=True, and resume=False.
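For what it's worth, the usual distinction in trainers of this style is roughly the following (a hedged sketch, not fastreid's exact code; the checkpoint keys "model", "optimizer", "scheduler", and "iteration" are assumptions):

import torch

# resume=True: restore model, optimizer, scheduler, and iteration to
#   continue an interrupted run on the same task.
# --finetune / MODEL.WEIGHTS with resume=False: load model weights only;
#   optimizer, scheduler, and iteration start fresh on the new task.
def restore(model, optimizer, scheduler, ckpt_path, resume):
    ckpt = torch.load(ckpt_path, map_location="cpu")
    model.load_state_dict(ckpt["model"], strict=resume)
    start_iter = 0
    if resume:  # pick up the old run exactly where it stopped
        optimizer.load_state_dict(ckpt["optimizer"])
        scheduler.load_state_dict(ckpt["scheduler"])
        start_iter = ckpt["iteration"] + 1
    return start_iter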
