This repository has been archived by the owner on Mar 19, 2024. It is now read-only.
❓ Is it possible to get full imagenet pretrained weights of MoCo v2 ?
Hello,
Thank you for this nice library.
I am trying to get the entire set of weights of a ResNet-50 pretrained with MoCo v2 on the ImageNet dataset. That is, I need the two encoders, the head, etc., so that I can resume training with MoCo.
Is it possible to do this using the weights from the model zoo, or from somewhere else?
I downloaded the weights, and they do not seem to contain the weights of the two encoders.
Those weights are indeed complete: they contain the weights of the trunk, the head, the momentum encoder, and the queue, basically everything needed to restart training from there.
import torch

# Load the full checkpoint
cp = torch.load("/path/to/model_final_checkpoint_phase199.torch")
# head
cp["classy_state_dict"]["base_model"]["model"]["heads"].keys()
# trunk
cp["classy_state_dict"]["base_model"]["model"]["trunk"].keys()
# loss (containing the momentum encoder)
cp["loss"].keys()
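For reference, the nesting can be navigated like this. This is a minimal sketch using a mock dict with the same layout, so it runs without the actual checkpoint file; the key names inside "trunk", "heads", and "loss" are placeholders, not the real parameter names, so inspect your own checkpoint to confirm them.

```python
# Mock checkpoint mirroring the nesting described above.
# The leaf values and inner key names are illustrative placeholders.
checkpoint = {
    "classy_state_dict": {
        "base_model": {
            "model": {
                "trunk": {"conv1.weight": "...tensor..."},
                "heads": {"0.clf.0.weight": "...tensor..."},
            }
        }
    },
    "loss": {"moco_encoder": "...state...", "queue": "...tensor..."},
}

# Navigate the same path as with the real checkpoint.
model_state = checkpoint["classy_state_dict"]["base_model"]["model"]
print(sorted(model_state.keys()))         # ['heads', 'trunk']
print(sorted(checkpoint["loss"].keys()))  # ['moco_encoder', 'queue']
```

With the real file, replacing the mock dict with `torch.load(...)` gives the same access pattern.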
Please let me know if this answers your question,
Thank you,
Quentin