I used DINO to do self-supervised pre-training of a Small ViT on a dataset I have. Now I wanted to fine-tune the model on another dataset in a supervised way.
I know that, in a way, the code in eval_linear.py allows us to do that, but - as far as I was able to tell - it only updates the weights of the Linear model built on top of the representations generated by the pre-trained Transformer.
So my question is: has anyone tried to perform supervised fine-tuning in a way that the weights of the Teacher or Student Transformers are updated as well?
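To make the question concrete, here is a minimal sketch of what I mean by end-to-end fine-tuning, as opposed to what eval_linear.py does. The backbone below is a stand-in for the pre-trained Student ViT (in practice you would load the DINO checkpoint); the key difference is that no parameters are frozen, and the backbone gets its own, smaller learning rate so the pre-trained representations aren't destroyed early on. All names and hyperparameters here are illustrative assumptions, not anything from the DINO repo:

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the pre-trained Student ViT; in practice you
# would load the DINO checkpoint into the real ViT-Small architecture.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.GELU())
head = nn.Linear(64, 10)  # fresh classification head for the target dataset

# Unlike eval_linear.py, do NOT freeze the backbone: every parameter trains.
for p in backbone.parameters():
    p.requires_grad = True

# Common fine-tuning trick: a smaller learning rate for the pre-trained
# backbone than for the randomly initialized head.
optimizer = torch.optim.AdamW([
    {"params": backbone.parameters(), "lr": 1e-5},
    {"params": head.parameters(), "lr": 1e-3},
])
criterion = nn.CrossEntropyLoss()

# One supervised training step on dummy data.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
before = backbone[1].weight.detach().clone()

optimizer.zero_grad()
loss = criterion(head(backbone(x)), y)
loss.backward()
optimizer.step()

# The backbone weights change too, not just the linear head.
changed = not torch.allclose(before, backbone[1].weight)
```

This is just the generic supervised fine-tuning recipe applied on top of DINO weights; whether the Teacher or the Student checkpoint is the better starting point is part of what I'm asking.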
PS: I realize this might not be the ideal place to ask this question, since it sort of falls out of the DINO jurisdiction, but I figured it was worth a try.
Thanks for the amazing work you guys did, and for sharing it with us.