
Implement Transfer Learning API / Instructions #69

Closed
lapp0 opened this issue Mar 22, 2022 · 5 comments

Comments

lapp0 commented Mar 22, 2022

It would be helpful if we could piggyback on your library of pre-trained models for transfer learning. Perhaps this could be accomplished by freezing the first 6 (2L) of the 9 layers of Mask2Former's transformer decoder.

Usage may look like this:

export DETECTRON2_DATASETS=/path/to/dir/containing/new/dataset

python train_net.py \
  --config-file <pretrained model config> \
  --pretrained-model /path/to/checkpoint_file \
  --transfer-learning-dataset <new dataset>
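
In the meantime, a manual version of this might look roughly like the sketch below. The detectron2 calls are standard, but the config path is a placeholder and the decoder parameter-name prefixes are assumptions on my part; check them against model.named_parameters() before relying on this.

from detectron2.config import get_cfg
from detectron2.projects.deeplab import add_deeplab_config
from detectron2.modeling import build_model
from detectron2.checkpoint import DetectionCheckpointer
from mask2former import add_maskformer2_config

# Build the model from a pre-trained config and load the checkpoint.
cfg = get_cfg()
add_deeplab_config(cfg)
add_maskformer2_config(cfg)
cfg.merge_from_file("path/to/pretrained_model_config.yaml")  # placeholder path
model = build_model(cfg)
DetectionCheckpointer(model).load("/path/to/checkpoint_file")  # placeholder path

# Freeze the first 6 of the 9 transformer decoder layers by matching parameter names.
# These name substrings are an assumption about the naming scheme; verify them first.
frozen_layers = range(6)
for name, param in model.named_parameters():
    if any(f"transformer_self_attention_layers.{i}." in name
           or f"transformer_cross_attention_layers.{i}." in name
           or f"transformer_ffn_layers.{i}." in name
           for i in frozen_layers):
        param.requires_grad = False
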
@bowenc0221
Contributor

I did not get the question. You can always fine-tune the model on your own dataset by registering that dataset in Detectron2.

lapp0 commented Mar 24, 2022

Sorry I wasn't clear.

What I'm requesting is a feature similar to Keras's trainable attribute, which freezes a layer's weights and biases when set to False: https://keras.io/guides/transfer_learning/#the-typical-transferlearning-workflow

I would like to use one of Mask2Former's pre-trained COCO models as a starting point for training on smaller datasets.
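
For reference, the Keras behaviour I mean looks like this (the Xception example is taken from the linked guide and has nothing to do with Mask2Former):

import tensorflow as tf

# Load a pre-trained backbone and freeze all of its weights and biases.
base_model = tf.keras.applications.Xception(weights="imagenet", include_top=False)
base_model.trainable = False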

@bowenc0221
Contributor

In PyTorch, you can freeze any parameter by setting its requires_grad attribute to False.
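
A minimal, generic sketch of that suggestion (a toy model, nothing Mask2Former-specific):

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(128, 64),  # pretend "backbone" layer to freeze
    nn.ReLU(),
    nn.Linear(64, 10),   # new head that stays trainable
)

# Freeze the first layer: its parameters get no gradients and are skipped by the optimizer.
for param in model[0].parameters():
    param.requires_grad = False

optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=0.01
)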

lapp0 closed this as completed Mar 31, 2022

lapp0 commented Mar 31, 2022

I will experiment with freezing different layers. If I have any success, I'll make a PR.

@jkim50104

@lapp0 Where did you set requires_grad = False to freeze the layers? And just out of curiosity, any results from freezing different layers?
