
The training details #4

Closed
JianghaoWu opened this issue Jan 11, 2022 · 1 comment

Comments

@JianghaoWu

Thank you very much for your work.
I have some questions:
When unc_noise is False, i.e. the model is deepLab(13, False) in your code, the model is pretrained and includes the encoder and the main decoder, right?
Are the parameters of the auxiliary decoder copied from the main decoder, or initialized randomly?
During training, are the parameters of the encoder and the auxiliary decoder updated while only the parameters of the main decoder are kept fixed?
During training, are the pseudo-labels generated by the pretrained model once and then never updated later in training?
I'd be grateful if you could answer them!

@prabhuteja12
Collaborator

Hi,

  • Is it pretrained: Yes.
  • Are the parameters of the auxiliary decoder copied?: Not in the final version. I remember trying this out, but I don't think it mattered much.
  • Are only the parameters of the aux decoders trained: Yes. Because the aux decoders are randomly initialized, they won't be able to predict labels correctly from the existing feature representation.
  • Updating pseudo-labels: They are updated after each epoch. I based my implementation of this on CRST, and you might find it useful to follow that too.

Hope this answers your questions.
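For anyone reading along, the schedule above can be sketched in plain Python. This is only an illustration of the update pattern discussed in this thread, not the repository's actual code: the class and function names (Module, build_model, adapt) are hypothetical, and the assumption that the encoder is trained alongside the aux decoders follows the question's framing.

```python
class Module:
    """Toy stand-in for a network component with a trainable flag."""
    def __init__(self, name, trainable=True):
        self.name = name
        self.trainable = trainable

def build_model():
    # Encoder and main decoder come from the pretrained checkpoint;
    # the main decoder is kept fixed during adaptation.
    encoder = Module("encoder", trainable=True)
    main_decoder = Module("main_decoder", trainable=False)
    # Auxiliary decoders start from random init, so they must be trained.
    aux_decoders = [Module(f"aux_decoder_{i}") for i in range(4)]
    return encoder, main_decoder, aux_decoders

def adapt(num_epochs=3):
    encoder, main_decoder, aux_decoders = build_model()
    log = []
    for epoch in range(num_epochs):
        # Pseudo-labels are refreshed once per epoch (CRST-style),
        # not generated once from the pretrained model and frozen.
        log.append(f"epoch {epoch}: regenerate pseudo-labels")
        updated = [m.name for m in [encoder, main_decoder, *aux_decoders]
                   if m.trainable]
        log.append(f"epoch {epoch}: step on {updated}")
    return log
```

Running adapt() shows that the frozen main decoder never appears in an optimizer step and the pseudo-labels are regenerated at every epoch boundary.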
