
why pretrain? #9

Closed
AndyYuan96 opened this issue Mar 1, 2021 · 1 comment

Comments

@AndyYuan96

Hi @Na-Z, thanks for open-sourcing the code. I have two questions after reading the paper.
1. I see that you say "Note that our SESS is initialized by the VoteNet weights pre-trained on the corresponding labeled data". I wonder why you use pre-trained weights: is the result of training from scratch bad, or is using pre-trained weights a common practice in semi-supervised training? Moreover, I would suggest using weights pre-trained on SUN to train SESS on ScanNetV2, or the reverse. I think this is the right way to use pre-trained weights, since in reality we would not have pre-trained weights for a newly collected, unlabeled dataset.
2. Does more unlabeled data lead to better performance? How is the performance if you train with all of the train, val, and test data?

@Na-Z
Owner

Na-Z commented Mar 2, 2021

  1. a) Yes, the result of training from scratch is not good. b) Pre-training the backbone with labeled data makes the subsequent training easier, which is also suggested by the original mean-teacher work. c) Our SESS is proposed for the semi-supervised learning setting, where a set of labeled samples is always available. Of course, it would be interesting to use pre-trained weights from other datasets, although some additional problems, such as domain adaptation, might need to be solved.
  2. a) Theoretically, if the additional unlabeled data follows a wider distribution, we should be able to achieve better performance. b) We did not conduct such an experiment. You are encouraged to explore it if interested (please let me know the result if you do :P).

Na-Z closed this as completed Mar 15, 2021