
Why are inconsistent parameters set in general model config and fine-tune model config? #22

Closed
EchoTHChen opened this issue Nov 13, 2022 · 2 comments

Comments

@EchoTHChen

EchoTHChen commented Nov 13, 2022

(1)
In configs/train/gen/train_gen_cost_volume_train.yaml, use_vis is false.

fine_dist_decoder_cfg:
    use_vis: false

However, in configs/train/ft/neuray_ft_cv_lego.yaml, use_vis in fine_dist_decoder_cfg falls back to the dist_decoder default: true.

This mismatch in the vis-encoder setting of the dist-decoder prevents the pretrained general model from being loaded into the fine-tune model. So what should use_vis of fine_dist_decoder_cfg be?
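To illustrate the failure mode (a hypothetical sketch, not the repo's actual loader): when one config enables use_vis and the other does not, the two state dicts disagree on the vis-encoder parameter keys, so a strict load raises a key-mismatch error. One common workaround is to drop the keys the current model does not define before calling PyTorch's `load_state_dict(..., strict=False)`. The key names below are made up for the example.

```python
def filter_state_dict(ckpt_state, model_keys):
    """Keep only checkpoint entries whose keys exist in the current model.

    Returns the filtered dict plus the list of dropped keys, so the
    mismatch (e.g. extra vis-encoder weights) can be logged.
    """
    kept = {k: v for k, v in ckpt_state.items() if k in model_keys}
    dropped = sorted(set(ckpt_state) - set(model_keys))
    return kept, dropped


# Toy example: the checkpoint carries extra vis-encoder weights
# (from a use_vis=true run) that a use_vis=false model never defines.
ckpt = {
    "fine_dist_decoder.mean_fc.weight": "...",
    "fine_dist_decoder.vis_encoder.fc.weight": "...",  # extra key
}
model_keys = {"fine_dist_decoder.mean_fc.weight"}

kept, dropped = filter_state_dict(ckpt, model_keys)
print(dropped)  # the keys that would have made a strict load fail
```

With a real model this would be followed by `model.load_state_dict(kept, strict=False)`, which tolerates the remaining key differences.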

(2) I noticed that use_self_hit_prob is set to true only in the fine-tune model. Why not set it to true in the general model as well, for consistency?

@liuyuan-pal
Owner

liuyuan-pal commented Nov 14, 2022

  1. Hi, both of them are supposed to be set to False. However, the pretrained model was trained with use_vis=True in fine_dist_decoder_cfg. This is a historical artifact: I used to apply an additional "visibility" term (not the visibility of NeuRay) to the hit-probability distribution. Since this term had almost no effect on rendering quality, I later set it to False, but it seems I forgot to remove the corresponding weights from the pretrained models. Sorry for the messy code.
  2. use_self_hit_prob is used in fine-tuning, not in generalization training: it means we decode the hit probability on the pseudo test views during training. In generalization training, the test views are not pseudo views but real unseen test views, so this option does not apply.
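In config terms, the distinction described above looks roughly like this (file paths taken from the question; the exact nesting of the key inside the repo's configs is assumed):

```yaml
# fine-tune config (e.g. configs/train/ft/neuray_ft_cv_lego.yaml)
use_self_hit_prob: true    # decode hit prob on pseudo test views during fine-tuning

# generalization config (e.g. configs/train/gen/train_gen_cost_volume_train.yaml)
use_self_hit_prob: false   # test views are real unseen views, so this stays off
```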

@EchoTHChen
Author

Thanks!
