Use another generator's pretrained weights #14
You could try and see what happens during inference. The reason you will probably get different results is that the SAM encoder was trained with respect to the stylegan-ffhq-config-f model you mentioned. If you change the generator, you also change the latent space, and will therefore most likely get different results. However, it is worth a try since it requires minimal work. If this doesn't work, then yes, you can try re-training SAM with your generator, perhaps starting from the pre-trained SAM encoder (rather than training from scratch).
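For the second suggestion, a minimal sketch of warm-starting a new training run from the pre-trained encoder might look like the following. It assumes the pSp/SAM-style checkpoint layout in which `ckpt['state_dict']` holds keys prefixed with `encoder.` and `decoder.`; the file names are placeholders, so inspect your own checkpoint's keys before relying on this.

```python
import torch

# Hedged sketch: pull only the encoder weights out of the released SAM
# checkpoint so a new training run (with a different generator) can start
# from them instead of from scratch. Assumes a pSp/SAM-style layout where
# ckpt['state_dict'] holds keys prefixed 'encoder.' and 'decoder.'
# (check your checkpoint first); file names are placeholders.
ckpt = torch.load('pretrained_models/sam_ffhq_aging.pt', map_location='cpu')

encoder_only = {
    k: v for k, v in ckpt['state_dict'].items() if k.startswith('encoder.')
}
print(f'kept {len(encoder_only)} encoder tensors')

# Save a slimmed checkpoint; in the training script you would then load it
# with net.load_state_dict(encoder_only, strict=False) before optimization.
torch.save({'state_dict': encoder_only, 'opts': ckpt.get('opts', {})},
           'pretrained_models/sam_encoder_only.pt')
```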
Thank you for your help, sir!
Oh, I have one more question:
The pSp was taken from the official pSp repository. I don't recall how many iterations we trained pSp. |
Thank you, sir. |
Hi sir!
I have a .pkl checkpoint file, and I used rosinality's converter to convert it into .pt. I have a few questions:
Could I use this model for inference with SAM? I know that during training you saved the stylegan-ffhq-config-f.pt weights into the sam.pt checkpoint.
Could I load the sam.pt model and replace the stylegan-ffhq-config-f part of its state_dict with my own StyleGAN weights (see the sketch below)? This idea comes from the fact that the generator is frozen during training, so its weights should not affect the trained SAM encoder weights used for inference.
Or should I re-train the SAM model with my own generator?
Or should I re-train the StyleGAN encoder in the pSp repo?
Thank you so much sir.
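A minimal sketch of the swap described in the first question, under the following assumptions: the SAM checkpoint keeps its weights in `ckpt['state_dict']` with the generator under the `decoder.` prefix (as in pSp-style checkpoints), and the rosinality conversion produced a .pt whose EMA generator sits under a `g_ema` key. The file names are placeholders; verify the keys in your own checkpoints before relying on this.

```python
import torch

# Hypothetical sketch: graft a different (rosinality-converted) generator into
# the released SAM checkpoint while keeping the trained encoder untouched.
sam_ckpt = torch.load('pretrained_models/sam_ffhq_aging.pt', map_location='cpu')
my_gen = torch.load('my_stylegan2_converted.pt', map_location='cpu')

# rosinality's converter typically stores the EMA generator under 'g_ema';
# verify this key in your converted file.
gen_state = my_gen['g_ema']

state_dict = sam_ckpt['state_dict']
# Drop the old generator weights (prefixed 'decoder.') ...
state_dict = {k: v for k, v in state_dict.items() if not k.startswith('decoder.')}
# ... and insert the new generator under the same prefix.
for k, v in gen_state.items():
    state_dict[f'decoder.{k}'] = v

sam_ckpt['state_dict'] = state_dict
torch.save(sam_ckpt, 'pretrained_models/sam_with_my_generator.pt')
```

Note that this only makes sense if the new generator has the same architecture and resolution as the original FFHQ model, and, as noted above, the encoder was trained against the original generator's latent space, so results may still differ.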