
Use another generator pretrained weight #14

Closed
vinduon opened this issue Jun 9, 2021 · 5 comments

Comments


vinduon commented Jun 9, 2021

Hi sir!

I have a .pkl checkpoint file, and I used rosinality's converter to turn it into a .pt file. I have a few questions:

  1. Could I use this model for inference with SAM? I noticed that during training you save stylegan-ffhq-config-f.pt into the sam.pt checkpoint.

  2. Could I load the sam.pt model and replace the generator state_dict (stylegan-ffhq-config-f) with my own StyleGAN model? This idea comes from the fact that the generator is frozen during training, so the generator weights should not affect the final SAM weights used for inference.

  3. Or should I retrain the SAM model with my own generator?

  4. Or should I retrain the StyleGAN encoder in the pSp repo?

Thank you so much sir.

vinduon changed the title from "Use another style-gan-config.pt" to "Use another generator pretrained weight" on Jun 9, 2021
yuval-alaluf (Owner) commented

2. Could I load the sam.pt model and replace the generator state_dict (stylegan-ffhq-config-f) with my own StyleGAN model?

You could try it and see what happens during inference. You will most likely get different results, since the SAM encoder was trained with respect to the stylegan-ffhq-config-f model you mentioned: changing the generator also changes the latent space, so the encoder's outputs will no longer match it. Still, this is worth a try since it requires minimal work.
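In case it helps, here is a minimal sketch of option 2: splicing your converted generator into the SAM checkpoint before inference. The key names used below ("state_dict", the "decoder." prefix, "g_ema", "latent_avg") are assumptions based on the usual pSp/SAM and rosinality checkpoint conventions, so they may need adjusting for your files.

```python
# Sketch: replace the generator weights inside a SAM checkpoint with a
# rosinality-converted StyleGAN generator. Key names are assumptions and
# should be checked against your actual checkpoints.
import torch

sam_ckpt = torch.load("pretrained_models/sam_ffhq_aging.pt", map_location="cpu")
gen_ckpt = torch.load("my_converted_stylegan.pt", map_location="cpu")

state_dict = sam_ckpt["state_dict"]

# Copy every generator weight ("decoder." prefix in SAM) from the new g_ema.
for name, weight in gen_ckpt["g_ema"].items():
    key = f"decoder.{name}"
    if key in state_dict:
        state_dict[key] = weight
    else:
        print(f"Skipping unmatched key: {key}")

# The average latent is generator-specific, so update it too if available.
if "latent_avg" in gen_ckpt:
    sam_ckpt["latent_avg"] = gen_ckpt["latent_avg"]

torch.save(sam_ckpt, "pretrained_models/sam_with_my_generator.pt")
```

If the key names don't line up, printing a few entries of both state dicts is a quick way to see which prefixes your checkpoints actually use.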

If this doesn't work, then yes, you can try re-training SAM with your generator, perhaps starting from the pre-trained SAM encoder (rather than training from scratch).


vinduon commented Jun 10, 2021

Thank you for your help, sir!


vinduon commented Jun 10, 2021

Oh, I have one more question:
How many iterations did you train your SAM model for, and pSp as well?
I see that the maximum is 500,000, but perhaps training stopped earlier?

yuval-alaluf (Owner) commented

The pSp model was taken from the official pSp repository; I don't recall how many iterations we trained pSp for.
SAM was trained for about 60,000 iterations.


vinduon commented Jun 10, 2021

Thank you, sir.
