About training on the winter2summer dataset #12
Hi @ZhenyuLiu-BJFU, thanks for your interest. I have not trained winter2summer with this code; I used MUNIT (https://github.com/NVlabs/MUNIT) for the winter2summer translation.
Hi @ZhenyuLiu-BJFU, I had not noticed your training epoch count before. Generally, the default setting is 200 epochs for the fixed unpaired I2I translation, while the learnable one follows the setting of CUT, which uses 400 epochs during training.
Thanks for your explanation.
Hello!
Thank you for doing such a good job.
I am trying to train it on the winter2summer dataset, with the same settings as yours.
I used the command `--dataroot ./datasets/winter2summer --name winter2summer_SCL --model sc --learned_attn --augment` to train the LSeSim model. I trained it for about 50 epochs, and the result looks like this:
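For reference, the command above could be invoked as sketched below. Note this is an assumption-laden sketch: the epoch flags (`--n_epochs`, `--n_epochs_decay`) are guesses based on the CycleGAN/pix2pix-style option parser that codebases like LSeSim typically build on, so the exact flag names should be verified in the repo's `options/train_options.py` before use.

```shell
# Sketch of a training invocation for the learnable LSeSim model.
# ASSUMPTION: the epoch flags below follow the CycleGAN/pix2pix
# convention (--n_epochs / --n_epochs_decay); check the repo's
# options/train_options.py for the real names and defaults.
python train.py \
  --dataroot ./datasets/winter2summer \
  --name winter2summer_SCL \
  --model sc \
  --learned_attn \
  --augment \
  --n_epochs 200 \
  --n_epochs_decay 200   # 400 epochs total, matching the CUT schedule
```

Stopping at ~50 epochs of a 400-epoch schedule would leave the model far short of convergence, which by itself could explain a high FID.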
The FID is 104.3.
But the results are not as good as those shown in your paper.
Could you tell me what might be going wrong?
Thank you!