
How to continue fine tuning of model? #10

Closed

TaoTeCha opened this issue Jun 21, 2021 · 2 comments

TaoTeCha commented Jun 21, 2021

How can I continue training on one of my fine-tuned models? I'm using Google Colab, so it was only able to run to 36k steps before stopping. I see that there is a generator and a discriminator model; how does this work when continuing fine-tuning? Do I just load the generator and lose the training for the discriminator?

I tried to start again, just feeding the path of my fine-tuned G model instead of ljs_base, but the quality is considerably worse and it seems like training started nearly from the beginning.

Thanks!

TaoTeCha changed the title from "Labels for training output values?" to "How to continue fine tuning of model?" on Jun 22, 2021
TaoTeCha (Author) commented
Never mind, I worked it out. Since I am using Google Colab I had to change model_dir, and on the second run I pointed it to a different folder than the first. Just don't download the base model, and keep using the same model_dir: training will pick up the checkpoints there.
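For reference, the resume behaviour described above can be sketched roughly like this. This is a minimal, hypothetical version of how a VITS-style train.py might pick up the newest G_*.pth / D_*.pth in model_dir; the function names here (latest_checkpoint, try_resume) and the assumed checkpoint keys ("model", "optimizer", "iteration") are illustrative, not the repo's exact code:

```python
import glob
import os

import torch


def latest_checkpoint(model_dir, pattern):
    """Return the newest checkpoint matching e.g. 'G_*.pth', or None if absent."""
    paths = glob.glob(os.path.join(model_dir, pattern))
    if not paths:
        return None
    # Checkpoints are assumed to be named like G_36000.pth, so sort by step number.
    return max(paths, key=lambda p: int(os.path.splitext(p)[0].split("_")[-1]))


def try_resume(model_dir, net_g, net_d, optim_g, optim_d):
    """Load the latest G/D checkpoints from model_dir if both exist."""
    g_path = latest_checkpoint(model_dir, "G_*.pth")
    d_path = latest_checkpoint(model_dir, "D_*.pth")
    if g_path is None or d_path is None:
        return 0  # nothing to resume from; start at step 0

    for path, net, opt in ((g_path, net_g, optim_g), (d_path, net_d, optim_d)):
        ckpt = torch.load(path, map_location="cpu")
        net.load_state_dict(ckpt["model"])        # restore weights
        opt.load_state_dict(ckpt["optimizer"])    # restore optimizer state
    return ckpt["iteration"]  # resume from the saved step/epoch counter
```

The point of the fix is that both G_*.pth and D_*.pth are found in the same model_dir, so the discriminator (and the optimizer states) are restored too. Pointing the second run at an empty folder, or passing only the generator checkpoint as a "base model", leaves the discriminator starting from scratch, which matches the quality drop described above.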

ToiYeuTien commented
Hi!
I have trained a Vietnamese female voice model on my computer for 500k steps, and I find the voice quite clear. Now I want to train another Vietnamese voice, a male one.
I understand there is a way to train starting from a previously trained model, which should shorten the training time.
Could you help me with that method?
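For what it's worth, one way to apply TaoTeCha's observation to a new voice, purely a sketch and assuming the same checkpoint naming as above, is to seed a fresh model_dir with copies of the existing checkpoints before starting the new run, so training picks them up as its starting point. The paths and filenames below are hypothetical placeholders:

```python
import os
import shutil

# Hypothetical paths; adjust to your own setup.
old_dir = "./logs/vi_female"   # contains e.g. G_500000.pth / D_500000.pth
new_dir = "./logs/vi_male"     # model_dir for the new male-voice run
os.makedirs(new_dir, exist_ok=True)

# Copy both generator and discriminator so the new run resumes from them.
for name in ("G_500000.pth", "D_500000.pth"):
    shutil.copy(os.path.join(old_dir, name), os.path.join(new_dir, name))
```

Whether the step counter should be reset (for example by renaming the files to a lower step number) depends on the training script and its learning-rate schedule; that detail is not covered in this thread.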
