
Is it OK that the training loss stays around 5.8 when the step count goes up to 250K? #79

Closed
butterl opened this issue May 31, 2018 · 4 comments

butterl commented May 31, 2018

Hi, I used the thchs30 dataset to train WaveNet, and the loss has stayed around 5.8 as the steps went up to 200K. I'm not sure whether this is normal, and if not, any suggestions for bringing the loss down?

The evaluation wav has a lot of noise in it. Is this waveform a good one? (I would expect the difference between the two waveforms to be smaller, but the plot below does not look right.)

[waveplot image: step000250000_waveplots]
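
For reference, here is a minimal sketch of how one might overlay the target and predicted waveforms to judge evaluation quality by eye (the file paths are hypothetical; adjust to your eval output directory):

```python
import librosa
import matplotlib.pyplot as plt

# Hypothetical paths to one evaluation pair; use your repo's actual eval outputs.
target, sr = librosa.load("eval/step000250000_target.wav", sr=None)
pred, _ = librosa.load("eval/step000250000_predicted.wav", sr=None)

# Overlay the two waveforms over their common length.
n = min(len(target), len(pred))
plt.figure(figsize=(12, 4))
plt.plot(target[:n], label="target", alpha=0.6)
plt.plot(pred[:n], label="predicted", alpha=0.6)
plt.legend()
plt.title("Target vs. predicted waveform at step 250K")
plt.tight_layout()
plt.savefig("waveplot_compare.png")
```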

@begeekmyfriend

I am afraid you need to wait until 800K steps. See #41.

butterl (Author) commented May 31, 2018

Any hparam suggestions? I only changed the worker count and the batch size to fit memory.
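
Concretely, the edit was along these lines (a sketch assuming a Tacotron-2-style hparams.py with TF 1.x; the field names wavenet_batch_size and wavenet_num_workers are assumptions, use whatever your repo's hparams actually defines):

```python
# hparams.py -- illustrative values only; pick what fits your GPU memory.
import tensorflow as tf

hparams = tf.contrib.training.HParams(
    wavenet_batch_size=4,   # reduced from the default to fit GPU memory (name assumed)
    wavenet_num_workers=2,  # fewer data-loading workers (name assumed)
)
```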

@WendongGan

I trained it on the LJSpeech data. After training for over 600K steps, the loss decreases only slowly. I want to know how low the loss can get.
[training loss curve image]
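
One way to tell whether this is a real plateau rather than noise is to smooth the raw curve, e.g. with an exponential moving average (a sketch; `losses` stands in for whatever per-step values your logs contain):

```python
def ema(values, alpha=0.01):
    """Exponential moving average; smooths a noisy per-step loss curve."""
    smoothed, acc = [], values[0]
    for v in values:
        acc = alpha * v + (1 - alpha) * acc
        smoothed.append(acc)
    return smoothed

# losses = [...]          # per-step training loss read from your logs
# smoothed = ema(losses)  # a flat smoothed tail over many steps suggests a true plateau
```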

stale bot commented May 30, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix label May 30, 2019
stale bot closed this as completed Jun 6, 2019