Question about start point of SDR #9
Hi @lycox1, as discussed in #5, the SDR may differ significantly from the results in the README, since it is measured on a random sample. Please refer to Jungwon Seo's comment here: #5 (comment)
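For reference, SDR in dB is a log-scale energy ratio between the reference signal and the distortion. A minimal sketch of a simplified SDR (a plain energy ratio, without the least-squares projection that full BSS-Eval SDR applies; function name is illustrative, not from this repo):

```python
import numpy as np

def simple_sdr(reference, estimate):
    """Simplified SDR in dB: 10 * log10(signal power / distortion power).

    Note: full BSS-Eval SDR (e.g. mir_eval.separation.bss_eval_sources)
    first projects the estimate onto the reference; this version skips
    that step, so values can differ from BSS-Eval numbers.
    """
    reference = np.asarray(reference, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    distortion = estimate - reference
    return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(distortion ** 2))

# Example: a constant offset of 0.1 on a unit signal gives a
# signal-to-distortion power ratio of 100, i.e. 20 dB.
sdr = simple_sdr(np.ones(100), np.ones(100) + 0.1)
print(sdr)  # 20.0
```

Because the score depends on which mixtures happen to be drawn, evaluating on a different random sample can shift the reported SDR noticeably.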
Thanks @seungwonpark. I think the key checkpoints of #5 are below.
Could you please let me know if you have any other clues! Thanks.
Hello @seungwonpark, I am also getting a problem similar to @lycox1's. Could you please give me a hand? Since I cloned the newest code, train-other-500 has been removed. By the way, I notice that the README says the number of test cases is 1000, while the code uses only 100 test cases. Here are the images of the training loss, test loss, and test SDR in my experiment. Although the test data may be different, I believe that a correct training loss curve should look similar, right?
Hi @lawlict, the test loss curve may fluctuate since we didn't perform the evaluation on a sufficient amount of data. So I think the curve may look a bit different.
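This is the usual sampling effect: the standard error of a mean score shrinks roughly as 1/sqrt(n), so a 100-case evaluation fluctuates about three times more than a 1000-case one. A quick illustration with hypothetical per-utterance SDR scores (the distribution parameters are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-utterance SDR scores with a wide spread (mean 5 dB, std 3 dB)
scores = rng.normal(loc=5.0, scale=3.0, size=1000)

for n in (100, 1000):
    subset = scores[:n]
    # Standard error of the mean: sample std divided by sqrt(n)
    sem = subset.std(ddof=1) / np.sqrt(n)
    print(f"n={n}: mean SDR ~ {subset.mean():.2f} dB, std error ~ {sem:.3f} dB")
```

With only 100 test cases, run-to-run differences of several tenths of a dB in the average SDR are expected purely from sampling.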
Dear @seungwonpark,
First of all, I would like to thank you for this great open source project.
I wanted to try out your nice code, so I trained VoiceFilter.
But I got a problem with the SDR. In the SDR graph in the voicefilter GitHub repo, the SDR values range from 2 to 10 dB, but in my case they range from -0.8 to 1.2.
I am trying to find the cause of the problem but I cannot find it.
Can you help me find the cause of the problem?
I used the default yaml and generator.py (train-clean-100, train-clean-360, and dev-clean were used for training).
Could you let me know what I can check?
Thank you!