
A few questions on training schemes #6

Closed
hyli666 opened this issue Dec 3, 2019 · 2 comments

Comments

hyli666 commented Dec 3, 2019

Hi, Jason

Thank you very much for sharing the code. I really like your paper; it's a flexible and efficient scheme for optimizing non-differentiable measures. Regarding the detailed training procedure, I have a few questions:

(1) Did you train the enhancement model (G) using the PESQ measure (D) directly? That is, did you add an additional loss, such as MSE, or use the PESQ measure alone during training?
(2) When training D, I found that you trained it on a so-called "previous list". It seems optional; I would like to know whether this stage is crucial for getting a better result.
(3) In the released code, G and D are trained alternately for num_sampling=100 steps per epoch, with batch_size equal to 1. Are these hyper-parameters the same as in your recipe for the Table 2 results?

Sorry to ask so many questions. Thank you again, and I wish you well in your future work!

@JasonSWFu
Owner

Hi,

  1. For training the enhancement model (G), we only use the PESQ measure (D) directly.

  2. It can provide a better result, but it's not very crucial.

  3. Yes. However, as mentioned in the paper, the input features and activation functions used in Table 2 are different from those provided here.
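For readers following along, the scheme discussed above can be sketched end to end: G is updated only through a learned, differentiable D that regresses a black-box quality score (PESQ in the paper), and D is trained on a history ("previous list") of earlier enhanced outputs. The sketch below is a deliberately tiny NumPy toy, not the repo's code: the signals, the one-parameter gain G, the linear surrogate D, and the MSE-based stand-in for PESQ are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for clean/noisy speech (hypothetical data, not the repo's).
clean = np.sin(np.linspace(0, 8 * np.pi, 1000))
noisy = clean + 0.3 * rng.standard_normal(1000)

def metric(enhanced):
    """Black-box quality score (stand-in for PESQ): higher is better."""
    return -np.mean((enhanced - clean) ** 2)

# Generator G: a single gain g applied to the noisy signal.
g = 0.2
# Discriminator D: linear surrogate D(f) = w*f + b over the feature
# f = MSE(enhanced, clean), trained to regress the black-box score.
w, b = 0.0, 0.0
lr_g, lr_d = 0.2, 0.5

history = []  # "previous list": (feature, true score) of past G outputs

for step in range(500):
    enhanced = g * noisy
    f = np.mean((enhanced - clean) ** 2)
    history.append((f, metric(enhanced)))

    # --- D step: fit the surrogate on current + previous outputs ---
    for i in rng.integers(0, len(history), size=5):
        fi, qi = history[i]
        err = (w * fi + b) - qi
        w -= lr_d * 2 * err * fi
        b -= lr_d * 2 * err

    # --- G step: ascend D's predicted score (D is differentiable,
    # so the non-differentiable metric never has to be back-propagated)
    # dD/dg = w * df/dg, with df/dg = 2*mean((g*noisy - clean)*noisy).
    g += lr_g * w * 2 * np.mean((g * noisy - clean) * noisy)

g_opt = np.dot(clean, noisy) / np.dot(noisy, noisy)  # least-squares gain
print(f"learned gain g = {g:.3f}, optimal gain = {g_opt:.3f}")
```

Because the black-box score here is exactly the negative MSE, D learns a negative slope (w < 0), and G's gradient ascent on D's prediction drives the gain toward the least-squares optimum. The history list plays the role of the "previous list" in question (2): D keeps seeing older, worse outputs, which here keeps its fit well-conditioned as G improves.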


hyli666 commented Dec 5, 2019

Thanks, it really helps me a lot.

@hyli666 hyli666 closed this as completed Dec 5, 2019