Are there plans to supplement the code on the CSL dataset? #17

Closed · HW140701 opened this issue Jun 8, 2022 · 6 comments

HW140701 commented Jun 8, 2022

Thank you very much for your contribution to the community.
In the paper, I saw that experiments were carried out on both the PHOENIX14 and CSL datasets. Are there plans to release the data-processing and training code for the CSL dataset as well?

ycmin95 (Collaborator) commented Jun 9, 2022

Thanks for your attention. We do not have a clear plan to release the relevant code for the CSL dataset; perhaps after the publication of the journal version. The data-processing and training pipelines are the same across datasets (details can be found in the paper). There is one evaluation trick on CSL due to its signer-independent setting: in my experience, calling model.train() during evaluation achieves better performance than model.eval().
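
A minimal sketch of that evaluation trick, assuming a standard PyTorch setup; the forward signature, `model`, and `eval_loader` below are placeholders, not the released VAC code:

```python
import torch

@torch.no_grad()                       # gradients are not needed for evaluation
def evaluate_with_batch_stats(model, eval_loader, device="cuda"):
    model.train()                      # BatchNorm normalizes with batch statistics
    model.to(device)
    all_outputs = []
    for videos, video_lengths in eval_loader:   # batch layout is dataset-specific
        outputs = model(videos.to(device), video_lengths)
        all_outputs.append(outputs)
    return all_outputs
```

Note that full train mode also re-enables dropout if the network contains any, which may or may not be desirable.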

HW140701 (Author) commented Jun 9, 2022

Thank you very much for your reply; I am looking forward to the new paper.
I will run experiments on the CSL dataset following the details given in the paper.

ycmin95 (Collaborator) commented Jun 9, 2022

Feel free to open issues if you run into any problems during implementation. Good luck :)

HW140701 (Author) commented Jun 9, 2022

Ok.
Thanks a lot for your help.

hulianyuyy commented

I wonder whether we should use model.train() during evaluation. As far as I know, model.train() only switches training-specific components (e.g., dropout) into training behaviour; it does not by itself enable gradient computation, and it seems to make no substantial difference for VAC.
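
A small self-contained check of that point, assuming plain PyTorch rather than repo-specific code: train()/eval() only flips the module's training flag used by layers such as Dropout and BatchNorm, while gradient tracking is controlled separately by torch.no_grad().

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 4), nn.Dropout(p=0.5), nn.BatchNorm1d(4))
x = torch.randn(8, 4)

net.train()
with torch.no_grad():          # no gradients, even though training=True
    y = net(x)
print(net.training, y.requires_grad)   # True False

net.eval()
y = net(x)                     # gradients tracked, even though training=False
print(net.training, y.requires_grad)   # False True
```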

ycmin95 (Collaborator) commented Jun 10, 2022

It is about the statistics used by BatchNorm: in train mode BN normalizes each batch with its own statistics, while in eval mode it uses the running statistics accumulated during training, which is why the choice matters under the signer-independent CSL setting.
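
A sketch of how the BN behaviour alone could be kept in train mode while the rest of the network stays in eval mode; this is one way to isolate the effect, not code from the repository:

```python
import torch.nn as nn

def keep_bn_batch_stats(model: nn.Module) -> nn.Module:
    model.eval()                       # dropout etc. behave as in normal evaluation
    for m in model.modules():
        if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            m.train()                  # BN falls back to per-batch statistics
    return model
```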
