
Mixed Precision Training #203

Open
Zadagu opened this issue May 17, 2022 · 1 comment

Comments


Zadagu commented May 17, 2022

Hi,

First of all, I want to thank you for your great work. I'm using SRUs for speech enhancement, and they perform very well at a reasonable computational cost.

I would like to know whether it is possible to train SRUs in mixed precision mode.
I tried to enable it by setting precision=16 in the PyTorch Lightning trainer, but that didn't do the trick.
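
For reference, here is a minimal sketch of the setup described above: a LightningModule wrapping an SRU stack, trained with precision=16. The module name, dimensions, and random data are illustrative and not taken from the original post.

```python
import pytorch_lightning as pl
import torch
from sru import SRU
from torch.utils.data import DataLoader, TensorDataset

class SRUEnhancer(pl.LightningModule):
    """Illustrative speech-enhancement model built on SRU layers."""

    def __init__(self):
        super().__init__()
        self.sru = SRU(input_size=257, hidden_size=256, num_layers=4)
        self.out = torch.nn.Linear(256, 257)

    def training_step(self, batch, batch_idx):
        noisy, clean = batch
        # SRU expects input of shape (seq_len, batch, features)
        h, _ = self.sru(noisy.transpose(0, 1))
        pred = self.out(h).transpose(0, 1)
        return torch.nn.functional.mse_loss(pred, clean)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Random stand-in data: 64 utterances, 100 frames, 257 spectral bins
data = TensorDataset(torch.randn(64, 100, 257), torch.randn(64, 100, 257))
trainer = pl.Trainer(gpus=1, precision=16, max_epochs=1)
trainer.fit(SRUEnhancer(), DataLoader(data, batch_size=8))
```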

Kind regards,
Zadagu

taolei87 (Contributor) commented Jun 7, 2022

@Zadagu SRU / SRU++ works with PyTorch native mixed precision training.

See this example:
https://github.com/asappresearch/sru/blob/3.0.0-dev/experiments/srupp_experiments/train_enwik8.py#L250
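
For completeness, here is a minimal sketch of the native AMP pattern used in that script, applied to a plain SRU layer. Shapes, hyperparameters, and the loss are illustrative and not taken from the linked file. The key points are running the forward pass under autocast and using GradScaler around the backward pass and optimizer step.

```python
import torch
from sru import SRU

# Illustrative dimensions; the linked script trains SRU++ on enwik8.
model = SRU(input_size=128, hidden_size=256, num_layers=2).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(100, 16, 128, device="cuda")       # (seq_len, batch, input_size)
target = torch.randn(100, 16, 256, device="cuda")  # (seq_len, batch, hidden_size)

for step in range(10):
    optimizer.zero_grad()
    # Forward pass under autocast: eligible ops run in float16
    with torch.cuda.amp.autocast():
        output, state = model(x)
        loss = torch.nn.functional.mse_loss(output, target)
    # Scale the loss so float16 gradients do not underflow
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```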
