
[examples] whisper aishell #2238

Merged: 6 commits into main from xcsong-whisper-aishell on Dec 14, 2023
Conversation

xingchensong (Member)

No description provided.

xingchensong (Member, Author) commented Dec 14, 2023

Training time for 40 epochs (after 40 epochs, the CV loss no longer decreases):

[image: training-time screenshot]

Mddct previously approved these changes Dec 14, 2023
robin1001 merged commit d39af58 into main Dec 14, 2023
5 of 6 checks passed
robin1001 deleted the xcsong-whisper-aishell branch December 14, 2023 06:17
srdfjy (Contributor) commented Dec 19, 2023

Hi @xingchensong, can the 16GB V100 be used for training?

xingchensong (Member, Author) replied:

> Hi @xingchensong, can the 16GB V100 be used for training?

8 * V100? The V100 does not support BF16, but I think it's OK to train an fp32 model under DeepSpeed stage-3.
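As a rough illustration, a DeepSpeed config along these lines keeps training in plain fp32 under ZeRO stage-3. All field values here are assumptions for illustration; the actual wenet recipe config may differ.

```python
# Minimal sketch (assumed values) of a DeepSpeed config for fp32 training
# with ZeRO stage-3 on 16GB V100s, which lack BF16 support.
ds_config = {
    "train_micro_batch_size_per_gpu": 1,  # small per-GPU batch to fit in 16GB
    "gradient_accumulation_steps": 16,    # recover a larger effective batch size
    "zero_optimization": {
        "stage": 3,  # partition parameters, gradients, and optimizer states
    },
    # Keep both mixed-precision modes off: V100 has no BF16, and the
    # suggestion above is to train in plain fp32.
    "fp16": {"enabled": False},
    "bf16": {"enabled": False},
}
```

Stage-3 partitions parameters, gradients, and optimizer states across the 8 GPUs, which is what makes a full fp32 copy of the model fit within 16GB per card.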

srdfjy (Contributor) commented Dec 20, 2023

Yes, 8 * V100, each card with 16GB memory. Can training be done with 16GB memory?

xingchensong (Member, Author) replied:

> Yes, 8 * V100, each card with 16GB memory. Can training be done with 16GB memory?

Even 8 * 2080 Ti works; please follow this setting: #2173 (comment)

srdfjy (Contributor) commented Dec 20, 2023

> Yes, 8 * V100, each card with 16GB memory. Can training be done with 16GB memory?
>
> Even 8 * 2080 Ti works; please follow this setting: #2173 (comment)

THX!
