Write w2v-BERT pretraining recipe #313

Open

cbalioglu opened this issue on Feb 7, 2024 · 2 comments
@cbalioglu (Contributor) commented:

As the second recipe after NLLB, write the w2v-BERT (and wav2vec2) pretraining recipe for users to check out. This will likely branch into several subtasks once we start working on it.
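For context, a minimal conceptual sketch of the objective such a recipe would implement: w2v-BERT combines a wav2vec2-style contrastive loss on intermediate encoder features with a masked-prediction loss over the quantizer's codeword IDs, both computed at masked positions. The snippet below is plain PyTorch for illustration only, not code from this repository; all names and tensor shapes (`w2v_bert_loss`, `mlm_weight`, the toy sizes) are hypothetical.

```python
import torch
import torch.nn.functional as F

def w2v_bert_loss(contrastive_logits, mlm_logits, mlm_targets, mlm_weight=1.0):
    # contrastive_logits: (num_masked, 1 + num_distractors) similarity scores,
    # with column 0 holding the true quantized latent (wav2vec2-style negatives),
    # so the contrastive loss is cross-entropy against index 0.
    positives = torch.zeros(
        contrastive_logits.size(0), dtype=torch.long,
        device=contrastive_logits.device,
    )
    contrastive = F.cross_entropy(contrastive_logits, positives)
    # mlm_logits: (num_masked, codebook_size) predictions of the quantizer
    # codeword ID at each masked position; mlm_targets: (num_masked,) true IDs.
    mlm = F.cross_entropy(mlm_logits, mlm_targets)
    return contrastive + mlm_weight * mlm

# Toy shapes: 8 masked frames, 100 distractors, a 320-entry codebook.
loss = w2v_bert_loss(
    torch.randn(8, 101), torch.randn(8, 320), torch.randint(0, 320, (8,))
)
```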

@cbalioglu cbalioglu self-assigned this Feb 7, 2024
@cbalioglu cbalioglu added this to the Q1'24 milestone Feb 7, 2024
@seastar105 commented on Mar 6, 2024:

@cbalioglu any progress on the pretraining or fine-tuning recipe for w2v-BERT?

@kdcyberdude commented:

Hi @cbalioglu, any update on this? We want to do continued pretraining of w2v-BERT on specific low-resource Indic languages using audio-only data. Any suggestions on how we should approach this?
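One possible route while the recipe is pending (an assumption on my part, not this project's recipe): load the public `facebook/w2v-bert-2.0` checkpoint through Hugging Face transformers, which exposes the encoder as `Wav2Vec2BertModel` in recent versions but ships no pretraining head, and attach a self-supervised loss like the sketch above for the audio-only continued-pretraining loop. A minimal sketch under those assumptions:

```python
import torch
from transformers import AutoFeatureExtractor, Wav2Vec2BertModel

extractor = AutoFeatureExtractor.from_pretrained("facebook/w2v-bert-2.0")
model = Wav2Vec2BertModel.from_pretrained("facebook/w2v-bert-2.0")
model.train()

# Stand-in for a batch of 16 kHz audio-only examples (e.g. Indic speech):
# two random waveforms of 4 s and 3 s.
batch = [torch.randn(16000 * 4).numpy(), torch.randn(16000 * 3).numpy()]
inputs = extractor(batch, sampling_rate=16000, return_tensors="pt", padding=True)

hidden = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
# transformers provides no pretraining head for this model, so a continued
# pretraining loop would compute a self-supervised loss (e.g. the
# contrastive + masked-prediction sketch above) on these features and
# backprop as usual.
```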
