
How do we use the pretrained attention aligner? #29

Open
vishhvak opened this issue Nov 2, 2022 · 1 comment

vishhvak commented Nov 2, 2022

Hi, I find that getting a pretrained predictive aligner (aligner='charsiu/en_w2v2_fc_10ms') to work with LibriSpeech is straightforward. However, I'm unable to get the attention aligner working. How do I initialize that aligner, and how do I get the corresponding BERT config to go with it? It keeps throwing an error.

vishhvak changed the title from "How do we use the attention aligner?" to "How do we use the pretrained attention aligner?" on Nov 2, 2022
lingjzhu (Owner) commented Nov 3, 2022

What type of error did you get? Did you try our colab tutorial? There are some example scripts for loading the model.
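For later readers, a minimal sketch of the loading pattern described above. The class name charsiu_attention_aligner and the checkpoint id 'charsiu/en_w2v2_fs_10ms' are assumptions based on the repository's example scripts (the predictive aligner in the question uses 'charsiu/en_w2v2_fc_10ms'); verify both against the Colab tutorial.

```python
# Hedged sketch, not a verified recipe: assumes the lingjzhu/charsiu repo has
# been cloned and its src/ directory added to sys.path (Charsiu is not
# pip-installable), with torch and transformers installed.
loaded = False
try:
    from Charsiu import charsiu_attention_aligner  # assumed class name

    # Assumed checkpoint id for the attention-based aligner; the BERT-side
    # config is expected to come bundled with the pretrained checkpoint.
    aligner = charsiu_attention_aligner(aligner='charsiu/en_w2v2_fs_10ms')
    loaded = True
except ImportError:
    # Fall through cleanly when the repo or its dependencies are absent.
    pass

print('attention aligner loaded:', loaded)
```

If the import succeeds but construction still errors, the traceback plus the exact checkpoint id is what the maintainer is asking for here.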
