
Dimensionality change #14

Closed
jkamalu opened this issue May 29, 2017 · 1 comment

Comments

jkamalu commented May 29, 2017

In the Tacotron paper, the tensor throughout most of the encoder CBHG module is expected to have dimensionality batch size x time steps x num features, where num features = 128. Why do you shift to 256 before the bidirectional GRU layer? It seems like this would result in a real loss of information during encoding. Is this somehow what is described in the paper and I am just missing it? Thanks
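For reference, a minimal shape sketch of what the paper describes (this is an illustration with NumPy placeholders, not the repo's actual code; the dimension names and sizes follow the paper's encoder CBHG, assuming a bidirectional GRU with 128 units per direction):

```python
import numpy as np

# Per the Tacotron paper, the tensor stays (batch, time, 128) through the
# encoder CBHG, and 256 only appears *after* the bidirectional GRU, because
# the forward and backward outputs (128 each) are concatenated.
batch, time, features = 32, 100, 128

x = np.zeros((batch, time, features))        # CBHG output entering the GRU

forward_out = np.zeros((batch, time, 128))   # forward GRU, 128 units
backward_out = np.zeros((batch, time, 128))  # backward GRU, 128 units

# Concatenation along the feature axis gives the 256-dim encoder output.
encoder_out = np.concatenate([forward_out, backward_out], axis=-1)
print(encoder_out.shape)  # (32, 100, 256)
```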

@Spotlight0xff (Contributor)

Yes, I noticed that as well.
I believe I fixed it with my PR.
Looking at the code again, it seems fine to me now.

@jkamalu jkamalu closed this as completed Jul 15, 2019