
Question about multihead_proj_global #6

Open
liuyueChang opened this issue Dec 16, 2023 · 0 comments

Comments


liuyueChang commented Dec 16, 2023

Thanks for your outstanding work!

  1. In this line:

global_embed = self.multihead_proj_global(global_embed).view(-1, self.num_modes, self.hidden_size)  # [N, F, D]

What is the purpose of this projection, and why is it designed this way?
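
Here is a minimal sketch of how I currently read it (assuming multihead_proj_global is roughly an nn.Linear mapping the D-dim global embedding to F*D dims, one slice per prediction mode; the sizes below are made up):

import torch
import torch.nn as nn

N, hidden_size, num_modes = 8, 64, 6  # hypothetical sizes

# Assumption: project the global embedding from D to F*D, then split it into F mode-specific embeddings.
multihead_proj_global = nn.Linear(hidden_size, num_modes * hidden_size)

global_embed = torch.randn(N, hidden_size)                    # [N, D]
global_embed = multihead_proj_global(global_embed)            # [N, F*D]
global_embed = global_embed.view(-1, num_modes, hidden_size)  # [N, F, D]
print(global_embed.shape)  # torch.Size([8, 6, 64])

Is this roughly what the layer is doing, i.e. giving each of the F modes its own copy of the global context?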

  2. Here:

mdn_out = self.Laplacian_Decoder.forward(self.x_encoded_dense, self.hidden_state_global, cn_global, epoch)

the arguments passed to this function are the time tensor x_encoded_dense and the spatial tensor hidden_state_global,

but in

def forward(self, x_encode: torch.Tensor, hidden_state, cn) -> Tuple[torch.Tensor, torch.Tensor]:

the time tensor's name changes to global_embed and the spatial tensor's name changes to local_embed. Is this correct?
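
For reference, this is how I read the call (a toy stand-in, not the real class, just to show what I mean by the argument/name mapping):

# Positional arguments bind by position, so the parameter names inside forward()
# do not need to match the caller's variable names.
class ToyDecoder:
    def forward(self, x_encode, hidden_state, cn):
        # x_encode     <- x_encoded_dense      (the time tensor)
        # hidden_state <- hidden_state_global  (the spatial tensor)
        # cn           <- cn_global
        return x_encode, hidden_state, cn

out = ToyDecoder().forward("x_encoded_dense", "hidden_state_global", "cn_global")
print(out)  # ('x_encoded_dense', 'hidden_state_global', 'cn_global')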

Thank you!
