In the FrameLevel forward you guys do:

```python
def forward(self, prev_samples, upper_tier_conditioning, hidden):
    (batch_size, _, _) = prev_samples.size()

    # (batch, seq_len, dim) -> (batch, dim, seq_len)
    input = prev_samples.permute(0, 2, 1)
    # (batch, dim, seq_len)
    # use conv1d instead of FC for speed
    input = self.input_expand(input)
    # (batch, dim, seq_len) -> (batch, seq_len, dim)
    input = input.permute(0, 2, 1)

    # add conditioning tier from previous frame
    if upper_tier_conditioning is not None:
        input += upper_tier_conditioning

    # reset hidden state for TBPTT
    reset = hidden is None
    if hidden is None:
        (n_rnn, _) = self.h0.size()
        hidden = self.h0.unsqueeze(1) \
                        .expand(n_rnn, batch_size, self.dim) \
                        .contiguous()

    (output, hidden) = self.rnn(input, hidden)

    # permute again so this can upsample for next context
    output = output.permute(0, 2, 1)
    output = self.upsampling(output)
    output = output.permute(0, 2, 1)

    return (output, hidden)
```
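For reference, the shape flow through this forward can be checked with a standalone sketch. The layer stand-ins and every dimension below (`frame_size`, `dim`, `n_rnn`, the upsampling ratio of 4) are made-up values for illustration, not the repo's actual hyperparameters:

```python
import torch
import torch.nn as nn

batch_size, seq_len, frame_size, dim, n_rnn = 4, 8, 16, 32, 2

# hypothetical stand-ins for the modules the forward method uses
input_expand = nn.Conv1d(frame_size, dim, kernel_size=1)
rnn = nn.GRU(dim, dim, num_layers=n_rnn, batch_first=True)
upsampling = nn.ConvTranspose1d(dim, dim, kernel_size=4, stride=4)
h0 = torch.zeros(n_rnn, dim)

prev_samples = torch.randn(batch_size, seq_len, frame_size)

x = prev_samples.permute(0, 2, 1)          # (batch, frame_size, seq_len)
x = input_expand(x)                        # (batch, dim, seq_len)
x = x.permute(0, 2, 1)                     # (batch, seq_len, dim)

# learned initial hidden state, expanded across the batch
hidden = h0.unsqueeze(1).expand(n_rnn, batch_size, dim).contiguous()
output, hidden = rnn(x, hidden)            # (batch, seq_len, dim)

output = output.permute(0, 2, 1)           # (batch, dim, seq_len)
output = upsampling(output)                # (batch, dim, seq_len * 4)
output = output.permute(0, 2, 1)           # (batch, seq_len * 4, dim)
print(output.shape)                        # torch.Size([4, 32, 32])
```

Running this confirms that every permute exists only to toggle between the `(batch, seq_len, dim)` layout the RNN wants and the `(batch, channels, seq_len)` layout the conv layers want.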
Are the comments I added correct?

I'd like to use a Linear layer instead of the Conv1d first, purely for understanding purposes. However, the dimensions don't line up when I do it that way. Any thoughts on how to reframe this in terms of a Linear layer?

I assume the transposes you do are so that the convolutions work out? Is that standard practice when using Conv1d instead of a Linear layer?
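For what it's worth, a `Conv1d` with `kernel_size=1` is pointwise over the sequence, so it computes exactly the same map as a `Linear` applied to the last dimension. The permutes are only there because `Conv1d` expects `(batch, channels, seq_len)` while `Linear` operates on `(batch, seq_len, features)`. A sketch demonstrating the equivalence (layer sizes here are made up):

```python
import torch
import torch.nn as nn

batch, seq_len, in_dim, out_dim = 4, 8, 16, 32

conv = nn.Conv1d(in_dim, out_dim, kernel_size=1)
linear = nn.Linear(in_dim, out_dim)

# share weights so the two layers compute the same function
with torch.no_grad():
    # conv weight is (out_dim, in_dim, 1); drop the kernel axis
    linear.weight.copy_(conv.weight.squeeze(-1))
    linear.bias.copy_(conv.bias)

x = torch.randn(batch, seq_len, in_dim)

# Conv1d path: needs (batch, channels, seq_len), hence the permutes
y_conv = conv(x.permute(0, 2, 1)).permute(0, 2, 1)

# Linear path: works directly on (batch, seq_len, in_dim), no permutes
y_linear = linear(x)

print(torch.allclose(y_conv, y_linear, atol=1e-5))
```

So to use a `Linear` instead, you would drop both permutes around `input_expand` and apply it to the `(batch, seq_len, dim)` tensor directly; the permutes around the upsampling conv would still be needed.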