This repository has been archived by the owner on May 28, 2019. It is now read-only.
When you give input to the model, do you give one npz file to the model at a time?
It seems:
train() is called once per epoch
"for full_txt, full_feat, spkr in train_enum" iterates once per batch
"for txt, feat, spkr, start in batch_iter:" iterates once per npz file
==> model.forward() is called here, then the loss is summed up over the batch.
Then what is the point of having a batch at all when the model is not called batch-wise?
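The control flow described above can be sketched in plain Python (a minimal sketch; model_forward and the loss here are hypothetical placeholders, not the repo's actual code). Note that summing per-sample losses and running a single backward pass on the sum yields the same gradients as one batched forward over those samples, so the batch still defines the unit of one optimizer step even though forward() runs per sample:

```python
def model_forward(txt, feat):
    # Hypothetical stand-in for model.forward(): returns a scalar loss
    # for a single sample (one npz file).
    return abs(len(txt) - feat)

def train_epoch(train_enum):
    """One epoch: outer loop over batches, inner loop over samples (npz files)."""
    epoch_loss = 0.0
    for batch_iter in train_enum:        # one iteration per batch
        batch_loss = 0.0
        for txt, feat in batch_iter:     # one iteration per npz file
            batch_loss += model_forward(txt, feat)  # per-sample forward
        # In the real code, a single backward/optimizer step would run here
        # on the summed batch_loss, so updates remain batch-wise.
        epoch_loss += batch_loss
    return epoch_loss
```

For example, with two batches of sizes 2 and 1, train_epoch runs three per-sample forwards but would trigger only two optimizer steps.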