Question about the usage of helper in TransformerDecoder #228
Thank you for your interest in Texar-PyTorch! The forward function for every step is implemented in the decoder itself; you can take a look at `self._inputs_to_outputs`.
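As a rough illustration of what an `_inputs_to_outputs`-style method does, here is a toy sketch (not Texar's actual implementation): take the current step's input, extend a cache of earlier states, and return output logits for that single step. All names and the scoring logic below are made up for illustration.

```python
def inputs_to_outputs(inputs, cache, vocab_size=5):
    """Toy per-step forward: a Transformer's self-attention would read
    all cached earlier steps; here we just average them."""
    cache.append(inputs)                    # remember this step's state
    score = sum(cache) / len(cache)         # stand-in for attention over cache
    logits = [score + tok for tok in range(vocab_size)]
    return logits, cache
```

The key point is that the method computes outputs for one step only, given the inputs for that step plus the cached history.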
In fact, I have no idea how the initialize/step functions are called or how they work, and the step function requires a helper argument. What should I do to build such a helper class? Is there a corresponding example for this? Thanks!
Sorry, but I'm a bit confused. Is your goal to write a new helper from scratch?
@huzecong Specifically, I want to modify an existing helper (e.g. `TrainingHelper`) rather than write one from scratch, so I'm looking for an example that would help me understand how helpers work before modifying one.
You don't need to do that yourself. The general flow of execution is: the decoder's forward calls the helper's initialize method to get the first inputs, then repeatedly runs one decoding step, asks the helper to sample from that step's outputs, and asks the helper for the next step's inputs, until the helper signals that decoding is finished.
This may not apply to every decoder-helper pair, but it is a general description of how things work.
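The flow described above can be sketched as a plain-Python toy (the names follow the TF-seq2seq-style protocol that Texar's helpers are modeled on; the real signatures differ in detail, and `ToyTrainingHelper`, `toy_step`, and `dynamic_decode` are all hypothetical stand-ins):

```python
class ToyTrainingHelper:
    """Teacher forcing: feed the ground-truth token at every step,
    roughly what a TrainingHelper does (toy stand-in, not Texar's class)."""
    def __init__(self, targets):
        self.targets = targets

    def initialize(self):
        # -> (finished, first inputs)
        return False, self.targets[0]

    def sample(self, time, outputs):
        # greedy sampling: argmax over this step's logits
        return max(range(len(outputs)), key=outputs.__getitem__)

    def next_inputs(self, time, outputs, sample_ids):
        finished = time + 1 >= len(self.targets)
        nxt = None if finished else self.targets[time + 1]
        return finished, nxt


def toy_step(inputs, state, vocab_size=5):
    """Stand-in for one decoder forward step: emit logits peaked at the
    input token, just so the loop has something deterministic to do."""
    logits = [1.0 if tok == inputs else 0.0 for tok in range(vocab_size)]
    return logits, state


def dynamic_decode(step_fn, helper, max_steps=50):
    finished, inputs = helper.initialize()
    state, outputs, time = None, [], 0
    while not finished and time < max_steps:
        cell_outputs, state = step_fn(inputs, state)   # 1. forward for this step
        sample_ids = helper.sample(time, cell_outputs) # 2. helper picks a token
        finished, inputs = helper.next_inputs(         # 3. helper supplies the
            time, cell_outputs, sample_ids)            #    next step's inputs
        outputs.append(sample_ids)
        time += 1
    return outputs
```

Note that the decoding loop, not the helper, drives the forward computation; the helper only decides what the next inputs are (here, the ground truth) and how to sample.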
Thanks for your detailed instructions!
Hi~
I want to implement step-by-step decoding with `TransformerDecoder` and a `TrainingHelper()`, but I don't know how to call the same per-step forward function as in an RNN, e.g.

```python
outputs, hidden = self.gru(embedded, hidden)  # forward for every step
```

Is this done in the `step` method of the helper class? Hoping for your help!
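On the last question above: in the common design the helper's methods do not invoke the decoder's forward; the decoding loop does, once per step, just like calling `self.gru(embedded, hidden)` inside a loop. A minimal plain-Python sketch of that loop shape, where the toy state update stands in for the GRU call:

```python
def gru_like_step(embedded, hidden):
    # Toy stand-in for `outputs, hidden = self.gru(embedded, hidden)`:
    # combine the current input with the carried state.
    return [h + x for h, x in zip(hidden, embedded)]

hidden = [0.0, 0.0]
for embedded in ([1.0, 2.0], [3.0, 4.0], [5.0, 6.0]):
    hidden = gru_like_step(embedded, hidden)  # forward for every step
```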