
Add GPT2Decoder #228

Merged
merged 3 commits into asyml:master on Nov 6, 2019
Conversation

gpengzhi
Collaborator

No description provided.

Member

@ZhitingHu ZhitingHu left a comment


The remaining work after GPT2Decoder is to refactor all decoder modules to share the same interface as Texar-PyTorch, i.e., accepting token_embedder and token_pos_embedder arguments in the constructor.

This can be done after the Forte example.
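The proposed constructor interface can be sketched roughly as follows. This is a minimal illustration modeled on the Texar-PyTorch decoder interface; the class name, method name, and dispatch logic here are assumptions for illustration, not code from this PR:

```python
from typing import Callable, Optional

class DecoderBase:
    """Minimal stand-in showing the proposed interface: the decoder
    receives embedding functions instead of constructing them itself."""

    def __init__(self,
                 token_embedder: Optional[Callable] = None,
                 token_pos_embedder: Optional[Callable] = None):
        self._token_embedder = token_embedder
        self._token_pos_embedder = token_pos_embedder

    def embed_tokens(self, tokens, positions):
        # Prefer the position-aware embedder when it is provided.
        if self._token_pos_embedder is not None:
            return self._token_pos_embedder(tokens, positions)
        if self._token_embedder is not None:
            return self._token_embedder(tokens)
        raise ValueError("No token embedder was specified.")

# Usage: a GPT-2-style decoder would pass in word + position embeddings.
decoder = DecoderBase(
    token_pos_embedder=lambda toks, pos: [t + p for t, p in zip(toks, pos)])
print(decoder.embed_tokens([1, 2, 3], [10, 20, 30]))  # [11, 22, 33]
```

Passing the embedders in lets every decoder module expose one uniform constructor regardless of how its embeddings are built.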

texar/tf/modules/pretrained/gpt2.py
texar/tf/modules/decoders/gpt2_decoder.py


class GPT2Decoder(PretrainedGPT2Mixin):
Member


We probably have discussed this before. GPT2Decoder should inherit TFDecoder.
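Structurally, the suggestion amounts to something like the following hedged sketch. "TFDecoder" is the name the reviewer uses for the shared decoder interface; the stub bodies here are placeholders for illustration, not texar-tf code:

```python
class TFDecoder:
    """Stand-in for the common decoder interface the reviewer refers to."""
    def _build(self, *args, **kwargs):
        raise NotImplementedError

class PretrainedGPT2Mixin:
    """Stand-in for the pretrained-weights mixin added in this PR."""

# Suggested change: inherit the decoder interface as well as the mixin,
# so GPT2Decoder can be used wherever a generic decoder is expected.
class GPT2Decoder(PretrainedGPT2Mixin, TFDecoder):
    def _build(self, *args, **kwargs):
        return "decoded"

assert isinstance(GPT2Decoder(), TFDecoder)
```

Inheriting the shared interface keeps GPT2Decoder interchangeable with the other decoder modules rather than a one-off class.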

@gpengzhi gpengzhi merged commit c8f452d into asyml:master Nov 6, 2019
@gpengzhi gpengzhi deleted the gpt2decoder branch February 12, 2020 22:11
2 participants