
Seq2seq #989
Merged: 13 commits into master, Jun 2, 2019

Conversation

ArnoldLIULJ (Member):

ADD Stacked Layer Seq2Seq

decoder_seq_length: int
    The length of your target sequence
cell_enc : str, tf.function
    The RNN function cell for your encoder stack, i.e. tf.keras.layers.GRUCell
Member:

'i.e.' means 'in other words'. Please use 'e.g.' (which means 'for example') here and in other places accordingly.

n_layer : int
    The number of your RNN layers for both encoder and decoder block
embbedding_layer : tl.function
    The embedding layer function, i.e. tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)
Member:

Typo here, and please change this comment to:

    embedding_layer : tl.Layer
        An embedding layer, e.g. tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim)

is_train : bool
Member:

is_train is never used in __init__, and it is not going to be supported when a model is constructed in future versions. Please remove it.

n_layer=3,
embedding_layer=None,
is_train=True,
name="seq2seq_"
Member:

name=None by default. There is an auto-naming mechanism in tl.models.Model().
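
For context, a minimal construction sketch based on the parameters discussed in this review; cell_dec and the vocabulary/embedding sizes are illustrative assumptions not shown in the excerpt:

    import tensorflow as tf
    import tensorlayer as tl

    voc_size, emb_dim = 8000, 1024  # placeholder sizes for illustration

    model = tl.models.Seq2seq(
        decoder_seq_length=20,             # length of the target sequence
        cell_enc=tf.keras.layers.GRUCell,  # e.g. a GRU cell for the encoder stack
        cell_dec=tf.keras.layers.GRUCell,  # assumed decoder counterpart
        n_layer=3,                         # RNN layers in both encoder and decoder
        embedding_layer=tl.layers.Embedding(vocabulary_size=voc_size, embedding_size=emb_dim),
    )                                      # name omitted so auto-naming applies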

self.reshape_layer_individual_sequence = tl.layers.Reshape([-1, 1, self.vocabulary_size])

def inference(self, encoding, seq_length, start_token, top_n):

Member:

Some documentation is needed here.
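
A sketch of the kind of documentation being requested; the parameter meanings below are inferred from the signature and the sampling discussion that follows, not confirmed by the PR:

    def inference(self, encoding, seq_length, start_token, top_n):
        """Decode a target sequence step by step from the encoded source.

        Parameters
        ----------
        encoding
            Source-sequence token ids fed through the encoder.
        seq_length : int
            Number of decoding steps to run.
        start_token : int
            Token id that begins every decoded sequence.
        top_n : int
            Sample each output token from the top_n most probable candidates.
        """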

Member:

Did you use beam search? Or just random sampling?

ArnoldLIULJ (Member, Author):

Random sampling.
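
A minimal sketch of top-n random sampling at each decode step (not the PR's exact code; the helper name is hypothetical): keep the top_n highest-scoring tokens, renormalize their probabilities, and draw one.

    import numpy as np

    def sample_top_n(logits, top_n):
        """Draw one token id from the top_n highest logits (hypothetical helper)."""
        top_ids = np.argsort(logits)[-top_n:]                    # indices of the top_n scores
        probs = np.exp(logits[top_ids] - logits[top_ids].max())  # stable exponentiation
        probs /= probs.sum()                                     # softmax restricted to the top_n
        return int(np.random.choice(top_ids, p=probs))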

JingqingZ (Member) left a review:

Please resolve the problems.

zsdonghao merged commit 0fe70a7 into master on Jun 2, 2019.
zsdonghao deleted the seq2seq branch on Jun 2, 2019 at 04:37.