
Commit

update docs; CHANGELOG
ZhitingHu committed Apr 9, 2019
1 parent 1ff01fe commit 251e8cc
Showing 3 changed files with 72 additions and 5 deletions.
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,11 @@

### Feature improvements

* Refactor `TransformerEncoder` and `TransformerDecoder` to separate position embeddings from the modules. ([#126](https://github.com/asyml/texar/pull/126))
* Allow passing a Tensor to `output_layer` of decoders' constructors -- used for weight tying between the output layer and the input embedding matrix. ([#126](https://github.com/asyml/texar/pull/126))
* Make the `TransformerDecoder` constructor interface identical to that of the RNN decoders. ([#126](https://github.com/asyml/texar/pull/126))
* Refactor decoder `Helper`s to allow two-argument `embedding_fn` (for position embedding). ([#126](https://github.com/asyml/texar/pull/126))
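The two-argument `embedding_fn` mentioned in the last bullet can be pictured framework-free. The sketch below is purely illustrative (the toy matrices and names are invented for this example; Texar's actual helpers operate on TF tensors):

```python
# Toy sketch of a two-argument embedding_fn(ids, times) of the kind the
# refactored Helpers accept: a token embedding plus a position embedding.
# TOKEN_EMB / POS_EMB are illustrative stand-ins for learned parameters.

TOKEN_EMB = {0: [0.1, 0.2], 1: [0.3, 0.4], 2: [0.5, 0.6]}  # token id -> vector
POS_EMB = {0: [0.0, 0.1], 1: [0.1, 0.0], 2: [0.1, 0.1]}    # time step -> vector

def embedding_fn(ids, times):
    """Embed token `ids` at decoding steps `times` (elementwise sum)."""
    return [
        [t + p for t, p in zip(TOKEN_EMB[i], POS_EMB[s])]
        for i, s in zip(ids, times)
    ]

vec = embedding_fn([1], [0])[0]  # token 1 at step 0 -> [0.3, 0.5]
```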

### Fixes

## [v0.2.0](https://github.com/asyml/texar/releases/tag/v0.2.0) (2019-04-09)
40 changes: 40 additions & 0 deletions docs/code/modules.rst
@@ -129,6 +129,11 @@ Decoders
.. autoclass:: texar.modules.TransformerDecoderOutput
:members:

:hidden:`Helper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.Helper
:members:

:hidden:`TopKSampleEmbeddingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.TopKSampleEmbeddingHelper
@@ -144,6 +149,41 @@ Decoders
.. autoclass:: texar.modules.GumbelSoftmaxEmbeddingHelper
:members:

:hidden:`TrainingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.TrainingHelper
:members:

:hidden:`ScheduledEmbeddingTrainingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.ScheduledEmbeddingTrainingHelper
:members:

:hidden:`ScheduledOutputTrainingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.ScheduledOutputTrainingHelper
:members:

:hidden:`GreedyEmbeddingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.GreedyEmbeddingHelper
:members:

:hidden:`SampleEmbeddingHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.SampleEmbeddingHelper
:members:

:hidden:`InferenceHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.InferenceHelper
:members:

:hidden:`CustomHelper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autoclass:: texar.modules.CustomHelper
:members:

:hidden:`get_helper`
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
.. autofunction:: texar.modules.get_helper
32 changes: 27 additions & 5 deletions texar/modules/decoders/tf_helpers.py
@@ -12,7 +12,9 @@
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""A library of helpers for use with SamplingDecoders.
# Modifications copyright (C) 2019 Texar
# ==============================================================================
"""A library of helpers for use with Texar RNN/Transformer decoders.
Adapted from the `tensorflow.contrib.seq2seq` package.
"""
@@ -67,9 +69,11 @@ def _unstack_ta(inp):

@six.add_metaclass(abc.ABCMeta)
class Helper(object):
"""Interface for implementing sampling in seq2seq decoders.
"""Interface for implementing different decoding strategies in
:class:`RNN decoders <texar.modules.RNNDecoderBase>` and
:class:`Transformer decoder <texar.modules.TransformerDecoder>`.
Helper instances are used by `BasicDecoder`.
Adapted from the `tensorflow.contrib.seq2seq` package.
"""

@abc.abstractproperty
@@ -113,7 +117,7 @@ def next_inputs(self, time, outputs, state, sample_ids, name=None):


class CustomHelper(Helper):
"""Base abstract class that allows the user to customize sampling."""
"""Base abstract class that allows the user to customize decoding."""

def __init__(self, initialize_fn, sample_fn, next_inputs_fn,
sample_ids_shape=None, sample_ids_dtype=None):
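The three callbacks a `CustomHelper` takes can be pictured with a framework-free toy decode loop. Everything below is invented for illustration (a stand-in "cell", integer "tensors"); the real callbacks produce and consume TF tensors:

```python
# Toy decode loop driven by the three CustomHelper-style callbacks:
#   initialize_fn  -> (finished, first_inputs)
#   sample_fn      -> sample ids from the cell outputs
#   next_inputs_fn -> (finished, next_inputs, next_state)

def initialize_fn():
    return False, 0                      # not finished; first input token

def sample_fn(time, outputs, state):
    return outputs % 5                   # toy "sampling": fold into 0..4

def next_inputs_fn(time, outputs, state, sample_ids):
    finished = time >= 3                 # stop after four steps
    return finished, sample_ids, state + 1

def decode(max_steps=10):
    finished, inputs = initialize_fn()
    state, time, ids = 0, 0, []
    while not finished and time < max_steps:
        outputs = inputs * 2 + 1         # stand-in for the RNN cell
        sample = sample_fn(time, outputs, state)
        ids.append(sample)
        finished, inputs, state = next_inputs_fn(time, outputs, state, sample)
        time += 1
    return ids

ids = decode()                           # ids emitted until finished flips
```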
@@ -172,9 +176,15 @@ def next_inputs(self, time, outputs, state, sample_ids, name=None):


class TrainingHelper(Helper):
"""A helper for use during training. Only reads inputs.
"""A helper for use during training. Performs teacher-forcing decoding.
Returned sample_ids are the argmax of the RNN output logits.
Note that for teacher-forcing decoding, Texar's decoders provide a simpler
interface by specifying `decoding_strategy='train_greedy'` when calling a
decoder (see, e.g.,
:meth:`RNN decoder <texar.modules.RNNDecoderBase._build>`). In this case,
use of TrainingHelper is not necessary.
"""

def __init__(self, inputs, sequence_length, time_major=False, name=None):
@@ -522,6 +532,12 @@ class GreedyEmbeddingHelper(Helper):
Uses the argmax of the output (treated as logits) and passes the
result through an embedding layer to get the next input.
Note that for greedy decoding, Texar's decoders provide a simpler
interface by specifying `decoding_strategy='infer_greedy'` when calling a
decoder (see, e.g.,
:meth:`RNN decoder <texar.modules.RNNDecoderBase._build>`). In this case,
use of GreedyEmbeddingHelper is not necessary.
"""

def __init__(self, embedding, start_tokens, end_token):
@@ -627,6 +643,12 @@ class SampleEmbeddingHelper(GreedyEmbeddingHelper):
Uses sampling (from a distribution) instead of argmax and passes the
result through an embedding layer to get the next input.
Note that for sample decoding, Texar's decoders provide a simpler
interface by specifying `decoding_strategy='infer_sample'` when calling a
decoder (see, e.g.,
:meth:`RNN decoder <texar.modules.RNNDecoderBase._build>`). In this case,
use of SampleEmbeddingHelper is not necessary.
"""

def __init__(self, embedding, start_tokens, end_token,
