This repository has been archived by the owner on Jan 15, 2024. It is now read-only.

Commit

[BUGFIX] [DOC] Update nlp.model.get_model documentation and get_model API (#734)

* Improve gluonnlp.model docs

* Fix nlp.model.get_model API

* Update model.rst
leezu authored and eric-haibin-lin committed Jun 3, 2019
1 parent 698b627 commit 5c9fdd1
Showing 2 changed files with 53 additions and 10 deletions.
58 changes: 51 additions & 7 deletions docs/api/modules/model.rst
@@ -6,24 +6,45 @@ all requested pre-trained weights are downloaded from public repo and stored in

.. currentmodule:: gluonnlp.model

Model Registry
--------------

The model registry provides an easy interface to obtain pre-defined and pre-trained models.

.. autosummary::
:nosignatures:

get_model

The `get_model` function returns a pre-defined model given the name of a
registered model. The following sections of this page present a list of
registered names for each model category.
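As a rough illustration of the registry pattern this paragraph describes, here is a minimal self-contained sketch. The names (`_registry`, `register`, `tiny_lm`) are hypothetical and this is not the actual gluonnlp implementation; only the error message mirrors the one in the diff below.

```python
# Minimal sketch of a name-based model registry (hypothetical names,
# not the real gluonnlp code).
_registry = {}

def register(name):
    """Decorator that records a model-constructing function under `name`."""
    def deco(fn):
        _registry[name] = fn
        return fn
    return deco

@register('tiny_lm')
def tiny_lm(**kwargs):
    # Stand-in constructor; a real entry would build a Gluon block and
    # optionally load pre-trained weights.
    return {'model': 'tiny_lm', **kwargs}

def get_model(name, **kwargs):
    """Look up a registered model by name and forward kwargs to it."""
    if name not in _registry:
        raise ValueError(
            'Model %s is not supported. Available options are\n\t%s' % (
                name, '\n\t'.join(sorted(_registry))))
    return _registry[name](**kwargs)
```

With this sketch, `get_model('tiny_lm', pretrained=False)` dispatches to the registered constructor, while an unknown name raises a `ValueError` listing the registered options.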

Language Modeling
-----------------

Components

.. autosummary::
:nosignatures:

AWDRNN
BiLMEncoder
LSTMPCellWithClip
StandardRNN
BigRNN

Pre-defined models

.. autosummary::
:nosignatures:

awd_lstm_lm_1150
awd_lstm_lm_600
standard_lstm_lm_200
standard_lstm_lm_650
standard_lstm_lm_1500
big_rnn_lm_2048_512

Machine Translation
-------------------
@@ -35,11 +56,17 @@ Machine Translation
TransformerEncoder
TransformerEncoderCell
PositionwiseFFN

.. autosummary::
:nosignatures:

transformer_en_de_512

Bidirectional Encoder Representations from Transformers
-------------------------------------------------------

Components

.. autosummary::
:nosignatures:

@@ -48,26 +75,43 @@
BERTEncoder
BERTEncoderCell
BERTPositionwiseFFN

Pre-defined models

.. autosummary::
:nosignatures:

bert_12_768_12
bert_24_1024_16

Convolutional Encoder
----------------------
---------------------

.. autosummary::
:nosignatures:

ConvolutionalEncoder

ELMo
----------------------
----

Components

.. autosummary::
:nosignatures:

ELMoBiLM
ELMoCharacterEncoder

Pre-defined models

.. autosummary::
:nosignatures:

elmo_2x1024_128_2048cnn_1xhighway
elmo_2x2048_256_2048cnn_1xhighway
elmo_2x4096_512_2048cnn_2xhighway

Highway Network
-----------------

5 changes: 2 additions & 3 deletions src/gluonnlp/model/__init__.py
@@ -96,14 +96,14 @@
seq2seq_encoder_decoder.__all__ + transformer.__all__ + bert.__all__


def get_model(name, dataset_name='wikitext-2', **kwargs):
def get_model(name, **kwargs):
    """Returns a pre-defined model by name.

    Parameters
    ----------
    name : str
        Name of the model.
    dataset_name : str or None, default 'wikitext-2'.
    dataset_name : str or None, default None
        The dataset name on which the pre-trained model is trained.
        For language model, options are 'wikitext-2'.
        For ELMo, options are 'gbw' and '5bw'.
@@ -147,5 +147,4 @@ def get_model(name, dataset_name='wikitext-2', **kwargs):
        raise ValueError(
            'Model %s is not supported. Available options are\n\t%s'%(
                name, '\n\t'.join(sorted(models.keys()))))
    kwargs['dataset_name'] = dataset_name
    return models[name](**kwargs)
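The dropped `dataset_name='wikitext-2'` default matters because the old code injected that kwarg into every constructor call. The following self-contained sketch (stand-in constructors, not gluonnlp code) illustrates the likely motivation: constructors that accept no `dataset_name` would fail under the old behavior.

```python
# Sketch of the before/after behavior of get_model (hypothetical
# stand-in constructors, not the real gluonnlp models).

def bert_like(**kwargs):
    # Pretend constructor that takes no dataset_name argument.
    if 'dataset_name' in kwargs:
        raise TypeError("unexpected keyword argument: 'dataset_name'")
    return 'bert_model'

def lm_like(dataset_name=None, **kwargs):
    return 'lm_model trained on %s' % dataset_name

models = {'bert_12_768_12': bert_like, 'standard_lstm_lm_200': lm_like}

def get_model_old(name, dataset_name='wikitext-2', **kwargs):
    kwargs['dataset_name'] = dataset_name  # old: always injected
    return models[name](**kwargs)

def get_model_new(name, **kwargs):
    return models[name](**kwargs)          # new: plain pass-through
```

Under the old API, `get_model_old('bert_12_768_12')` raises `TypeError` because the forced `dataset_name` kwarg reaches a constructor that does not accept it; under the new API, `get_model_new('bert_12_768_12')` succeeds, and language models still work when the caller passes `dataset_name` explicitly.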
