Fixing typo in function name and updating README
jacobdevlin-google committed Nov 5, 2018
1 parent 2f82f21 commit e13c1f3
Showing 6 changed files with 27 additions and 16 deletions.
24 changes: 19 additions & 5 deletions README.md
@@ -1,5 +1,15 @@
# BERT

+**\*\*\*\*\* New November 5th, 2018: Third-party PyTorch version of BERT
+available \*\*\*\*\***
+
+NLP researchers from HuggingFace made a
+[PyTorch version of BERT available](https://github.com/huggingface/pytorch-pretrained-BERT)
+which is compatible with our pre-trained checkpoints and is able to reproduce
+our results. (Thanks!) We were not involved in the creation or maintenance of
+the PyTorch implementation, so please direct any questions towards the authors
+of that repository.
+
**\*\*\*\*\* New November 3rd, 2018: Multilingual and Chinese models available
\*\*\*\*\***

@@ -63,8 +73,8 @@ minutes.

## What is BERT?

-BERT is method of pre-training language representations, meaning that we train a
-general-purpose "language understanding" model on a large text corpus (like
+BERT is a method of pre-training language representations, meaning that we train
+a general-purpose "language understanding" model on a large text corpus (like
Wikipedia), and then use that model for downstream NLP tasks that we care about
(like question answering). BERT outperforms previous methods because it is the
first *unsupervised*, *deeply bidirectional* system for pre-training NLP.
@@ -778,9 +788,13 @@ information.

#### Is there a PyTorch version available?

-There is no official PyTorch implementation. If someone creates a line-for-line
-PyTorch reimplementation so that our pre-trained checkpoints can be directly
-converted, we would be happy to link to that PyTorch version here.
+There is no official PyTorch implementation. However, NLP researchers from
+HuggingFace made a
+[PyTorch version of BERT available](https://github.com/huggingface/pytorch-pretrained-BERT)
+which is compatible with our pre-trained checkpoints and is able to reproduce
+our results. We were not involved in the creation or maintenance of the PyTorch
+implementation, so please direct any questions towards the authors of that
+repository.

#### Will models in other languages be released?

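For context on the FAQ answer above: a minimal sketch of loading that HuggingFace port, assuming the `pytorch_pretrained_bert` package name and the 2018-era API described in the linked repository's README (this is not part of the commit):

```python
# Sketch only -- assumes `pip install pytorch-pretrained-bert` and the
# 2018-era API of https://github.com/huggingface/pytorch-pretrained-BERT.
import torch
from pytorch_pretrained_bert import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

tokens = ["[CLS]"] + tokenizer.tokenize("Hello, BERT!") + ["[SEP]"]
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

with torch.no_grad():
    # By default the model returns the hidden states of every layer
    # plus the pooled [CLS] output.
    encoded_layers, pooled_output = model(input_ids)
print(len(encoded_layers), pooled_output.shape)
```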
2 changes: 1 addition & 1 deletion extract_features.py
@@ -170,7 +170,7 @@ def model_fn(features, labels, mode, params): # pylint: disable=unused-argument

tvars = tf.trainable_variables()
scaffold_fn = None
-(assignment_map, _) = modeling.get_assigment_map_from_checkpoint(
+(assignment_map, _) = modeling.get_assignment_map_from_checkpoint(
tvars, init_checkpoint)
if use_tpu:

2 changes: 1 addition & 1 deletion modeling.py
@@ -315,7 +315,7 @@ def get_activation(activation_string):
raise ValueError("Unsupported activation: %s" % act)


-def get_assigment_map_from_checkpoint(tvars, init_checkpoint):
+def get_assignment_map_from_checkpoint(tvars, init_checkpoint):
"""Compute the union of the current variables and checkpoint variables."""
assignment_map = {}
initialized_variable_names = {}
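The hunk above only shows the renamed signature and the first lines of the body. As a reference, a minimal TF 1.x sketch of what a helper matching that docstring typically does, reconstructed from the visible lines rather than from the full file:

```python
import collections
import re

import tensorflow as tf  # TF 1.x


def get_assignment_map_from_checkpoint(tvars, init_checkpoint):
  """Compute the union of the current variables and checkpoint variables."""
  # Strip the ":0" output-index suffix so graph names match checkpoint names.
  name_to_variable = collections.OrderedDict()
  for var in tvars:
    name = var.name
    m = re.match("^(.*):\\d+$", name)
    if m is not None:
      name = m.group(1)
    name_to_variable[name] = var

  # Keep only the checkpoint variables that also exist in the current graph.
  assignment_map = collections.OrderedDict()
  initialized_variable_names = {}
  for name, _ in tf.train.list_variables(init_checkpoint):
    if name not in name_to_variable:
      continue
    assignment_map[name] = name
    initialized_variable_names[name] = 1
    initialized_variable_names[name + ":0"] = 1

  return assignment_map, initialized_variable_names
```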
5 changes: 2 additions & 3 deletions run_classifier.py
@@ -571,9 +571,8 @@ def model_fn(features, labels, mode, params): # pylint: disable=unused-argument

scaffold_fn = None
if init_checkpoint:
-(assignment_map,
-initialized_variable_names) = modeling.get_assigment_map_from_checkpoint(
-tvars, init_checkpoint)
+(assignment_map, initialized_variable_names
+) = modeling.get_assignment_map_from_checkpoint(tvars, init_checkpoint)
if use_tpu:

def tpu_scaffold():
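This hunk, and the identical ones in run_pretraining.py and run_squad.py below, cuts off right after `def tpu_scaffold():`. For context, a sketch of the TF 1.x Estimator pattern the visible lines suggest these scripts share, where the assignment map feeds `tf.train.init_from_checkpoint` (assumed from the hunk; the full files are not shown in this commit):

```python
# Sketch of the surrounding (unchanged) context, assumed from the hunk.
# init_checkpoint, tvars, and use_tpu come from the enclosing model_fn.
if init_checkpoint:
  (assignment_map, initialized_variable_names
   ) = modeling.get_assignment_map_from_checkpoint(tvars, init_checkpoint)
  if use_tpu:

    def tpu_scaffold():
      # On TPU, checkpoint loading must happen inside the Scaffold factory.
      tf.train.init_from_checkpoint(init_checkpoint, assignment_map)
      return tf.train.Scaffold()

    scaffold_fn = tpu_scaffold
  else:
    tf.train.init_from_checkpoint(init_checkpoint, assignment_map)
```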
5 changes: 2 additions & 3 deletions run_pretraining.py
@@ -152,9 +152,8 @@ def model_fn(features, labels, mode, params): # pylint: disable=unused-argument
initialized_variable_names = {}
scaffold_fn = None
if init_checkpoint:
-(assignment_map,
-initialized_variable_names) = modeling.get_assigment_map_from_checkpoint(
-tvars, init_checkpoint)
+(assignment_map, initialized_variable_names
+) = modeling.get_assignment_map_from_checkpoint(tvars, init_checkpoint)
if use_tpu:

def tpu_scaffold():
5 changes: 2 additions & 3 deletions run_squad.py
@@ -576,9 +576,8 @@ def model_fn(features, labels, mode, params): # pylint: disable=unused-argument
initialized_variable_names = {}
scaffold_fn = None
if init_checkpoint:
-(assignment_map,
-initialized_variable_names) = modeling.get_assigment_map_from_checkpoint(
-tvars, init_checkpoint)
+(assignment_map, initialized_variable_names
+) = modeling.get_assignment_map_from_checkpoint(tvars, init_checkpoint)
if use_tpu:

def tpu_scaffold():
