
Commit

added save_directories for _psave_pretrained_pt and _tf, changed model to tf_model and pt_model, enable the notebook to run cleanly from top to bottom without error (#14529)

* added save_directories for _psave_pretrained_pt and _tf, changed model to tf_model and pt_model, enable the notebook to run cleanly from top to bottom without error

* Update quicktour.rst

* added >>>

* dependencies

* added space
Chris Fregly authored Nov 26, 2021
1 parent 04683c0 commit 1bbd6fc
Showing 1 changed file with 24 additions and 8 deletions.
32 changes: 24 additions & 8 deletions docs/source/quicktour.rst
@@ -51,6 +51,15 @@ The easiest way to use a pretrained model on a given task is to use :func:`~transformers.pipeline`.
Let's see how this works for sentiment analysis (the other tasks are all covered in the :doc:`task summary
</task_summary>`):

Install the following dependencies (if not already installed):

.. code-block:: bash

    pip install torch
    pip install tensorflow
    pip install transformers
    pip install datasets

.. code-block::

    >>> from transformers import pipeline
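As a quick illustration of the `pipeline` import above, here is a minimal sketch of running sentiment analysis (the default English sentiment checkpoint is downloaded and cached on first use):

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; with no model argument,
# the library's default sentiment checkpoint is used.
classifier = pipeline("sentiment-analysis")

result = classifier("We are very happy to show you the 🤗 Transformers library.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

The return value is a list with one dict per input sentence, each carrying a `label` and a confidence `score`.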
@@ -337,8 +346,15 @@

Once your model is fine-tuned, you can save it with its tokenizer in the following way:

Removed:

.. code-block::

    tokenizer.save_pretrained(save_directory)
    model.save_pretrained(save_directory)

Added:

.. code-block::

    >>> pt_save_directory = './pt_save_pretrained'
    >>> tokenizer.save_pretrained(pt_save_directory)
    >>> pt_model.save_pretrained(pt_save_directory)

.. code-block::

    >>> tf_save_directory = './tf_save_pretrained'
    >>> tokenizer.save_pretrained(tf_save_directory)
    >>> tf_model.save_pretrained(tf_save_directory)
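A runnable sketch of the PyTorch saving path, assuming `distilbert-base-uncased` as a stand-in for whatever fine-tuned model you actually have in memory:

```python
import os

from transformers import AutoModel, AutoTokenizer

# Stand-in checkpoint; substitute your own fine-tuned model and tokenizer.
pt_model = AutoModel.from_pretrained("distilbert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

pt_save_directory = './pt_save_pretrained'
tokenizer.save_pretrained(pt_save_directory)
pt_model.save_pretrained(pt_save_directory)

# The directory now holds config.json, the model weights,
# and the tokenizer files.
print(sorted(os.listdir(pt_save_directory)))
```

Saving the model and the tokenizer into the same directory is what lets a single directory name reload both later.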
You can then load this model back using the :func:`~transformers.AutoModel.from_pretrained` method by passing the
directory name instead of the model name. One cool feature of 🤗 Transformers is that you can easily switch between
@@ -347,17 +363,17 @@

loading a saved PyTorch model in a TensorFlow model, use :func:`~transformers.TFAutoModel.from_pretrained` like this:

Removed:

.. code-block::

    from transformers import TFAutoModel
    tokenizer = AutoTokenizer.from_pretrained(save_directory)
    model = TFAutoModel.from_pretrained(save_directory, from_pt=True)

Added:

.. code-block::

    >>> from transformers import TFAutoModel
    >>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
    >>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
and if you are loading a saved TensorFlow model in a PyTorch model, you should use the following code:

Removed:

.. code-block::

    from transformers import AutoModel
    tokenizer = AutoTokenizer.from_pretrained(save_directory)
    model = AutoModel.from_pretrained(save_directory, from_tf=True)

Added:

.. code-block::

    >>> from transformers import AutoModel
    >>> tokenizer = AutoTokenizer.from_pretrained(tf_save_directory)
    >>> pt_model = AutoModel.from_pretrained(tf_save_directory, from_tf=True)
Lastly, you can also ask the model to return all hidden states and all attention weights if you need them:
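A sketch of that last point in PyTorch, with `distilbert-base-uncased` again standing in for your model: passing `output_hidden_states=True` and `output_attentions=True` to the forward call makes the model return those tensors alongside its usual output.

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "distilbert-base-uncased"  # stand-in checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
pt_model = AutoModel.from_pretrained(name)

inputs = tokenizer("Hello world!", return_tensors="pt")
with torch.no_grad():
    outputs = pt_model(**inputs, output_hidden_states=True, output_attentions=True)

# One hidden-state tensor per layer plus the embedding output,
# and one attention tensor per layer.
print(len(outputs.hidden_states), len(outputs.attentions))
```

The hidden states include the embedding output, so there is always one more hidden-state tensor than attention tensor.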

