merge master
Ghostvv committed Dec 12, 2018
2 parents 25ad13f + e790177 commit 9ba15d5
Showing 3 changed files with 80 additions and 5 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.rst
@@ -11,6 +11,7 @@ Added
-----
- environment variables specified with ``${env_variable}`` in a yaml
configuration file are now replaced with the value of the environment variable
- more documentation on how to run NLU with Docker
- ``analyzer`` parameter to ``intent_featurizer_count_vectors`` featurizer to
configure whether to use word or character n-grams

5 changes: 5 additions & 0 deletions docs/choosing_pipeline.rst
@@ -201,6 +201,8 @@ a full list of components. For example, these two configurations are equivalent:
Below is a list of all the pre-configured pipeline templates.

.. _section_spacy_pipeline:

spacy_sklearn
~~~~~~~~~~~~~

@@ -225,6 +227,8 @@ the components and configure them separately:
- name: "ner_synonyms"
- name: "intent_classifier_sklearn"

.. _section_tensorflow_embedding_pipeline:

tensorflow_embedding
~~~~~~~~~~~~~~~~~~~~

@@ -253,6 +257,7 @@ default is to use a simple whitespace tokenizer:
If you have a custom tokenizer for your language, you can replace the whitespace
tokenizer with something more accurate.

.. _section_mitie_pipeline:

mitie
~~~~~
79 changes: 74 additions & 5 deletions docs/docker.rst
@@ -1,18 +1,87 @@
:desc: Using Rasa NLU with Docker

.. _section_docker:

Running in Docker
=================

.. contents::

Images
------

`Rasa NLU Docker images <https://hub.docker.com/r/rasa/rasa_nlu/tags/>`_ are provided for different backends:

- ``spacy``: If you use the :ref:`section_spacy_pipeline` pipeline
- ``tensorflow``: If you use the :ref:`section_tensorflow_embedding_pipeline`
pipeline
- ``mitie``: If you use the :ref:`section_mitie_pipeline` pipeline
- ``bare``: If you want to take a base image and enhance it with your custom
dependencies
- ``full`` (default): If you use components from different pre-defined pipelines
and want to have everything included in the image.
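
As a sketch of how the backend choice maps to an image tag, the snippet below picks a tag per pipeline template. The ``latest-<backend>`` naming is an assumption extrapolated from the ``latest-full`` tag; check the Docker Hub tag list above for the tags that actually exist.

```shell
# Map a pipeline template to a (hypothetical) latest-<backend> image tag.
pipeline=spacy_sklearn
case "$pipeline" in
  spacy_sklearn)        tag=latest-spacy ;;
  tensorflow_embedding) tag=latest-tensorflow ;;
  mitie)                tag=latest-mitie ;;
  *)                    tag=latest-full ;;
esac
# Full image reference to pass to docker run.
echo "rasa/rasa_nlu:$tag"
# prints rasa/rasa_nlu:latest-spacy
```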

.. note::

    For the ``tensorflow`` and ``full`` images, an x86_64 CPU with AVX
    support is required.
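
On Linux hosts you can check for AVX support before pulling one of those images; a minimal sketch (reads ``/proc/cpuinfo``, so it works only on Linux):

```shell
# Print whether the host CPU advertises the AVX flag.
if grep -q avx /proc/cpuinfo; then
  echo "AVX supported"
else
  echo "No AVX support detected" >&2
fi
```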


Training NLU
------------

To train an NLU model, you need to mount two directories into the Docker container:

- a directory containing your project, which includes your NLU
  configuration and your NLU training data
- a directory which will contain the trained NLU model

.. code-block:: shell

    docker run \
        -v <project_directory>:/app/project \
        -v <model_output_directory>:/app/model \
        rasa/rasa_nlu:latest \
        run \
        python -m rasa_nlu.train \
        -c /app/project/<nlu configuration>.yml \
        -d /app/project/<nlu data> \
        -o /app/model \
        --project <nlu project name>

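
The mounted project directory is assumed to contain the configuration and training data referenced above; a hypothetical minimal layout (the file names are illustrative, not prescribed):

```shell
# Create the two host directories the training command mounts.
mkdir -p project model
# Write a minimal NLU configuration (hypothetical file name).
cat > project/config.yml <<'EOF'
language: "en"
pipeline: "tensorflow_embedding"
EOF
ls project
# prints config.yml
```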
Running NLU with Rasa Core
--------------------------

See this `guide <https://rasa.com/docs/core/docker_walkthrough/>`_, which
describes how to set up all Rasa components as Docker containers and how to
connect them.


Running NLU as Standalone Server
--------------------------------

To run NLU as a standalone server, you need to:

- mount a directory with the trained NLU models
- expose a port

.. code-block:: bash

    docker run \
        -p 5000:5000 \
        -v <directory with nlu models>:/app/projects \
        rasa/rasa_nlu:latest \
        start \
        --path /app/projects \
        --port 5000

You can then send requests to your NLU server as described in
:ref:`section_http`, e.g. if it is running on localhost:

.. code-block:: bash

    curl --request GET \
        --url 'http://localhost:5000/parse?q=Hello%20world!'

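
The query string must be URL-encoded, as in the ``Hello%20world!`` example above; a small sketch that builds the encoded URL with Python's standard library (the message text is arbitrary):

```shell
# URL-encode the query text and build the /parse URL.
msg="Hello world!"
q=$(python3 -c "import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1]))" "$msg")
echo "http://localhost:5000/parse?q=$q"
# prints http://localhost:5000/parse?q=Hello%20world%21
```

Note that ``urllib.parse.quote`` also encodes ``!`` as ``%21``, which is equivalent to the literal ``!`` in the curl example above.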
.. include:: feedback.inc

