This repository has been archived by the owner on Aug 22, 2019. It is now read-only.

Commit: Merge branch 'master' into inter_entity
tmbo committed Mar 15, 2019
2 parents d442bc5 + aac9f1b commit 8d17264
Showing 57 changed files with 3,006 additions and 284 deletions.
4 changes: 4 additions & 0 deletions .travis.yml
@@ -38,6 +38,10 @@ jobs:
- git clone https://github.com/RasaHQ/starter-pack-rasa-stack.git
- cd starter-pack-rasa-stack
- python -m pytest tests/test_core.py
- stage: test
name: "Test CLI"
script:
- timeout 2 time rasa --help
- stage: integration
name: "Test Docs"
install:
4 changes: 4 additions & 0 deletions CHANGELOG.rst
@@ -25,6 +25,7 @@ Added
- added ability to use multiple env vars per line in yaml files
- added ``priority`` property of policies to influence best policy in
the case of equal confidence
- added rasa command line interface and API

Changed
-------
@@ -39,6 +40,9 @@ Changed
- renamed ``policy_metadata.json`` to ``metadata.json`` for persisted models
- ``scores`` array returned by the ``/conversations/{sender_id}/predict``
endpoint is now sorted according to the actions' scores.
- randomly created augmented stories are now subsampled during training and marked,
  so that memoization policies can ignore them
- changed payloads from "text" to "message" in files: server.yml, docs/connectors.rst,
rasa_core/server.py, rasa_core/training/interactive.py, tests/test_interactive.py
- dialogue files in ``/data/test_dialogues`` were updated with conversations
402 changes: 201 additions & 201 deletions LICENSE.txt

Large diffs are not rendered by default.

5 changes: 4 additions & 1 deletion MANIFEST.in
@@ -1 +1,4 @@
include LICENSE.txt README.md requirements.txt dev-requirements.txt rasa_core/schemas/* rasa_core/training/visualization.html rasa_core/default_config.yml
include LICENSE.txt README.md requirements.txt dev-requirements.txt
include rasa_core/schemas/* rasa_core/training/visualization.html
include rasa_core/default_config.yml
recursive-include rasa/cli/initial_project *
2 changes: 1 addition & 1 deletion README.md
@@ -94,7 +94,7 @@ To build & edit the docs, first install all necessary dependencies:

```
brew install sphinx
pip install -r dev-requirements.txt
pip3 install -r dev-requirements.txt
```

After the installation has finished, you can run and view the documentation
@@ -0,0 +1,7 @@
event_broker:
url: localhost
sasl_username: username
sasl_password: password
topic: topic
security_protocol: SASL_PLAINTEXT
type: kafka
9 changes: 9 additions & 0 deletions data/test_endpoints/event_brokers/kafka_ssl_endpoint.yml
@@ -0,0 +1,9 @@
event_broker:
url: localhost
topic: topic
security_protocol: SSL
ssl_cafile: CARoot.pem
ssl_certfile: certificate.pem
ssl_keyfile: key.pem
ssl_check_hostname: True
type: kafka
199 changes: 167 additions & 32 deletions docs/brokers.rst
@@ -6,34 +6,59 @@
Event Brokers
=============

Rasa Core allows you to stream events to a message broker. The
example implementation we're going to show you here uses `Pika <pika.readthedocs.io>`_,
the Python client library for `RabbitMQ <https://www.rabbitmq.com>`_.
Rasa Core allows you to stream events to a message broker. The event broker
emits events into the event queue. It becomes part of the ``TrackerStore``
which you use when starting an ``Agent`` or launching ``rasa_core.run``.

All events are streamed to the broker as serialised dictionaries every time
the tracker updates its state. An example event emitted from the ``default``
tracker looks like this:

.. code-block:: json

    {
        "sender_id": "default",
        "timestamp": 1528402837.617099,
        "event": "bot",
        "text": "what your bot said",
        "data": "some data"
    }

The ``event`` field takes the event's ``type_name`` (for more on event
types, check out the :doc:`api/events` docs).
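For illustration, an event of this shape can be built and serialised with the standard ``json`` module. The ``serialise_event`` helper below is a sketch written for this guide, not part of the Rasa API; the field names follow the example above:

```python
import json
import time

def serialise_event(sender_id, event_type, text, data=None):
    # Build a dictionary shaped like the tracker event shown above
    # and serialise it for the message queue.
    return json.dumps({
        "sender_id": sender_id,
        "timestamp": time.time(),
        "event": event_type,
        "text": text,
        "data": data,
    })

payload = serialise_event("default", "bot", "what your bot said", "some data")
event = json.loads(payload)
print(event["event"], event["text"])
```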

Rasa supports two event broker producers: the Pika Event Broker and the Kafka Event Broker.

Pika Event Broker
-----------------

Adding an Event Broker Using the Endpoint Configuration
-------------------------------------------------------
The example implementation we're going to show you here uses `Pika <pika.readthedocs.io>`_,
the Python client library for `RabbitMQ <https://www.rabbitmq.com>`_.

Adding a Pika Event Broker Using the Endpoint Configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

You can use an endpoint configuration file to instruct Rasa Core to stream
all events to your event broker. To do so, add the following section to your
endpoint configuration, e.g. ``endpoints.yml``:

.. literalinclude:: ../data/test_endpoints/event_brokers/pika_endpoint.yml

Then instruct Rasa Core to use the endpoint configuration and the Pika producer by adding
``--event_broker pika_producer`` and ``--endpoints <path to your endpoint configuration>``, as in the following example:

.. code-block:: shell

    python3 -m rasa_core.run -d models/dialogue -u models/nlu/current --event_broker pika_producer --endpoints endpoints.yml

Adding a Pika Event Broker in Python
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Here is how you add it using Python code:

.. code-block:: python

    from rasa_core.event_brokers.pika_producer import PikaProducer
    from rasa_platform.core.tracker_store import InMemoryTrackerStore

    pika_broker = PikaProducer('localhost',

@@ -44,26 +69,10 @@ Here is how you add it using Python code:
tracker_store = InMemoryTrackerStore(db=db, event_broker=pika_broker)
Implementing a Pika Event Consumer
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

You need to have a RabbitMQ server running, as well as another application
that consumes the events. This consumer needs to implement Pika's
``start_consuming()`` method with a ``callback`` action. Here's a simple
example:
@@ -96,5 +105,131 @@ example:
                          no_ack=True)
    channel.start_consuming()

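Stripped of the Pika plumbing, the consumer's callback only has to decode the serialised event. A minimal sketch, assuming Pika's usual ``(channel, method, properties, body)`` callback signature; it can be exercised without a running broker:

```python
import json

received = []

def on_event(channel, method, properties, body):
    # Decode a serialised tracker event arriving from RabbitMQ
    # and collect it for later processing.
    event = json.loads(body.decode("utf-8"))
    received.append(event)

# Simulate a delivery; the broker-specific arguments are unused here.
on_event(None, None, None, b'{"event": "bot", "text": "what your bot said"}')
print(received[0]["event"])
```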
Kafka Event Broker
------------------

It is possible to use `Kafka <https://kafka.apache.org/>`_ as the main broker for your events. In this example
we are going to use the `kafka-python <https://kafka-python.readthedocs.io/en/master/usage.html>`_
library, a Kafka client written in Python.

Adding a Kafka Event Broker Using the Endpoint Configuration
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

As with the other brokers, you can use an endpoint configuration file to instruct Rasa Core to stream
all events to this event broker. To do so, add the following section to your
endpoint configuration.

Pass the ``endpoints.yml`` file as an argument with ``--endpoints <path to your endpoint configuration>``
when running Rasa, and select the Kafka producer with ``--event_broker kafka_producer``, as in the following example:

.. code-block:: shell

    python3 -m rasa_core.run -d models/dialogue -u models/nlu/current --event_broker kafka_producer --endpoints endpoints.yml

When using the ``SASL_PLAINTEXT`` protocol, the endpoints file must have the following entries:

.. literalinclude:: ../data/test_endpoints/event_brokers/kafka_plaintext_endpoint.yml

When using the SSL protocol, the endpoints file should look like this:

.. literalinclude:: ../data/test_endpoints/event_brokers/kafka_ssl_endpoint.yml

Adding a Kafka Broker in Python
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The code below shows an example of how to instantiate a Kafka producer in your script.

.. code-block:: python

    from rasa_core.event_brokers.kafka_producer import KafkaProducer
    from rasa_platform.core.tracker_store import InMemoryTrackerStore

    kafka_broker = KafkaProducer(host='localhost:9092',
                                 topic='rasa_core_events')

    tracker_store = InMemoryTrackerStore(event_broker=kafka_broker)

The host variable can be either a list of broker addresses or a single one.
If only one broker address is available, the client will connect to it and
request the cluster metadata.
The remaining brokers in the cluster can then be discovered
automatically through the data served by the first connected broker.

To pass more than one broker address as an argument, they must be passed as a
list of strings, e.g.:

.. code-block:: python

    kafka_broker = KafkaProducer(host=['kafka_broker_1:9092',
                                       'kafka_broker_2:2030',
                                       'kafka_broker_3:9092'],
                                 topic='rasa_core_events')

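The single-address and list forms can be normalised in the same way before being handed to the client; ``as_bootstrap_servers`` below is a hypothetical helper written for illustration only, not part of Rasa or kafka-python:

```python
def as_bootstrap_servers(host):
    # Accept either one broker address or a list of addresses
    # and always return a list of bootstrap servers.
    if isinstance(host, str):
        return [host]
    return list(host)

print(as_bootstrap_servers('kafka_broker_1:9092'))
print(as_bootstrap_servers(['kafka_broker_1:9092', 'kafka_broker_2:2030']))
```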
Authentication and authorization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Rasa Core's Kafka producer accepts two types of security protocols - ``SASL_PLAINTEXT`` and ``SSL``.

In a development environment, or if the broker servers and clients are located
on the same machine, you can use simple authentication with ``SASL_PLAINTEXT``.
With this protocol, the credentials and messages exchanged between the clients and servers
are sent in plaintext. It is therefore not the most secure approach, but since it is simple
to configure, it is useful for simple cluster setups.
The ``SASL_PLAINTEXT`` protocol requires the ``username`` and ``password``
previously configured on the broker server.

.. code-block:: python

    kafka_broker = KafkaProducer(host='kafka_broker:9092',
                                 sasl_plain_username='kafka_username',
                                 sasl_plain_password='kafka_password',
                                 security_protocol='SASL_PLAINTEXT',
                                 topic='rasa_core_events')

If the clients or the brokers in the Kafka cluster are located on different
machines, it is important to use the SSL protocol to ensure encryption of data and client
authentication. After generating valid certificates for the brokers and the
clients, the path to the certificate and key generated for the producer must
be provided as arguments, as well as the CA's root certificate.

.. code-block:: python

    kafka_broker = KafkaProducer(host='kafka_broker:9092',
                                 ssl_cafile='CARoot.pem',
                                 ssl_certfile='certificate.pem',
                                 ssl_keyfile='key.pem',
                                 ssl_check_hostname=True,
                                 security_protocol='SSL',
                                 topic='rasa_core_events')

If the ``ssl_check_hostname`` parameter is enabled, the client will verify
that the broker's hostname matches its certificate. This check is used on client connections
and inter-broker connections to prevent man-in-the-middle attacks.
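The same kind of hostname check exists in Python's standard ``ssl`` module; as a rough standard-library illustration (not Kafka-specific code) of what enabling the check means:

```python
import ssl

# A default client-side context already verifies certificates and
# checks that the server's hostname matches its certificate,
# which is the behaviour that ssl_check_hostname=True requests.
context = ssl.create_default_context()
print(context.check_hostname)
print(context.verify_mode == ssl.CERT_REQUIRED)
```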


Implementing a Kafka Event Consumer
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The parameters used to create a Kafka consumer are the same as those used for the producer,
depending on the security protocol being used. The following implementation shows an example:

.. code-block:: python

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer('rasa_core_events',
                             bootstrap_servers=['localhost:29093'],
                             value_deserializer=lambda m: json.loads(m.decode('utf-8')),
                             security_protocol='SSL',
                             ssl_check_hostname=False,
                             ssl_cafile='CARoot.pem',
                             ssl_certfile='certificate.pem',
                             ssl_keyfile='key.pem')

    for message in consumer:
        print(message.value)

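The ``value_deserializer`` in the consumer above is an ordinary function from bytes to a dictionary, so it can be checked in isolation without a running broker:

```python
import json

# Same deserialiser shape as used by the Kafka consumer above.
value_deserializer = lambda m: json.loads(m.decode('utf-8'))

event = value_deserializer(b'{"sender_id": "default", "event": "user", "text": "hello"}')
print(event["event"])
```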
.. include:: feedback.inc
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -85,7 +85,7 @@
#
# The short X.Y version.
__version__ = None
exec(open('../rasa_core/version.py').read())
exec(open('../rasa/version.py').read())
version = ".".join(__version__.split(".")[:2])
# The full version, including alpha/beta/rc tags.
release = __version__
11 changes: 11 additions & 0 deletions rasa/__init__.py
@@ -0,0 +1,11 @@
import logging

import rasa.version

from rasa.run import run
from rasa.train import train
from rasa.test import test

logging.getLogger(__name__).addHandler(logging.NullHandler())

__version__ = rasa.version.__version__
72 changes: 72 additions & 0 deletions rasa/__main__.py
@@ -0,0 +1,72 @@
import argparse
import logging

from rasa_core.cli.arguments import add_logging_option_arguments
from rasa_core.utils import configure_colored_logging

from rasa import version
from rasa.cli import (scaffold, run, train, interactive,
shell, test, show, data, up)
from rasa.cli.utils import parse_last_positional_argument_as_model_path

logger = logging.getLogger(__name__)


def create_argument_parser() -> argparse.ArgumentParser:
"""Parse all the command line arguments for the training script."""

parser = argparse.ArgumentParser(
prog="rasa",
formatter_class=argparse.ArgumentDefaultsHelpFormatter,
description="Rasa command line interface. Rasa allows you to build "
"your own conversational assistants 🤖. The 'rasa' command "
"allows you to easily run most common commands like "
"creating a new bot, training or evaluating models.")

parser.add_argument("--version", action='store_true',
default=argparse.SUPPRESS,
help="Print installed Rasa version")

parent_parser = argparse.ArgumentParser(add_help=False)
add_logging_option_arguments(parent_parser)
parent_parsers = [parent_parser]

subparsers = parser.add_subparsers(help='Rasa commands')

scaffold.add_subparser(subparsers, parents=parent_parsers)
run.add_subparser(subparsers, parents=parent_parsers)
shell.add_subparser(subparsers, parents=parent_parsers)
train.add_subparser(subparsers, parents=parent_parsers)
interactive.add_subparser(subparsers, parents=parent_parsers)
test.add_subparser(subparsers, parents=parent_parsers)
show.add_subparser(subparsers, parents=parent_parsers)
data.add_subparser(subparsers, parents=parent_parsers)
up.add_subparser(subparsers, parents=parent_parsers)

return parser


def print_version() -> None:
print("Rasa", version.__version__)


def main() -> None:
# Running as standalone python application
parse_last_positional_argument_as_model_path()
arg_parser = create_argument_parser()
cmdline_arguments = arg_parser.parse_args()

if hasattr(cmdline_arguments, "func"):
configure_colored_logging(cmdline_arguments.loglevel)
cmdline_arguments.func(cmdline_arguments)
elif hasattr(cmdline_arguments, "version"):
print_version()
else:
# user has not provided a subcommand, let's print the help
logger.error("No command specified.")
arg_parser.print_help()
exit(1)


if __name__ == '__main__':
main()
Empty file added rasa/cli/__init__.py
Empty file.
