Commit: Logstash/ElasticSearch documentation.
itamarst committed Oct 18, 2014
1 parent 39e4f7c commit 1dd85ae
Showing 4 changed files with 82 additions and 0 deletions.
30 changes: 30 additions & 0 deletions docs/source/elasticsearch.rst
@@ -0,0 +1,30 @@
Using Logstash and ElasticSearch to Process Eliot Logs
======================================================

`ElasticSearch`_ is a search and analytics engine which can be used to store Eliot logging output.
The logs can then be viewed with the `Kibana`_ web UI, searched via the ElasticSearch query API, or browsed on the command line with the `logstash-cli`_ tool.
`Logstash`_ is a log processing tool that can be used to load Eliot log files into ElasticSearch.
The combination of ElasticSearch, Logstash and Kibana is sometimes referred to as ELK.

.. _logstash-cli: https://github.com/jedi4ever/logstash-cli
.. _Logstash: http://logstash.net/
.. _ElasticSearch: http://elasticsearch.org
.. _Kibana: http://www.elasticsearch.org/overview/kibana/


Example Logstash Configuration
------------------------------

Assuming each Eliot message is written out as a JSON message on its own line (which is the case for ``eliot.to_file()`` and ``eliot.logwriter.ThreadedFileWriter``), the following Logstash configuration will load these log messages into an in-process ElasticSearch database:

:download:`logstash_standalone.conf`

.. literalinclude:: logstash_standalone.conf

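For reference, each such line is a standalone JSON object. The sketch below shows what one might look like; the exact fields depend on the message being logged, but ``task_uuid``, ``task_level`` and ``action_counter`` are the fields the Logstash configuration relies on, and the values here are made up for illustration:

```python
import json

# A hypothetical single Eliot log line, as eliot.to_file() would write it:
# one JSON object per line.  Field values are illustrative only.
line = ('{"task_uuid": "c33cec48-8b64-45c1-8cdf-f5a8d2fafd0a",'
        ' "task_level": "/", "action_counter": 0,'
        ' "timestamp": 1413590400.0, "message_type": "example"}')

# Logstash's json_lines codec does the equivalent of this per line:
message = json.loads(line)
assert message["task_uuid"] == "c33cec48-8b64-45c1-8cdf-f5a8d2fafd0a"
```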
We can then pipe JSON messages from Eliot into ElasticSearch using Logstash:

.. code-block:: console

   $ python examples/stdout.py | logstash web -- agent --config logstash_standalone.conf

You can then use the Kibana UI to search and browse the logs by visiting http://localhost:9292/.
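Indexed messages can also be retrieved directly with the ElasticSearch query API. As a minimal sketch, the following builds a query body that would match every message in one Eliot task, sorted by the ``@timestamp`` field the Logstash date filter populates; the host and port (and the example UUID) are assumptions for illustration:

```python
import json

def task_query(task_uuid):
    """Build an ElasticSearch query body matching all messages of one task,
    ordered by the @timestamp field populated by the Logstash date filter."""
    return {"query": {"term": {"task_uuid": task_uuid}},
            "sort": [{"@timestamp": {"order": "asc"}}]}

body = task_query("c33cec48-8b64-45c1-8cdf-f5a8d2fafd0a")
# This body could then be POSTed to e.g. http://localhost:9200/_search
print(json.dumps(body, sort_keys=True))
```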
1 change: 1 addition & 0 deletions docs/source/index.rst
@@ -9,6 +9,7 @@ Contents:
actions
types
types-testing
elasticsearch
twisted
fields
threads
50 changes: 50 additions & 0 deletions docs/source/logstash_standalone.conf
@@ -0,0 +1,50 @@
input {
stdin {
codec => json_lines {
charset => "UTF-8"
}
}
}

filter {
date {
    # Parse the Eliot timestamp field into the special @timestamp field
    # Logstash expects:
match => [ "timestamp", "UNIX" ]
target => ["@timestamp"]
}
}

output {
# Stdout output for debugging:
stdout {
codec => rubydebug
}

elasticsearch {
# Documents in ElasticSearch are identified by tuples of (index, mapping
# type, document_id).
# References:
# - http://logstash.net/docs/1.3.2/outputs/elasticsearch
# - http://stackoverflow.com/questions/15025876/what-is-an-index-in-elasticsearch

# We make the document id unique (for a specific index/mapping type pair) by
# using the relevant Eliot fields. This means replaying messages will not
# result in duplicates, as long as the replayed messages end up in the same
# index (see below).
document_id => "%{task_uuid}_%{task_level}_%{action_counter}"

    # By default Logstash sets the index to include the current date. When we
    # get to the point of replaying log files on startup for crash recovery,
    # we might want to use the last modified date of the file instead of the
    # current date; otherwise documents will end up in the wrong index.

#index => "logstash-%{+YYYY.MM.dd}"

index_type => "Eliot"

# In a centralized ElasticSearch setup we'd be specifying host/port
# or some such. In this setup we run it ourselves:
embedded => true
}
}
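The ``document_id`` interpolation in the config above is what makes replays idempotent: the same message always yields the same id, so re-indexing overwrites in place rather than creating a duplicate. A sketch of the equivalent logic in Python (field values are illustrative, not real Eliot output):

```python
def document_id(message):
    """Mirror Logstash's "%{task_uuid}_%{task_level}_%{action_counter}"
    interpolation: identical messages produce identical ids, so replaying
    a log file overwrites existing documents instead of duplicating them."""
    return "{}_{}_{}".format(message["task_uuid"], message["task_level"],
                             message["action_counter"])

msg = {"task_uuid": "c33cec48-8b64-45c1-8cdf-f5a8d2fafd0a",
       "task_level": "/", "action_counter": 0}
# Deterministic: a replayed copy of the message maps to the same id.
assert document_id(msg) == document_id(dict(msg))
```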
1 change: 1 addition & 0 deletions docs/source/news.rst
@@ -8,6 +8,7 @@ Features:

* Most public methods and functions now have underscore-based equivalents to the camel case versions, e.g. ``eliot.write_traceback`` and ``eliot.writeTraceback``, for use in PEP 8 styled programs.
Twisted-facing APIs and pyunit assertions do not provide these additional APIs, as camel-case is the native idiom.
* Documented how to load Eliot logging into ElasticSearch via Logstash.


0.4.0
