Commit 9bd2967
Merge pull request #314 from ScatterHQ/logstash
Fix Logstash config
itamarst committed Jul 15, 2018
2 parents f7eb243 + 1ab92eb commit 9bd2967
Showing 3 changed files with 4 additions and 21 deletions.
2 changes: 1 addition & 1 deletion docs/source/news.rst
@@ -7,7 +7,7 @@ What's New
Documentation:

* Documented how to add log levels, and how to filter Eliot logs.

* Logstash configuration is closer to modern versions' options, though still untested.

1.3.0
^^^^^
2 changes: 2 additions & 0 deletions docs/source/outputting/elasticsearch.rst
@@ -1,6 +1,8 @@
 Using Logstash and ElasticSearch to Process Eliot Logs
 ======================================================
 
+.. note:: Logstash, Elasticsearch and Kibana change frequently. These instructions might not be quite accurate.
+
 `ElasticSearch`_ is a search and analytics engine which can be used to store Eliot logging output.
 The logs can then be browsed by humans using the `Kibana`_ web UI, or on the command-line using the `logstash-cli`_ tool.
 Automated systems can access the logs using the ElasticSearch query API.
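
The input half of such a pipeline is not shown in this commit, and per the note above these tools change frequently, so the following is only a rough, untested sketch. It assumes Eliot has been configured to write its JSON messages, one per line, to /var/log/eliot.log (the path is illustrative), and it uses Logstash's file input plus a date filter so the event time comes from Eliot's timestamp field rather than the moment Logstash read the line:

input {
  file {
    # Tail the file Eliot writes its JSON messages to (illustrative path).
    path => "/var/log/eliot.log"
    start_position => "beginning"
    # Each line is a complete JSON document, so decode it directly.
    codec => "json"
  }
}

filter {
  # Eliot records "timestamp" as Unix epoch seconds; use it for @timestamp.
  date {
    match => ["timestamp", "UNIX"]
  }
}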
21 changes: 1 addition & 20 deletions docs/source/outputting/logstash_standalone.conf
@@ -22,29 +22,10 @@ output {
   }
 
   elasticsearch {
-    # Documents in ElasticSearch are identified by tuples of (index, mapping
-    # type, document_id).
-    # References:
-    # - http://logstash.net/docs/1.3.2/outputs/elasticsearch
-    # - http://stackoverflow.com/questions/15025876/what-is-an-index-in-elasticsearch
-
     # We make the document id unique (for a specific index/mapping type pair) by
     # using the relevant Eliot fields. This means replaying messages will not
     # result in duplicates, as long as the replayed messages end up in the same
-    # index (see below).
+    # index.
     document_id => "%{task_uuid}_%{task_level}"
-
-    # By default logstash sets the index to include the current date. When we
-    # get to point of replaying log files on startup for crash recovery we might
-    # want to use the last modified date of the file instead of current date,
-    # otherwise we'll get documents ending up in wrong index.
-
-    #index => "logstash-%{+YYYY.MM.dd}"
-
-    index_type => "Eliot"
-
-    # In a centralized ElasticSearch setup we'd be specifying host/port
-    # or some such. In this setup we run it ourselves:
-    embedded => true
   }
 }
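
For readers on a current Logstash, the options deleted above have no direct replacement: embedded Elasticsearch and the index_type setting are gone from modern releases. An untested sketch of an equivalent output block, assuming an external Elasticsearch at localhost:9200 and the usual date-based index naming (both illustrative, not part of the committed file), might look like this:

output {
  elasticsearch {
    # Point at an external Elasticsearch node; modern Logstash cannot run an
    # embedded one (address is illustrative).
    hosts => ["localhost:9200"]

    # Date-based index, matching the commented-out suggestion above.
    index => "logstash-%{+YYYY.MM.dd}"

    # Same deduplication scheme: a replayed Eliot message maps to the same
    # document id and so does not create a duplicate.
    document_id => "%{task_uuid}_%{task_level}"
  }
}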
