Elasticsearch, Logstash and Kibana configuration for Pi-hole visualization
Show, search, filter and customize your Pi-hole statistics ... the ELK way
Please note: this is still a work in progress, so please let me know if anything is unclear or incorrect, which it definitely could be!
You need a working installation of:
- Logstash (tested with 6.5.0)
- Elasticsearch (tested with 6.5.0)
- Kibana (tested with 6.5.0)
- Filebeat on the Pi-hole host (tested with 1.3.1)
For installing the ELK stack itself, refer to https://wiki.kaldenhoven.org/display/LIN/Elastic+Stack+on+Ubuntu+16.04+with+AdoptOpenJDK or https://www.elastic.co/ for details.
This repo provides the files and configuration needed to send the Pi-hole logs via Filebeat to Logstash/Elasticsearch. The logs are then visualized in Kibana with a custom dashboard.
The result will look like this:
HOW TO USE
LOGSTASH HOST
- copy "conf.d/20-dns-syslog.conf" into your Logstash configuration folder (usually /etc/logstash)
- customize "ELASTICSEARCHHOST:PORT" in the output section at the bottom of the file to match your Elasticsearch host and port
- copy the "dns" pattern file to "/etc/logstash/patterns/"
- restart Logstash
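For orientation, the output section you need to customize has the shape of a standard Logstash Elasticsearch output; the host and port below are placeholders, not working values:

```
output {
  elasticsearch {
    # replace with your Elasticsearch host and port, e.g. "es01:9200"
    hosts => ["ELASTICSEARCHHOST:PORT"]
  }
}
```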
FILEBEAT ON THE PI-HOLE HOST
- copy "/etc/filebeat/filebeat.yml" from this repo to your Filebeat installation on the Pi-hole host
- customize "LOGSTASHHOST:5141" to match your Logstash hostname/IP
- restart Filebeat
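The line to change sits in the Logstash output section of filebeat.yml; with the Filebeat 1.x configuration layout it looks roughly like this (the hostname is a placeholder):

```yaml
output:
  logstash:
    # replace LOGSTASHHOST with your Logstash hostname or IP;
    # 5141 is the port the provided Logstash config listens on
    hosts: ["LOGSTASHHOST:5141"]
```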
PI-HOLE (DNSMASQ)
- copy "99-pihole-log-facility.conf" to /etc/dnsmasq.d/
- restart the Pi-hole DNS service (e.g. with `pihole restartdns`)
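As the filename suggests, the shipped file controls where dnsmasq writes its query log. A dnsmasq log-facility override generally looks like the sketch below; the exact path is an assumption here, so use the file from this repo as-is:

```
# write the dnsmasq query log to a fixed file (example path)
log-facility=/var/log/pihole.log
```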
KIBANA HOST (CAN BE THE SAME AS LOGSTASH AND ELASTICSEARCH)
- import "elk-hole.json" into Kibana: Management -> Saved Objects -> Import
- optionally reload Kibana's field list for the index pattern
You should then be able to see your new dashboard and visualizations.
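Before opening the dashboard, you can check that documents are actually arriving by listing the indices on the Elasticsearch host; the hostname below is a placeholder, and the index name depends on your Logstash output settings:

```shell
# list matching indices with headers and confirm the document count is growing
curl 'http://ELASTICSEARCHHOST:9200/_cat/indices/logstash-*?v'
```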