
Log aggregation overview


Tech Stack

  • ELK: Log Aggregation
    • Elasticsearch: Log storage and search APIs
    • Logstash: Transforms logs into structured data to be stored in Elasticsearch
    • Kibana: UI to search logs and visualise data
  • Logspout: Ships logs from containers
  • Filebeat: Ships logs from VMs
  • OAuth proxy: Google auth for accessing Kibana (see the sketch after this list)
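
As a rough sketch of the last point, Kibana can sit behind oauth2_proxy so that only Google-authenticated users reach it. The snippet below would live under services: in a stack file; the images, the logger stack name, port numbers, and credentials are placeholders rather than the actual deployment.

```yaml
  kibana:
    image: docker.elastic.co/kibana/kibana:5.6.2      # version is an assumption
    networks:
      - logging

  oauth-proxy:
    image: quay.io/oauth2-proxy/oauth2-proxy          # placeholder image for oauth2_proxy
    command:
      - --provider=google
      - --email-domain=example.com                    # restrict sign-in to your Google domain
      - --upstream=http://logger_kibana:5601          # Kibana stays internal, reachable only via the proxy
      - --http-address=0.0.0.0:4180
      - --client-id=<google-client-id>
      - --client-secret=<google-client-secret>
      - --cookie-secret=<random-secret>
    ports:
      - "4180:4180"                                   # only the proxy port is published
    networks:
      - logging
```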

Please try this simple walkthrough tutorial to get hands-on experience with log aggregation: https://botleg.com/stories/log-management-of-docker-swarm-with-elk-stack/

Overview

Architecture at a glance: Logspout ships container logs and Filebeat ships VM logs into Logstash, which structures them and stores them in Elasticsearch; Kibana, fronted by the OAuth proxy, is used to search and visualise the logs.

Logstash setup

  • Logstash runs as a service inside the Docker swarm (see the sketch below)
  • The Logstash service runs with the syslog input plugin, which enables Logstash to act as a syslog server
  • Logstash's syslog port is published in the swarm, so it is available on every swarm worker node. An internal TCP load balancer in front of the worker nodes exposes Logstash to services outside the swarm
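
Below is a minimal sketch of how the Logstash service could be declared in a stack file. It assumes the stack is deployed under the name logger (so the service is reachable inside the swarm as logger_logstash, matching the example route in the next section) and the image tag is illustrative; the Logstash pipeline itself (a syslog input listening on 51415 and an elasticsearch output) is configured separately and not shown here.

```yaml
# logging-stack.yml (sketch) -- deploy with: docker stack deploy -c logging-stack.yml logger
version: "3.3"

services:
  logstash:
    image: docker.elastic.co/logstash/logstash:5.6.2   # version is an assumption
    ports:
      - "51415:51415"   # syslog input port, published on every swarm node via the routing mesh
    networks:
      - logging

networks:
  logging:
    driver: overlay
```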

Collecting logs from containers in the Docker swarm

Note: docker service logs and docker logs only return what a container writes to stdout and stderr. Hence all services/containers running inside the Docker swarm should write informational logs to stdout and error logs to stderr

  • Logspout pushes the logs to Logstash using the syslog protocol. Example route config: syslog+tcp://logger_logstash:51415 (see the sketch below)
  • Logstash parses the logs to structure the data and pushes the structured logs to Elasticsearch
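
As a rough illustration, the Logspout side might be added under services: in the same stack file as the Logstash sketch above; the image, network name, and route are assumptions carried over from that sketch and the example config, not the exact production setup.

```yaml
  logspout:
    image: gliderlabs/logspout
    command: syslog+tcp://logger_logstash:51415      # route every container's stdout/stderr to Logstash
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock    # Logspout reads container logs through the Docker API
    networks:
      - logging
    deploy:
      mode: global                                   # one Logspout task per node, so no node's containers are missed
```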

Image Source: https://jujucharms.com/u/lazypower/logspout/

Collecting logs from services outside the Docker swarm

  • Services running outside the swarm (stateful services such as databases) should have their logs directed to /var/syslog
  • Filebeat, installed on all servers, is configured to fetch logs from /var/syslog and push them to Logstash (see the sketch below)
  • Logstash parses the logs to structure the data and pushes the structured logs to Elasticsearch
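
A minimal filebeat.yml sketch for such a server is shown below, using Filebeat 5.x-era syntax. The Logstash address is a hypothetical name for the internal TCP load balancer mentioned earlier, and the beats port 5044 assumes the Logstash pipeline also runs a beats input published the same way, which this page does not spell out.

```yaml
# /etc/filebeat/filebeat.yml (sketch)
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/syslog                          # the path used on this page
output.logstash:
  hosts: ["logstash-lb.internal:5044"]       # hypothetical load balancer address; requires a beats input in Logstash
```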

Image Source: https://logz.io/blog/filebeat-vs-logstash/
