
This is a POC for centralized logging using the ELK stack with Apache Kafka, Filebeat, and Spring Boot.


ELKK Stack

The goal of this project is to implement a centralized logging mechanism for Spring Boot applications.

Technologies used

  • Elasticsearch

  • Logstash

  • Kibana

  • Kafka

  • Filebeat

  • Spring Boot

Project Architecture

(diagram: full ecosystem)

Applications

  • application

    Spring Boot Java web application that generates logs; Filebeat ships the log events to the log_stream topic in Kafka.
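The Filebeat side of this flow could look like the sketch below. The log path and Kafka host are assumptions for illustration; only the log_stream topic name comes from this README.

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/application/*.log   # assumed location of the application's log files

output.kafka:
  hosts: ["kafka:9092"]              # assumed Kafka bootstrap address
  topic: "log_stream"                # topic named in this README
```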


Start Environment

  • Open a terminal and, inside the elkk root folder, run

    docker-compose up -d
  • Wait until all containers are Up (healthy). You can check their status by running

    docker-compose ps
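The (healthy) status comes from healthchecks declared in docker-compose.yml. A sketch of what such a check might look like for the Elasticsearch service (image tag, interval, and retries are assumptions, not necessarily what this project uses):

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.9.3   # version assumed
    healthcheck:
      # Marks the container healthy once the cluster health endpoint responds
      test: ["CMD-SHELL", "curl -sf http://localhost:9200/_cluster/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 5
```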

Running Applications with Gradle

Inside the elkk root folder, run the following Gradle command

  • application

    ./gradlew :application:bootRun
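For Filebeat to pick the logs up, the application typically writes them to a file that Filebeat tails. A minimal logback-spring.xml sketch of such a file appender (the file path and log pattern are assumptions):

```xml
<configuration>
  <!-- Write application logs to a file that Filebeat can tail -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>/var/log/application/application.log</file> <!-- path assumed -->
    <encoder>
      <pattern>%d{ISO8601} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```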

Running Applications as Docker containers

Build Application’s Docker Image

  • In a terminal, make sure you are in the elkk root folder

  • To build the application's Docker image, run the following script

    ./build-apps.sh

Application’s Environment Variables

  • application

    Environment Variable  Description
    ZIPKIN_HOST           Host of the Zipkin distributed tracing system to use (default: localhost)
    ZIPKIN_PORT           Port of the Zipkin distributed tracing system to use (default: 9411)
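The default-fallback behavior of these variables can be sketched in shell. The echo is purely illustrative; the real application reads these values through Spring configuration:

```shell
# Resolve the Zipkin endpoint, falling back to the documented defaults
ZIPKIN_HOST="${ZIPKIN_HOST:-localhost}"
ZIPKIN_PORT="${ZIPKIN_PORT:-9411}"
echo "Zipkin endpoint: http://${ZIPKIN_HOST}:${ZIPKIN_PORT}"
```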

Start Application’s Docker Container

  • In a terminal, make sure you are inside the elkk root folder

  • Run the following script

    ./start-apps.sh

Configuring Kibana

  • Access Kibana in your web browser at http://localhost:5601.

  • The first thing you have to do is configure the Elasticsearch indices that can be displayed in Kibana.

(screenshot 1)
  • You can use the pattern logstash-* to include all the logs coming from Filebeat via Kafka.

  • You also need to define the field used as the log timestamp. You should use @timestamp as shown below:

(screenshot 2)
  • And you are done. You can now visualize the logs generated by Filebeat, Elasticsearch, Kibana, and your other containers in the Kibana interface:

(screenshot 3)
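The logstash-* pattern matches because the Logstash pipeline sitting between Kafka and Elasticsearch writes to date-stamped logstash- indices. A sketch of such a pipeline (hostnames and codec are assumptions):

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"   # assumed broker address
    topics => ["log_stream"]            # topic named in this README
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]     # assumed Elasticsearch address
    index => "logstash-%{+YYYY.MM.dd}"  # produces the logstash-* indices
  }
}
```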

Application URLs

Application       URL
application       http://localhost:9082/
Kibana dashboard  http://localhost:5601

Shutdown

  • Stop applications

    • If they were started with Gradle, go to the terminals where they are running and press Ctrl+C

    • If they were started as a Docker container, run the script below

      ./stop-apps.sh
  • Stop and remove docker-compose containers, networks and volumes

    docker-compose down -v
Useful Links

  • Kafka Topics UI

    Kafka Topics UI can be accessed at http://localhost:8085

  • Zipkin

    Zipkin can be accessed at http://localhost:9411

  • Kafka Manager

    Kafka Manager can be accessed at http://localhost:9000

    Configuration

    • First, you must create a new cluster. Click on Cluster (dropdown button on the header) and then on Add Cluster

    • Type the name of your cluster in Cluster Name field, for example: MyZooCluster

    • Type zookeeper:2181 in Cluster Zookeeper Hosts field

    • Enable checkbox Poll consumer information (Not recommended for large # of consumers if ZK is used for offsets tracking on older Kafka versions)

    • Click on Save button at the bottom of the page.

  • Elasticsearch REST API

    Check that Elasticsearch is up and running

    curl http://localhost:9200

    List the indices in Elasticsearch

    curl http://localhost:9200/_cat/indices?v

    Check the news index mapping

    curl http://localhost:9200/news/_mapping

    Simple search

    curl http://localhost:9200/news/news/_search
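Beyond the URI searches above, Elasticsearch also accepts a JSON query body posted to the _search endpoint. A sketch of a match query against the news index (the title field is an assumption about the index's mapping):

```json
{
  "query": {
    "match": { "title": "kafka" }
  }
}
```

This body would be sent with curl using `-H 'Content-Type: application/json'` and `-d` against http://localhost:9200/news/_search.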
