Gilt Tech logo

AWS CloudWatch to ELK

AWS CloudWatch to ELK is a command line interface (CLI) to extract events from Amazon CloudWatch logs and load them into a local Elasticsearch/Logstash/Kibana (ELK) cluster.

Raw CloudWatch logs are hard to read and analyse: you cannot aggregate logs from different streams or use full-text search.

The goal of this project is to make it easy to set up an ELK stack and load the logs you need to analyse, on demand.
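For instance, once the logs are indexed in Elasticsearch, a single query can combine full-text search with a per-stream aggregation. A minimal sketch of such a query body (the field names `message` and `instance.keyword` are illustrative assumptions, not necessarily the script's actual mapping):

```python
import json

# Hypothetical Elasticsearch query body: full-text search for "timeout"
# across all streams, bucketed per instance. The field names are
# assumptions for illustration; check the actual index mapping in Kibana.
query = {
    "query": {"match": {"message": "timeout"}},
    "aggs": {
        "per_instance": {"terms": {"field": "instance.keyword"}},
    },
}

print(json.dumps(query, indent=2))
```

In practice, Kibana's Discover view builds queries of this shape for you.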

Requirements

  • Docker and Docker Compose (to run the ELK stack)
  • Groovy (to run the CLI)
  • AWS credentials configured locally (the CLI accepts a profile via -a/--profile)

Setup

  • Clone this repository
  • Start the ELK stack: docker-compose up
  • Access Kibana in a web browser at http://$docker_machine_ip:5601. If everything is set up correctly you should see the view below

Kibana Home
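The stack itself is defined by the repository's docker-compose.yml. As a rough, hypothetical sketch of what a single-container setup like this can look like (the image name and version are assumptions; the file in the repository is authoritative):

```yaml
# Sketch only — consult the actual docker-compose.yml in this repository.
version: '2'
services:
  elk:
    image: sebp/elk     # assumption: a single-container ELK image
    ports:
      - "5601:5601"     # Kibana UI (used in the Setup step above)
      - "9200:9200"     # Elasticsearch HTTP API
      - "9300:9300"     # Elasticsearch transport (the CLI's default elasticPort)
```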

Load your logs

Use the Groovy CLI to load your logs. Run it without any parameters to print the usage:

groovy cloudwatch_elk.groovy

usage: groovy cloudwatch_elk.groovy [options]
 -a,--profile <arg>       the AWS profile name to use
 -c,--cluster <arg>       the Elasticsearch cluster name, default
                          'cloudwatch-cluster'
 -d,--deleteData <arg>    when 'true' deletes the index if exists, default
                          'true'
 -e,--elasticHost <arg>   the Elasticsearch hostname / ip, default 'localhost'
 -f,--from <arg>          a point in time expressed as dd/MM/yy hh:mm
 -g,--logGroup <arg>      the CloudWatch log group of the instances
 -i,--instances <arg>     the instances to extract log from, separated by
                          comma
 -l,--lastMinutes <arg>   specify the number of minutes to extract logs
                          until now
 -n,--ec2Name <arg>       the EC2 name to retrieve instances
 -p,--elasticPort <arg>   the ElasticSearch port, default '9300'
 -t,--to <arg>            a point in time expressed as dd/MM/yy hh:mm

Examples

Load the last hour of logs from all the instances tagged with name 'my-service':

groovy cloudwatch_elk.groovy -g $logGroupName -n my-service -l 60

Load the last half-hour of logs from a specific set of instances tagged with name 'my-service':

groovy cloudwatch_elk.groovy -g $logGroupName -n my-service -i $instance1,$instance2,$instance3 -l 30

Load the logs between 2 specific date times from all the instances tagged with name 'my-service':

groovy cloudwatch_elk.groovy -g $logGroupName -n my-service -f '27/09/2016 08:00' -t '27/09/2016 09:00'
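The -f/-t options take timestamps such as '27/09/2016 08:00' (note the examples above use a four-digit year, despite the dd/MM/yy shown in the usage), while the CloudWatch FilterLogEvents API expects epoch milliseconds, so the script has to convert internally. A sketch of that conversion in Python (the format string and the UTC assumption are illustrative; the Groovy script's actual handling may differ):

```python
from datetime import datetime, timezone

def to_epoch_millis(point_in_time: str) -> int:
    """Parse a 'dd/MM/yyyy HH:mm' CLI timestamp into epoch milliseconds,
    the unit CloudWatch's FilterLogEvents startTime/endTime expect."""
    dt = datetime.strptime(point_in_time, "%d/%m/%Y %H:%M")
    # Assume the timestamp is given in UTC; the real script may use local time.
    return int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)

print(to_epoch_millis("27/09/2016 08:00"))
```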

Analyse your logs

Setup Kibana to analyse your logs:

  • Access Kibana
  • Select Set up index patterns
  • Enter the EC2 name used to extract the logs in the Index pattern field
  • Kibana will fetch the index mapping and propose timestamp as Time Filter field name
  • Click on Create Index Pattern and ... you are good to go!

Click on Discover and start analysing your logs.

Destroy the ELK stack

Logs are persisted inside the ELK container, not on the host. When you are done with your analysis, run docker-compose down to destroy the container (and the loaded logs with it).
