
Parsing logs locally


For analyzing logs locally in complex cases, you can use the ELK stack (Elasticsearch + Logstash + Kibana).

To do this:

  1. Download all the logs from S3. You can use the AWS CLI:
aws s3 cp --recursive "s3://<bucket>/<filter>" .
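If you are unsure which prefix to download, you can list the bucket contents first; <bucket> and <filter> are placeholders for your own bucket name and key prefix:
aws s3 ls --recursive "s3://<bucket>/<filter>"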
  2. Concatenate all the logs into a single file:
find ../raw -type f -exec cat {} >> merged.log \;
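As a quick sanity check that the merge produced what you expect, you can look at the size and the first few lines of the merged file:
wc -l merged.log
head -n 3 merged.log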
  3. Clone the deviantony/docker-elk repository:
git clone https://github.com/deviantony/docker-elk.git
  4. Modify the Logstash configuration (<docker-elk repo>/logstash/pipeline/logstash.conf); the log line format it expects is shown below the configuration:
input {
	tcp {
		port => 5000
	}
}

filter {
	grok {
		match => {
			"message" => "%{DATA:timestamp}	(?<source>[^\t]+)	(?<payload>[^\t]+)"
		}
		remove_field => ["host", "message"]
	}
	date {
		match => [ "timestamp", "ISO8601" ]
		remove_field => ["timestamp"]
	}
	json {
		source => "payload"
		remove_field => ["payload"]
	}
}

output {
	elasticsearch {
		hosts => "elasticsearch:9200"
	}
}
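For reference, the grok pattern above expects each line of merged.log to contain three tab-separated fields: an ISO8601 timestamp, a source identifier, and a JSON payload. A hypothetical example line (<TAB> stands for a literal tab character):
2018-11-01T12:00:00.000Z<TAB>my-service<TAB>{"level":"INFO","message":"request handled"}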
  5. Start the ELK stack with Docker Compose (run from the repository root):
docker-compose up
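To confirm that the Elasticsearch, Logstash and Kibana containers are all running, you can check with a standard Docker Compose command:
docker-compose ps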
  6. Wait until the stack has finished loading.
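One way to check readiness is to poll the Elasticsearch cluster health endpoint until it answers with a yellow or green status:
curl 'http://localhost:9200/_cluster/health?pretty'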
  7. Create an index pattern in Kibana (the kbn-version header must match your Kibana version; 6.4.2 in this example):
curl -XPOST -D- 'http://localhost:5601/api/saved_objects/index-pattern' \
    -H 'Content-Type: application/json' \
    -H 'kbn-version: 6.4.2' \
    -d '{"attributes":{"title":"logstash-*","timeFieldName":"@timestamp"}}'
  8. Clear the read-only block on the Elasticsearch indices (Elasticsearch sets index.blocks.read_only_allow_delete automatically when disk space runs low, which would block indexing):
curl -XPUT -H "Content-Type: application/json" http://localhost:9200/_all/_settings -d '{"index.blocks.read_only_allow_delete": null}'
  9. Send the data to Logstash:
nc localhost 5000 < merged.log
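To verify that the events were actually indexed, you can count the documents in the Logstash indices (the index pattern below assumes the default logstash-* naming):
curl 'http://localhost:9200/logstash-*/_count?pretty'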

You can then browse the logs in Kibana: http://localhost:5601/
