- Beats (FileBeat for log files)
Run setup-win.sh from a Git Bash prompt to download and unzip all packages.

- start-db-and-ui.cmd - starts Elasticsearch and Kibana with the default config.
- start-logstash.cmd - starts Logstash.
- start-log-generator.cmd - starts a .NET Core console app that generates logs.
- start-filebeat.cmd - starts FileBeat configured to load the logs generated by the .NET Core app.
- Inputs
  - TCP port 5000 - Beats.
  - TCP port 5001 - plain text, for testing.
  - TCP port 5002 - JSON, for testing.
Telnet can be used to connect to the testing ports and send log events for parsing.
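Instead of telnet, a short script can push a test event. The sketch below (hypothetical helper names; assumes Logstash is listening locally on port 5001) builds a line in the log4net pattern and sends it to the plain-text test input:

```python
import socket
from datetime import datetime, timezone

def build_log4net_line(level="INFO", logger="Test.Logger", message="hello from test"):
    """Build a line in the common log4net pattern that 55_log4net.conf expects."""
    # Truncate %f (microseconds) to millisecond precision, comma-separated.
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S,%f")[:-3]
    return f"{ts} [main] {level} {logger} - {message}"

def send_test_event(line, host="127.0.0.1", port=5001):
    """Send one newline-terminated plain-text event to the Logstash tcp test input."""
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall((line + "\n").encode("utf-8"))
```

Calling `send_test_event(build_log4net_line())` should make the event appear on Logstash's stdout output, since the tcp input tags it with "test".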
- Filters
  - Fingerprint - calculates a hash of the log message, which is then used to de-duplicate log entries in Elasticsearch.
  - Log4Net - processes only messages having logtype=log4net. It parses the common Log4Net log pattern into a timestamp, log level, thread, logger and message.
- Outputs
  - Elasticsearch - for non-test log events.
  - Stdout - for test log events. This is used to debug and tune the filters.
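The Log4Net parsing can be exercised outside Logstash. The sketch below translates the grok expression from 55_log4net.conf into a Python regex (grok's `(?<name>...)` named groups become `(?P<name>...)`); the sample log line in the usage note is made up:

```python
import re

# Python translation of the grok pattern in 55_log4net.conf.
LOG4NET_RE = re.compile(
    r"^(?P<ts>2\d{3}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[,.]\d{3,4}) "
    r"(\[(?P<thread>\w+)\] )?(?P<level>\w+)\s+(?P<logger>[^\s]+)"
    r"( \[(?P<ctx>[^\]]+)\])? - (?P<message>.*)"
)

def parse_log4net(line):
    """Return the named fields the grok filter would extract, or None on no match."""
    m = LOG4NET_RE.match(line)
    return m.groupdict() if m else None
```

For example, `parse_log4net("2024-01-02 03:04:05,123 [7] INFO My.App.Worker - job started")` yields the ts, thread, level, logger and message fields, with ctx set to None.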
The Logstash config is split across multiple files under config/logstash:
# config/logstash/01_input-beats.conf:
input {
  beats {
    id => "input-beats"
    port => 5000
  }
}
# config/logstash/01_input-tcp-json.conf:
input {
  tcp {
    id => "input-tcp-json-test"
    port => 5002
    codec => json
    tags => ["test"]
  }
}
# config/logstash/01_input-tcp-plain-test.conf:
input {
  tcp {
    id => "input-tcp-plain-test"
    port => 5001
    add_field => {
      "logtype" => "log4net"
    }
    tags => ["test"]
  }
}
# config/logstash/50_fingerprint.conf:
filter {
  fingerprint {
    id => "fingerprint"
    method => "MD5"
    key => "key"
    target => "doc_id"
    base64encode => false
  }
}
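When a key is supplied, the fingerprint filter computes a keyed hash (HMAC) of the source field (message by default), and with base64encode => false the doc_id comes out as a hex digest. A rough Python equivalent, assuming that HMAC-MD5 behavior:

```python
import hashlib
import hmac

def fingerprint(message, key="key"):
    """HMAC-MD5 hex digest of the message, mirroring the doc_id the filter produces."""
    return hmac.new(key.encode("utf-8"), message.encode("utf-8"), hashlib.md5).hexdigest()
```

Identical messages map to the same doc_id, which is what lets the Elasticsearch output de-duplicate via upserts.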
# config/logstash/55_log4net.conf:
filter {
  if ([logtype] == "log4net") {
    grok {
      id => "match-message"
      match => {
        "message" => "^(?<ts>2\d{3}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}[,.]\d{3,4}) (\[(?<thread>\w+)\] )?(?<level>\w+)\s+(?<logger>[^\s]+)( \[(?<ctx>[^\]]+)\])? - (?<message>.*)"
      }
      overwrite => [ "message" ]
    }
    date {
      id => "parse-timestamp"
      match => [ "ts", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss.SSSS", "yyyy-MM-dd HH:mm:ss,SSSS" ]
      timezone => "UTC"
      remove_field => [ "ts" ]
    }
  }
}
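The four date patterns differ only in the fraction separator (dot vs comma) and digit count (3 or 4). A small Python sketch of the same fallback parsing; strptime's %f absorbs both 3- and 4-digit fractions, so two format strings suffice:

```python
from datetime import datetime, timezone

# The patterns 55_log4net.conf tries, expressed as strptime formats.
FORMATS = ("%Y-%m-%d %H:%M:%S.%f", "%Y-%m-%d %H:%M:%S,%f")

def parse_ts(ts):
    """Parse the extracted ts field as UTC, like the date filter with timezone => 'UTC'."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(ts, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            pass  # try the next separator variant
    return None
```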
# config/logstash/90_elasticsearch.conf:
output {
  if ("test" not in [tags]) {
    elasticsearch {
      doc_as_upsert => true
      document_id => "%{doc_id}"
      hosts => ["http://127.0.0.1:9200"]
      id => "elasticsearch"
    }
  }
}
# config/logstash/90_stdout_test.conf:
output {
  if ("test" in [tags]) {
    stdout {
      id => "stdout"
      codec => rubydebug { metadata => true }
    }
  }
}
FileBeat is configured to output to Logstash on port 5000 and to look for the prospector config in a sub-folder.
name: "filebeat-shipper"
tags: ["logs"]
fields_under_root: true
path.config: ./config/filebeat/
filebeat.config.prospectors:
  path: 'config/*.yml'
  reload.enabled: true
  reload.period: 10s
output.logstash:
  hosts: ["localhost:5000"]
  enabled: true
output.console:
  enabled: false
  pretty: true
The log file tail config:
# config/filebeat/config/log_generator.yml:
- paths:
    - log-generator\src\LogGenerator\logs\*.log
  fields:
    env: dev
    logtype: log4net
  fields_under_root: true
  #tags: ["test"]
  multiline.pattern: '^20[0-9]{2}-[01][0-9]-[0123][0-9]'
  multiline.negate: true
  multiline.match: after
  ignore_older: 0
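The multiline settings (negate: true, match: after) append every line that does not start with a date onto the preceding event, so stack traces travel with their log entry. A small Python sketch of that grouping (the sample lines are made up):

```python
import re

# Same date prefix as multiline.pattern in log_generator.yml.
TS_PREFIX = re.compile(r"^20[0-9]{2}-[01][0-9]-[0123][0-9]")

def group_multiline(lines):
    """Lines not starting with a date are appended to the previous event."""
    events = []
    for line in lines:
        if TS_PREFIX.match(line) or not events:
            events.append(line)
        else:
            events[-1] += "\n" + line
    return events
```

For example, an ERROR line followed by an exception and stack frame lines becomes one event, and the next dated line starts a new one.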