A demonstration of a logging system for Python projects: fast and reliable log collection.
- Python project -- sends logs (see the Python sketch after this list).
- Rsyslog -- collects logs from the projects (there can be many rsyslog instances on many servers).
- Redis -- message queue between rsyslog and Logstash.
- Logstash -- retrieves data from Redis, selects an index, and writes the data into ElasticSearch.
- ElasticSearch -- log storage.
- Kibana -- web interface.
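A minimal sketch of the Python side, assuming rsyslog is reachable inside the compose network under the hostname `rsyslog` on UDP port 514 (both the service name and the port are assumptions, not taken from the compose file); the JSON field names mirror the example record shown further below:

```python
import json
import logging
import logging.handlers

class JSONFormatter(logging.Formatter):
    """Serialize the record as JSON so Logstash can keep the structure."""
    def format(self, record):
        payload = {
            "name": record.name,
            "module": record.module,
            "lineno": record.lineno,
            "message": record.getMessage(),
        }
        # Attach any extra fields passed via `extra={...}` on the log call.
        for key in ("random_string", "random_integer"):
            if hasattr(record, key):
                payload[key] = getattr(record, key)
        return json.dumps(payload)

# "rsyslog" and 514 are assumptions about the compose setup.
handler = logging.handlers.SysLogHandler(address=("rsyslog", 514))
handler.setFormatter(JSONFormatter())

logger = logging.getLogger("app_name")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
```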
- Run:
sudo docker-compose up
- Open Kibana: 127.0.0.1:5601/app/kibana
- Go to Management -> Index patterns.
- Click on "refresh fields". If "create" button still inactive then wait while ElasticSearch is ran.
- Click "create"
- Go to "Discover". This is your data :)
If you want to see the indices, go to the "Dev Tools" section and run this command:
GET /_cat/indices
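The same information can also be fetched outside Kibana. A small sketch using only the standard library, assuming ElasticSearch's port 9200 is published to the host (that port mapping is an assumption about the compose file):

```python
import urllib.request

# 127.0.0.1:9200 is an assumption; adjust to the published ElasticSearch port.
with urllib.request.urlopen("http://127.0.0.1:9200/_cat/indices?v") as resp:
    print(resp.read().decode())
```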
An example of a single log message stored in ElasticSearch:
{
"@timestamp": "2018-04-27T12:18:39.199Z",
"@version": "1",
"message": {
"name": "app_name",
"module": "app",
"lineno": 79,
"message": null,
"random_string": "ydrvlhdruj",
"random_integer": 302
},
"facility_label": "user",
"facility": "1",
"hostname": "pythonsysloglogstash_psl-project_1.pythonsysloglogstash_default",
"program": "",
"relayhost": "pythonsysloglogstash_psl-project_1.pythonsysloglogstash_default",
"relayip": "172.21.0.7",
"severity_label": "crit",
"severity": "2",
"tag": "",
"type": "syslog"
}
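For reference, a record like the one above could be produced by a call along these lines (a hypothetical example building on the handler sketch earlier; the `extra` field names match the document's example, and the "crit" severity corresponds to `logger.critical()`):

```python
import logging
import random
import string

logger = logging.getLogger("app_name")  # the logger configured with the SysLogHandler above

logger.critical(
    "",  # the stored record shows "message": null; the text itself is not required
    extra={
        "random_string": "".join(random.choices(string.ascii_lowercase, k=10)),
        "random_integer": random.randint(0, 1000),
    },
)
```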