For starters, MySQL is used as the database, but it can be replaced with Elasticsearch to improve performance at scale.
- Clone the repo
- Install dependencies
pip install -r requirements.txt
- Set up the Flask environment
export FLASK_APP=server.py
- Create config file
cp config_example.py config.py
- Add all necessary config data in config.py
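As a rough guide, config.py usually needs the MySQL connection details and the ZeroMQ endpoint of the central server. The keys and values below are only an illustration, not the actual contents of config_example.py:

```python
# Illustrative values only -- use the keys provided in config_example.py.
MYSQL_HOST = "localhost"
MYSQL_USER = "loguser"
MYSQL_PASSWORD = "changeme"
MYSQL_DB = "central_logs"        # database the server writes to

ZMQ_SERVER_HOST = "10.0.0.5"     # address of the machine running zmqserver.py
ZMQ_SERVER_PORT = 5555           # port the clients send log records to
```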
- Create a database with the name given in config.py, and a table with the columns (client, log_level, message, created_at)
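For reference, here is a minimal sketch of that schema created via PyMySQL; the database name, column types, and the extra id primary key are assumptions, so adjust them to match your config.py:

```python
# Illustrative schema only -- names and types are assumptions.
import pymysql

conn = pymysql.connect(host="localhost", user="loguser", password="changeme")
with conn.cursor() as cur:
    cur.execute("CREATE DATABASE IF NOT EXISTS central_logs")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS central_logs.logs (
            id INT AUTO_INCREMENT PRIMARY KEY,      -- added for convenience
            client VARCHAR(255),
            log_level VARCHAR(32),
            message TEXT,
            created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
        )
    """)
conn.commit()
conn.close()
```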
- Run the following on each client system, i.e. the machines where the logs are scattered
python zmqclient.py
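Conceptually, each client opens a ZeroMQ socket to the central server and ships log records to it. The sketch below illustrates that idea with a PUSH socket and JSON payloads; the real zmqclient.py may use a different socket type, address, and message format:

```python
# Illustrative only -- address, port and payload format are assumptions.
import socket
import zmq

SERVER = "tcp://10.0.0.5:5555"   # assumed endpoint of the central zmqserver.py

ctx = zmq.Context()
push = ctx.socket(zmq.PUSH)
push.connect(SERVER)

def ship(level, message):
    """Send one log record to the central server."""
    push.send_json({
        "client": socket.gethostname(),   # which machine the log came from
        "log_level": level,
        "message": message,
    })

ship("INFO", "client started")
```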
- Run the following on the central server where the logs are to be collected
python zmqserver.py
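On the server side, the idea is the reverse: bind a ZeroMQ socket, receive the records, and insert them into the logs table. Again a hedged sketch under the same assumptions, not the repository's actual zmqserver.py:

```python
# Illustrative only -- connection settings would normally come from config.py.
import pymysql
import zmq

ctx = zmq.Context()
pull = ctx.socket(zmq.PULL)
pull.bind("tcp://*:5555")        # must match the port the clients connect to

db = pymysql.connect(host="localhost", user="loguser",
                     password="changeme", database="central_logs")

while True:
    record = pull.recv_json()    # blocks until a client sends a record
    with db.cursor() as cur:
        cur.execute(
            "INSERT INTO logs (client, log_level, message) VALUES (%s, %s, %s)",
            (record["client"], record["log_level"], record["message"]),
        )
    db.commit()
```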
- (Optional) To debug, run
export FLASK_DEBUG=1
- Run the Flask web server
flask run
- Open localhost:5000 in a browser on the local system to view the logs as a table
- To view server statistics, visit localhost:5000/stats; the data can be refreshed by clicking the refresh button
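For orientation only, here is a minimal sketch of how server.py could expose these two views with Flask; the actual routes, queries, and template names in the repo will differ:

```python
# Illustrative only -- routes, queries and the template name are assumptions.
import pymysql
from flask import Flask, jsonify, render_template

app = Flask(__name__)

def get_db():
    return pymysql.connect(host="localhost", user="loguser",
                           password="changeme", database="central_logs",
                           cursorclass=pymysql.cursors.DictCursor)

@app.route("/")
def index():
    conn = get_db()
    with conn.cursor() as cur:
        cur.execute("SELECT client, log_level, message, created_at "
                    "FROM logs ORDER BY created_at DESC LIMIT 100")
        rows = cur.fetchall()
    conn.close()
    return render_template("logs.html", logs=rows)   # hypothetical template

@app.route("/stats")
def stats():
    conn = get_db()
    with conn.cursor() as cur:
        cur.execute("SELECT log_level, COUNT(*) AS count FROM logs GROUP BY log_level")
        rows = cur.fetchall()
    conn.close()
    return jsonify(rows)
```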