The project is written in Python. A health checker makes requests to the configured websites and publishes the `status_code` and `response_time` to a message broker on a specific topic, in this case Kafka. The consumer listens on the same topic, validates each message it receives, and saves the information into a live table in the database.
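A minimal sketch of the checker-to-Kafka flow described above, assuming the `requests` and `kafka-python` libraries; the topic name and payload shape are illustrative assumptions, not the project's actual schema.

```python
import json
import time


def build_payload(url: str, status_code: int, response_time: float) -> dict:
    """Shape the message the consumer will later validate (assumed schema)."""
    return {
        "url": url,
        "status_code": status_code,
        "response_time": round(response_time, 3),
        "checked_at": int(time.time()),
    }


def check_website(url: str) -> dict:
    """Request the URL and capture its status code and response time."""
    import requests  # imported lazily so the payload helper stays dependency-free

    start = time.monotonic()
    response = requests.get(url, timeout=10)
    elapsed = time.monotonic() - start
    return build_payload(url, response.status_code, elapsed)


# Publishing side (needs a running broker; topic name is an assumption):
# from kafka import KafkaProducer
# producer = KafkaProducer(
#     bootstrap_servers="localhost:9092",
#     value_serializer=lambda v: json.dumps(v).encode("utf-8"),
# )
# producer.send("website_health", check_website("https://example.com"))
```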
The database consists of a `live` and a `historical` table. As the name suggests, the `live` table holds the most recent information for each website. Two triggers on the `live` table insert the information into the `historical` table on INSERT / UPDATE. The historical data is useful for understanding at what times in the past a website was unreachable or returned a status code other than 200. It can also be used to spot significant delays in responses from the server.
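The consumer's validate-then-save step could look like the sketch below. The field names, table and column names, and the use of psycopg2 are assumptions for illustration; the idea is that an upsert into `live` fires the INSERT/UPDATE triggers that feed `historical` on every check.

```python
import json

# Fields the consumer expects in every message (assumed schema).
REQUIRED_FIELDS = {"url", "status_code", "response_time"}


def validate_message(raw: bytes) -> dict:
    """Decode a Kafka message and ensure the expected fields are present."""
    payload = json.loads(raw)
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"message missing fields: {sorted(missing)}")
    return payload


# One row per URL in `live`; the ON CONFLICT upsert means every check
# triggers either the INSERT or the UPDATE trigger on the table.
UPSERT_SQL = """
INSERT INTO live (url, status_code, response_time)
VALUES (%(url)s, %(status_code)s, %(response_time)s)
ON CONFLICT (url) DO UPDATE
    SET status_code = EXCLUDED.status_code,
        response_time = EXCLUDED.response_time;
"""

# Usage with psycopg2 (requires a running Postgres):
# with conn.cursor() as cur:
#     cur.execute(UPSERT_SQL, validate_message(msg.value))
```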
The image below explains the thought process behind the development of the codebase.
- Pipenv
- Python3
- Postgres server
- Kafka service
- Docker (For flyway migrations)
- First create the `.env` file with the contents from `.env.test`. Replace them with actual values.
- Activate the shell using `pipenv shell`.
- Install the necessary dependencies: `pipenv install`.
- Run the database migrations: `make run-migrations DB_HOST=<host> DB_PORT=<port> DB_NAME=<db_name> DB_USER=<user> DB_PASSWORD=<password>`. You can use either Aiven's Postgres or your local Postgres server.
- Start the consumer: `make start-consumer`.
- Start the producer: `make start-producer`.
- Create the Kafka service on aiven.io and download the certs (`service.cert`, `service.key`, `ca.pem`)
- Copy the downloaded files into the `src/certs` folder
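A hedged sketch of wiring the downloaded Aiven certs into kafka-python's SSL options; the cert directory and broker address are assumptions based on the steps above.

```python
import os

CERT_DIR = "src/certs"


def kafka_ssl_config(cert_dir: str = CERT_DIR) -> dict:
    """Build the SSL connection kwargs accepted by kafka-python clients."""
    return {
        "security_protocol": "SSL",
        "ssl_cafile": os.path.join(cert_dir, "ca.pem"),
        "ssl_certfile": os.path.join(cert_dir, "service.cert"),
        "ssl_keyfile": os.path.join(cert_dir, "service.key"),
    }


# Usage (requires a reachable broker and the downloaded cert files):
# from kafka import KafkaProducer
# producer = KafkaProducer(
#     bootstrap_servers="<service>.aivencloud.com:<port>",
#     **kafka_ssl_config(),
# )
```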
`python3 -m pytest tests/**/*.py`