1. Nginx - the endpoint at port 3000 functions as a load balancer, distributing incoming requests between the two Express servers (we can add more as required)
2. Express web servers - their only job is to publish the log message to a message queue, in our case RabbitMQ
3. RabbitMQ - its job is to fill the queue and provide consumers a steady flow of log messages to consume
4. Message Consumer & REST server - its job is to consume the messages and post them to a datasource, in this case the Postgres RDBMS; here I am posting messages via a simple POST request
5. Postgres RDBMS - its job is to function as a datasource and provide a queryable interface to other consumers
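The publish step in (2) can be sketched in a few lines. This is a hedged sketch, not the actual handler: it assumes the amqplib package, and the queue name `logs` and the helper names `toMessage`/`makeLogRoute` are illustrative.

```javascript
// Hedged sketch of an Express server publishing a log to RabbitMQ.
// Assumes the amqplib package; names here are illustrative, not from the repo.
const QUEUE = 'logs';

// Pure helper: serialize a log object into a Buffer for the queue.
function toMessage(log) {
  return Buffer.from(JSON.stringify(log));
}

// Wiring (not executed here): an Express route handler bound to an
// amqplib channel, which pushes the request body onto the queue.
function makeLogRoute(channel) {
  return (req, res) => {
    channel.sendToQueue(QUEUE, toMessage(req.body), { persistent: true });
    res.status(202).json({ status: 'queued' });
  };
}
```

Keeping the server's job this small is what makes it cheap to add more Express instances behind Nginx.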
- React - UI library
- Express - HTTP server
- RabbitMQ - message broker
- Tailwind - CSS framework
- Sequelize - ORM
- Postgres - RDBMS
- Nginx - load balancer
To get a local copy up and running, follow these simple steps.
First, clone the client-nginx folder.
Make sure you have Docker installed on your PC.
Then change into the folder you just cloned and run the following:

```sh
docker-compose up --build
```

You should now see 1 Nginx, 2 Node server, 1 Postgres, and 1 RabbitMQ containers.
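For orientation, the compose file wires those containers together roughly like this. This is only an illustrative sketch; the service names, images, build contexts, and ports here are assumptions and may differ from the actual docker-compose.yml in client-nginx.

```yaml
# Illustrative sketch only - the real docker-compose.yml may differ.
version: "3.8"
services:
  nginx:
    image: nginx:alpine
    ports:
      - "3000:3000"          # load-balancer entry point
    depends_on:
      - server1
      - server2
  server1:
    build: ./server          # hypothetical Express server build context
  server2:
    build: ./server
  rabbitmq:
    image: rabbitmq:3-management
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: example   # placeholder credential
```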
- Clone the Consumer-backend & Consumer-frontend repos
- After cloning, run the following inside Consumer-backend to ensure all dependencies are installed, then run the db migrations:

```sh
npm i
npx sequelize-cli db:migrate
```

- After running the migrations, we can start up our consumer-backend server:

```sh
npm run dev
```

- To verify that the consumer-backend is working fine, check the following endpoints:

http://localhost:8081/ and http://localhost:8081/hello

These endpoints confirm your backend is up and connected to the db.
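Under the hood, the consumer-backend's job (component 4 above) is to drain the queue and forward each log via a POST request. A minimal sketch of that loop, assuming amqplib and Node 18+'s global fetch; the queue name, endpoint URL, and helper names are illustrative, not from the repo:

```javascript
// Hedged sketch of the consume-and-forward loop described above.
// Assumes amqplib and Node 18+ fetch; names/URLs are illustrative.
const QUEUE = 'logs';
const INGEST_URL = 'http://localhost:8081/logs'; // hypothetical REST endpoint

// Pure helper: turn a raw queue message into fetch() POST options.
function toPostOptions(rawContent) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: rawContent.toString(),
  };
}

// Wiring (not executed here): ack a message only after the POST succeeds,
// otherwise requeue it so no log is silently dropped.
async function startConsumer(channel) {
  await channel.assertQueue(QUEUE, { durable: true });
  channel.consume(QUEUE, async (msg) => {
    const res = await fetch(INGEST_URL, toPostOptions(msg.content));
    if (res.ok) channel.ack(msg);
    else channel.nack(msg, false, true); // requeue on failure
  });
}
```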
- Now we can run our frontend; clone the Consumer-frontend repo and simply run:

```sh
npm run dev
```

- Now you can view the frontend at http://localhost:5173/
- Hurray, your system setup is complete!
You can post log messages in the following format:
```json
{
  "level": "problem",
  "message": " a new message to abhi",
  "resourceId": "server-1434",
  "timestamp": "2023-09-15T08:00:00Z",
  "traceId": "abc-xyz-123",
  "spanId": "span-456",
  "commit": "5e5342f",
  "metadata": {
    "parentResourceId": "server-0987"
  }
}
```
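A quick sanity check of this shape can be expressed in a few lines. This is a hedged sketch: the field list simply mirrors the sample payload above, and `validateLog` is an illustrative name, not a function from the actual codebase.

```javascript
// Hedged sketch: check that a payload carries every field of the log format.
// Field list mirrors the sample payload; validateLog is an illustrative name.
const REQUIRED_FIELDS = [
  'level', 'message', 'resourceId', 'timestamp',
  'traceId', 'spanId', 'commit', 'metadata',
];

function validateLog(log) {
  const missing = REQUIRED_FIELDS.filter((field) => !(field in log));
  return { ok: missing.length === 0, missing };
}
```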
Use Postman to post these messages to
http://localhost:3000/logs
Now you can use the frontend to filter and search through the messages with the filters provided.
Filters combine on an AND basis.
Some example queries:
- Find all logs with the level set to "error".
- Search for logs with the message containing the term "Failed to connect".
- Retrieve all logs related to resourceId "server-1234".
- Filter logs between the timestamps "2023-09-10T00:00:00Z" and "2023-09-15T23:59:59Z". (Bonus)
- Allow combining multiple filters.
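The AND semantics of combined filters can be sketched in memory like this. This is only an illustrative sketch of the behavior; the actual filtering runs against Postgres via Sequelize, and the filter keys (`message`, `from`, `to`) are assumed names:

```javascript
// Hedged sketch: AND-combined filters over logs, in memory.
// Real filtering happens in Postgres; filter keys here are illustrative.
function matches(log, filters) {
  return Object.entries(filters).every(([key, value]) => {
    if (key === 'message') return log.message.includes(value); // substring search
    if (key === 'from') return log.timestamp >= value;         // ISO strings compare lexically
    if (key === 'to') return log.timestamp <= value;
    return log[key] === value; // exact match (level, resourceId, ...)
  });
}

const logs = [
  { level: 'error', message: 'Failed to connect to DB', resourceId: 'server-1234', timestamp: '2023-09-12T10:00:00Z' },
  { level: 'info',  message: 'Connected',               resourceId: 'server-1234', timestamp: '2023-09-16T10:00:00Z' },
];

// Every filter must hold at once, so only the first log matches.
const hits = logs.filter((log) => matches(log, {
  level: 'error',
  message: 'Failed to connect',
  from: '2023-09-10T00:00:00Z',
  to: '2023-09-15T23:59:59Z',
}));
```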
There are many improvements that could be made to the system; given the time constraint, I could not explore and implement them all.
Some of the improvements/explorations that could be made:
- Implementing the full-text search (tsvector) feature of Postgres for faster search
- Implementing role-based login for the query UI
- Using a time-series DB, such as https://www.mongodb.com/docs/manual/core/timeseries-collections/, as a datasource
- Exploring a cloud-based archival solution to save old logs to something like S3
- Maybe using something other than HTTP calls
- Exploring possibilities with Elasticsearch
Abhinav Singh - abhinav16197@gmail.com
- Thanks to the Dyte team for providing this problem statement.
- It was an amazing learning experience for me, building this system end to end.
- I would love to know what you think of my solution/submission!