The log shipper should be able to read logs from a configurable source and forward them to the receiver. The scope of this exercise is to support the file system as a source, and for this source the following should be supported (a minimal shipper sketch follows the list):
- Detect when new files are added to the observed directories on the file system
- Post the rows of the text files in those directories to the REST API exposed by the log receiver
- Delta updates (rows appended to existing log files after the initial upload) are desirable but not required
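The sketch below shows one possible shape of the shipper loop. It assumes the fsnotify library for detecting new files, a POST /v1/logs endpoint on the receiver, and /testdata as the watched directory; none of these details are prescribed by the requirements above.

// Minimal shipper sketch: watch a directory and post each row of new files to the receiver.
// Assumes github.com/fsnotify/fsnotify and a hypothetical POST /v1/logs endpoint.
package main

import (
	"bufio"
	"bytes"
	"log/slog"
	"net/http"
	"os"

	"github.com/fsnotify/fsnotify"
)

func main() {
	watcher, err := fsnotify.NewWatcher()
	if err != nil {
		slog.Error("creating watcher", "err", err)
		os.Exit(1)
	}
	defer watcher.Close()

	// Watch the configured source directory (default assumed here).
	if err := watcher.Add("/testdata"); err != nil {
		slog.Error("watching directory", "err", err)
		os.Exit(1)
	}

	for event := range watcher.Events {
		// React only to newly created files; delta updates would also handle
		// write events and remember a per-file offset instead of re-reading.
		if event.Op&fsnotify.Create == 0 {
			continue
		}
		shipFile(event.Name)
	}
}

// shipFile posts every row of a log file to the receiver.
func shipFile(path string) {
	f, err := os.Open(path)
	if err != nil {
		slog.Error("opening log file", "path", path, "err", err)
		return
	}
	defer f.Close()

	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		// Each row is assumed to already be a JSON-encoded log entry.
		resp, err := http.Post("http://localhost:8080/v1/logs", "application/json",
			bytes.NewReader(scanner.Bytes()))
		if err != nil {
			slog.Error("posting log row", "err", err)
			continue
		}
		resp.Body.Close()
	}
}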
The log receiver accepts incoming log rows, stores them, and makes them available for an end-user to read. A REST API should be exposed for receiving the log rows and for reading them (a minimal store sketch follows the list):
- Only unique rows shall be stored
- The id parameter in the incoming logs should be used for uniqueness
- Basic filtering is a plus but not required: the user should be able to filter on time received via the read API of the log receiver
- The logs must be stored in memory in the receiver
- A bonus (but not a requirement) is to store the logs in a persistent store in addition to the in-memory store
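The following sketch shows how an in-memory store could satisfy the uniqueness and filtering requirements. It assumes a simplified LogEntry type; in the actual project the types are generated from the OpenAPI spec, so the field names here are illustrative only.

// Minimal in-memory store sketch: entries are keyed by id, so duplicates are kept only once.
package store

import (
	"sync"
	"time"
)

// LogEntry is a simplified, assumed shape of a log row.
type LogEntry struct {
	ID         string            `json:"id"`
	Message    string            `json:"message"`
	Severity   string            `json:"severity"`
	Timestamp  time.Time         `json:"timestamp"`
	Attributes map[string]string `json:"attributes"`
}

// InMemory stores log entries keyed by id.
type InMemory struct {
	mu      sync.RWMutex
	entries map[string]LogEntry
}

func NewInMemory() *InMemory {
	return &InMemory{entries: map[string]LogEntry{}}
}

// Add stores the entry unless one with the same id already exists.
func (s *InMemory) Add(e LogEntry) {
	s.mu.Lock()
	defer s.mu.Unlock()
	if _, ok := s.entries[e.ID]; !ok {
		s.entries[e.ID] = e
	}
}

// Get returns the entry with the given id, if present.
func (s *InMemory) Get(id string) (LogEntry, bool) {
	s.mu.RLock()
	defer s.mu.RUnlock()
	e, ok := s.entries[id]
	return e, ok
}

// List returns entries whose timestamp falls within [from, to];
// a zero from or to disables the respective bound.
func (s *InMemory) List(from, to time.Time) []LogEntry {
	s.mu.RLock()
	defer s.mu.RUnlock()
	var out []LogEntry
	for _, e := range s.entries {
		if !from.IsZero() && e.Timestamp.Before(from) {
			continue
		}
		if !to.IsZero() && e.Timestamp.After(to) {
			continue
		}
		out = append(out, e)
	}
	return out
}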
The implementation of the Snow coding challenge for a Log Management solution consists of two main components: a web server (Log receiver) and a log watcher (Log shipper). Both components share a single Dockerfile and internal libraries for convenience. Tests are currently added only for the InMemory storage.
It utilizes some commonly used libraries and tools to facilitate development (a short wiring sketch follows the list), such as:
- oapi-codegen: a Go-centric OpenAPI client and server code generator.
- go-chi: a lightweight, idiomatic Go HTTP server and router.
- slog: the structured logger from the standard library.
- testify: a handy toolkit for assertions in tests.
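As a rough illustration of how these pieces fit together, the sketch below wires a go-chi router with slog logging for the two read endpoints shown later in this document. In the actual project the handler interfaces are generated by oapi-codegen from the OpenAPI spec, so the inline handlers here are placeholders only.

// Minimal receiver wiring sketch: go-chi router plus slog, placeholder handlers.
package main

import (
	"encoding/json"
	"log/slog"
	"net/http"

	"github.com/go-chi/chi/v5"
)

func main() {
	r := chi.NewRouter()

	// Read API: list (optionally filtered by from/to) and fetch by id.
	r.Get("/v1/logs", func(w http.ResponseWriter, req *http.Request) {
		from, to := req.URL.Query().Get("from"), req.URL.Query().Get("to")
		slog.Info("listing logs", "from", from, "to", to)
		json.NewEncoder(w).Encode([]any{}) // placeholder response
	})
	r.Get("/v1/logs/{id}", func(w http.ResponseWriter, req *http.Request) {
		id := chi.URLParam(req, "id")
		slog.Info("fetching log", "id", id)
		w.WriteHeader(http.StatusNotFound) // placeholder response
	})

	slog.Info("receiver listening", "addr", ":8080")
	if err := http.ListenAndServe(":8080", r); err != nil {
		slog.Error("server stopped", "err", err)
	}
}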
For complete API documentation, see the OpenAPI Spec document. You can view it in the Swagger Editor.
This project follows design and development principles described in:
Use docker-compose to build and run both services:
docker-compose up --build
Add log files to the configured directory (/testdata by default) and fetch the collected logs:
curl http://localhost:8080/v1/logs
[
  {
    "attributes": {},
    "id": "a5843dcb-9f21-4123-9c7c-688f0e8b88a7",
    "message": "Task faulted: 'Failed to listen on port 80'",
    "severity": "Error",
    "timestamp": "2021-11-10T13:18:52Z"
  },
  {
    "attributes": {
      "test": "test"
    },
    "id": "0d3f329c-3d20-4975-9beb-cf4425d3a138",
    "message": "Task faulted 3: 'Failed to listen on port 80'",
    "severity": "Error",
    "timestamp": "2021-11-10T13:18:54Z"
  }
]
Get a single log entry by ID:
curl http://localhost:8080/v1/logs/a5843dcb-9f21-4123-9c7c-688f0e8b88a7
{
  "attributes": {},
  "id": "a5843dcb-9f21-4123-9c7c-688f0e8b88a7",
  "message": "Task faulted: 'Failed to listen on port 80'",
  "severity": "Error",
  "timestamp": "2021-11-10T13:18:52Z"
}
Filter log entries by timestamp:
curl 'http://localhost:8080/v1/logs?from=2021-11-10T13%3A18%3A53Z&to=2021-11-10T13%3A18%3A55Z'
[
  {
    "attributes": {
      "test": "test"
    },
    "id": "0d3f329c-3d20-4975-9beb-cf4425d3a138",
    "message": "Task faulted 3: 'Failed to listen on port 80'",
    "severity": "Error",
    "timestamp": "2021-11-10T13:18:54Z"
  }
]