I was asked to solve an assignment for the SWE - Risk & Data position. For some notes, see `thought-process.md`. You can find the resulting CSV files in `data/final-results`.
I am using the `build.sh` script in the `build` directory, but it needs to be slightly adjusted, since it uses Miniconda for environment management. Look at the script to see what's going on.
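If you don't want Miniconda, one way to adapt the setup is a plain virtual environment instead. This is only a sketch, assuming `python3` resolves to Python 3.8 on your machine and that `requirements.txt` sits in `build/` as described below:

```shell
# Sketch: replace the Miniconda environment from build.sh with a plain venv.
# Assumes `python3` is Python 3.8 and build/requirements.txt exists.
python3 -m venv .venv
. .venv/bin/activate
pip install -r build/requirements.txt
```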
How to build without it:
- We will use the kafka-docker image as the Docker image for Apache Kafka. Clone the source repository and replace the `docker-compose.yml` with the one in the `build` directory. Then run `docker-compose up`.
- The Python version is 3.8, and the `faust` and `confluent-kafka` modules are needed. See the `requirements.txt` in the `build` directory.
- Go into the `src` directory and put the data into the broker by running `python produce.py`. After it's done, run the program with `faust -A process worker -l info`. Results can be found in `data/results`.
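Putting the steps above together, an end-to-end run might look like the following sketch (the exact checkout locations are assumptions; adjust paths to where you cloned the repositories):

```shell
# Terminal 1: start the Kafka broker from your kafka-docker checkout,
# with the docker-compose.yml from the build directory swapped in.
docker-compose up

# Terminal 2: from the src directory, load the data into the broker,
# then start the stream processor.
python produce.py
faust -A process worker -l info
# results are written to data/results
```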