This project uses Kafka Streams to interpret syslog in real time and insert the results into MongoDB. The syslog input comes from a messages file.
0. Download and unpack kafka_2.13-2.8.0.tgz to get the ZooKeeper, Kafka server, and Kafka Connect start shell scripts and config files.
$ bash start-zookeeper-kafka-connect-mongodb.sh
$ docker-compose -f docker/docker-compose.yml up
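The compose file itself is not shown here; a minimal sketch of what docker/docker-compose.yml might contain, assuming it only provides the MongoDB instance the sink connector writes to (the service name and image tag are hypothetical):

```yaml
version: "3"
services:
  mongodb:
    image: mongo:4.4           # hypothetical image tag
    ports:
      - "27017:27017"          # matches the MongoDB address/port used below
```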
$ bash ./syslog-messages/connect-distributed-connector-bin/create-local-file-source-connector.sh
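The create-local-file-source-connector.sh script is not listed here; a hedged sketch of the kind of config such a script typically POSTs to the Kafka Connect REST API, using the stock FileStreamSourceConnector (the connector name, file path, and topic name are illustrative assumptions, not taken from this repo):

```json
{
  "name": "messages-file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/var/log/messages",
    "topic": "streams-plaintext-input"
  }
}
```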
$ bash ./docker/syslog-messages/messages-file-source/start-messages-file-source-connector.sh
$ java -cp syslog.realtime.interpret.kafka.streams.messages-0.0.1-SNAPSHOT.jar syslog.realtime.interpret.kafka.streams.messages.LineSplit
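The LineSplit class itself is not listed here. Its core per-record transform, splitting each syslog line into tokens, can be sketched in plain Java (the class name, method name, and split pattern below are assumptions for illustration; in the real app such a function would be applied with KStream.flatMapValues between the source and sink topics):

```java
import java.util.Arrays;
import java.util.List;

public class LineSplitSketch {

    // Hypothetical per-record transform: split one syslog line into tokens.
    // In a Kafka Streams topology this would typically be wired up as:
    //   builder.stream(inputTopic)
    //          .flatMapValues(LineSplitSketch::splitLine)
    //          .to(outputTopic);
    static List<String> splitLine(String line) {
        return Arrays.asList(line.split("\\s+"));
    }

    public static void main(String[] args) {
        // Each input record fans out into one record per token.
        System.out.println(splitLine("Jun 14 15:16:01 combo sshd[19939]: check pass"));
    }
}
```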
$ bash ./docker/syslog-messages/kafka.streams.messages/start-kafka-streams-messages.sh
$ bash ./syslog-messages/connect-distributed-connector-bin/create-mongodb-sink-connector.sh
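Similarly, the sink-connector script presumably registers a config for the official MongoDB Kafka sink connector along these lines (the connector name, topic, database, and collection values are guesses for illustration):

```json
{
  "name": "messages-mongodb-sink",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
    "tasks.max": "1",
    "topics": "streams-linesplit-output",
    "connection.uri": "mongodb://localhost:27017",
    "database": "syslog",
    "collection": "messages"
  }
}
```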
$ bash ./docker/syslog-messages/messages-mongodb-sink/start-messages-mongodb-sink-connector.sh
MongoDB address: localhost, port: 27017
$ docker run -it --rm -p 9000:9000 --network host -e KAFKA_BROKERCONNECT=localhost:9092 -e JVM_OPTS="-Xms32M -Xmx64M" -e SERVER_SERVLET_CONTEXTPATH="/" obsidiandynamics/kafdrop:latest
$ docker run -it --rm --network host -e ZK_HOSTS="localhost:2181" -e APPLICATION_SECRET=letmein sheepkiller/kafka-manager
Access the UI in a browser at http://127.0.0.1:9000. (Kafdrop and Kafka Manager both default to port 9000, so run one at a time.)