Demonstrates InterSystems IRIS and Kafka integration. In this solution, Kafka is installed inside the IRIS container. If you're looking for a solution where IRIS runs natively with a containerised implementation of Kafka, please see here, or for a solution running IRIS and Kafka in separate containers, see here.
Make sure you have Git and Docker Desktop installed.
Clone or pull the repo into any local directory:
$ git clone https://github.com/isc-krakshith/iris-kafka-docker.git
Open the terminal in this directory and run:
$ docker-compose build
Among other things, iris.script, which is called by the installer at this stage, populates the Orders table with synthetic data.
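As a hypothetical sketch of what "synthetic data" for an Orders table might look like, here is a small stdlib-Python generator. The column names and value ranges below are assumptions for illustration only; the repo's actual schema lives in iris.script and the /testdata folder.

```python
# Hypothetical sketch of synthetic order rows, loosely modelled on the
# securities used later in this walkthrough. The column names here are
# assumptions, NOT the repo's actual Orders schema.
import random

SECURITIES = ["SECA", "SECB", "SECC", "SECD"]

def synthetic_orders(n, seed=0):
    """Yield n pseudo-random order rows; a fixed seed keeps runs repeatable."""
    rng = random.Random(seed)
    for i in range(n):
        yield {
            "id": i + 1,
            "security": rng.choice(SECURITIES),
            "price": round(rng.uniform(40.0, 80.0), 2),
            "vol": rng.randrange(100, 500, 50),  # 100..450 in steps of 50
        }

rows = list(synthetic_orders(5))
```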
- Run the IRIS container with your project:
$ docker-compose up -d
Open the InterSystems IRIS Management Portal in your browser.
The default account is _SYSTEM / SYS; you will be required to change the password at first login.
Start Kafka.TraderProduction by clicking the "Start" button.
Open a bash shell in the IRIS container:
docker-compose exec iris bash
Change to the Kafka installation directory:
cd /kafka/kafka_2.13-3.0.1/
In the first shell...
bin/zookeeper-server-start.sh config/zookeeper.properties
In the second shell...
bin/kafka-server-start.sh config/server.properties
In the third shell...
bin/kafka-topics.sh --create --topic bids-Asks --bootstrap-server localhost:9092
bin/kafka-topics.sh --describe --topic bids-Asks --bootstrap-server localhost:9092
bin/kafka-console-producer.sh --topic bids-Asks --bootstrap-server localhost:9092
In the fourth shell...
bin/kafka-console-consumer.sh --topic trades --bootstrap-server localhost:9092
Go back to the third shell and generate bid-ask events, one line at a time. After each event is produced, the resulting trades-topic events are visible in the fourth shell described above, as well as in the Management Portal (click once on any component of the production, select the "Messages" tab in the right-hand pane, then click any message in the list to explore it in depth).
{"dateTime":"2022-06-07T13:16:22.000","ref":"OH77BBN3", "security":"SECA", "bid":50, "ask":0, "vol":300}
{"dateTime":"2022-06-07T13:17:32.000","ref":"OH77CBN3", "security":"SECB", "bid":0, "ask":50, "vol":400}
{"dateTime":"2022-06-07T13:18:42.000","ref":"OH77DBN3", "security":"SECC", "bid":0, "ask":55, "vol":200}
{"dateTime":"2022-06-07T13:19:52.000","ref":"OH77EBN3", "security":"SECD", "bid":70, "ask":0, "vol":250}
The Dockerfile starts IRIS, imports Installer.cls, and then runs the Installer.setup method, which creates the KAFKA namespace and imports the ObjectScript code from the /src and /testdata folders into it.
A Docker host volume bind makes this available within the container at /kafka/kafka_2.13-3.0.1.
Contains the InterSystems IRIS production settings, which bring up EnsLib.Kafka.Service and EnsLib.Kafka.Operation and specify the Kafka broker and topic parameters. It also contains the parameters used by the business process.
All the business logic is here
Creates a Kafka client using native API calls and runs tests against it to verify that it functions correctly.
$ docker-compose exec iris iris session iris
USER> zn "KAFKA"
KAFKA> do ##class(Kafka.TestKafkaMessagingClient).KafkaClient()
