An extension to the epics2kafka Kafka Connector that adds a Transform plugin to serialize messages in the format required by JAWS.
The following transformation is performed:
Value: epics-monitor-event-value -> AlarmActivationUnion.avsc
Note: epics2kafka must be configured to use the optional outkey field to ensure the alarm name is used as the key and not the channel name, which is the default. The registrations2epics app handles this.
- Grab project
git clone https://github.com/JeffersonLab/jaws-epics2kafka
cd jaws-epics2kafka
- Launch Compose
docker compose up
- Monitor the alarm-activations topic
docker exec -it cli list_activations --monitor
- Trip an alarm
docker exec softioc caput channel1 1
- Request an invalid channel to verify an error is provided
docker exec epics2kafka /scripts/set-monitored.sh -t alarm-activations -c invalid_channel -m va
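- Clear the alarm and shut down the demo when done (assuming, as in the trip example above, that writing 0 to channel1 returns it to a non-alarming state)
docker exec softioc caput channel1 0
docker compose down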
Copy the jaws-epics2kafka.jar and its core direct dependencies into a subdirectory of the Kafka plugins directory. For example:
mkdir /opt/kafka/plugins/jaws-epics2kafka
cp jaws-epics2kafka*.jar /opt/kafka/plugins/jaws-epics2kafka
cp jaws-libj*.jar /opt/kafka/plugins/jaws-epics2kafka
cp kafka-common*.jar /opt/kafka/plugins/jaws-epics2kafka
Note: The epics2kafka*.jar should be in a separate plugins subdirectory from jaws-epics2kafka. Since both plugins share kafka-common*.jar, it's likely safest to remove that jar from each plugin subdirectory and move it to kafka/libs.
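For example, the shared jar could be relocated as follows (a sketch assuming a stock Kafka install under /opt/kafka; adjust paths to your layout), keeping in mind that the Connect worker only discovers plugins under directories listed in its plugin.path setting:
# move the copy from this plugin and drop the duplicate from epics2kafka (paths are assumptions)
mv /opt/kafka/plugins/jaws-epics2kafka/kafka-common*.jar /opt/kafka/libs/
rm -f /opt/kafka/plugins/epics2kafka/kafka-common*.jar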
You'll also need to ensure the plugin has access to its platform dependencies: Confluent Kafka. Many are already in the Kafka libs directory, but the Avro and Confluent Avro/Schema Registry related dependencies must be copied into Kafka libs (if you're not already using a Confluent distribution of Kafka). The easiest way may be to download the Confluent Community Edition and cherry-pick the few jars needed out of it. Otherwise, download each jar individually from Maven Central or the Confluent Maven repository:
curl -O https://repo1.maven.org/maven2/org/apache/avro/avro/1.11.2/avro-1.11.2.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-schema-registry-client/7.4.0/kafka-schema-registry-client-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-schema-serializer/7.4.0/kafka-schema-serializer-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-schema-converter/7.4.0/kafka-schema-converter-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-avro-serializer/7.4.0/kafka-avro-serializer-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-converter/7.4.0/kafka-connect-avro-converter-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-data/7.4.0/kafka-connect-avro-data-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/common-utils/7.4.0/common-utils-7.4.0.jar
curl -O https://packages.confluent.io/maven/io/confluent/common-config/7.4.0/common-config-7.4.0.jar
curl -O https://repo1.maven.org/maven2/com/google/guava/guava/30.1.1-jre/guava-30.1.1-jre.jar
curl -O https://repo1.maven.org/maven2/com/google/guava/failureaccess/1.0.1/failureaccess-1.0.1.jar
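Once downloaded, the jars can then be copied into the Kafka libs directory, for example (assuming Kafka is installed under /opt/kafka):
# run from the directory containing the downloaded jars
cp *.jar /opt/kafka/libs/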
The Connect configuration (JSON):
"transforms": "alarmsValue",
"transforms.alarmsValue.type": "org.jlab.jaws.EpicsToAlarm$Value
Set the environment variable USE_NO_ACTIVATION=false
(defaults to true) to replace NoActivation messages with tombstones (null) instead.
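For example, if the Connect worker is launched from a shell, the variable can simply be exported before startup (how you set it in practice depends on your deployment, e.g. the environment section of a compose service):
export USE_NO_ACTIVATION=false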
This project is built with Java 17 (compiled to Java 11 bytecode), and uses the Gradle 7 build tool to automatically download dependencies and build the project from source:
git clone https://github.com/JeffersonLab/jaws-epics2kafka
cd jaws-epics2kafka
gradlew installDist
Note: If you do not already have Gradle installed, it will be installed automatically by the wrapper script included in the source.
Note for JLab On-Site Users: Jefferson Lab has an intercepting proxy
See: Docker Development Quick Reference
In order to iterate rapidly when making changes it's often useful to run the app directly on the local workstation, perhaps leveraging an IDE. This app runs as a plugin to Kafka Connect, and depends on Kafka being installed and having the epics2kafka Connector (plugin) installed as well. It's therefore easier to get started (though slower to iterate) by simply relying on the Docker container build. In this scenario run the container build with:
docker compose -f build.yaml --progress=plain build --no-cache epics2kafka
Then run with:
docker compose -f build.yaml up
Re-deploy after making code changes with:
docker compose -f build.yaml down
docker compose -f build.yaml --progress=plain build --no-cache epics2kafka
docker compose -f build.yaml up
Note: For faster iteration it is possible to use deps.yaml for the service dependencies and then install and configure a local instance of Kafka for hosting the Connect app. Scripted deployment (for example, gradlew run) would be nice for this scenario, but isn't currently provided.
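A rough sketch of that faster loop, assuming deps.yaml provides the service dependencies while a locally installed Kafka (here under /opt/kafka, with a suitably configured worker properties file and a plugin.path covering both epics2kafka and this plugin) hosts Connect; the build output and Kafka paths below are assumptions:
docker compose -f deps.yaml up -d
./gradlew installDist
# copy the freshly built plugin jars into the local worker's plugin directory
cp build/install/jaws-epics2kafka/lib/*.jar /opt/kafka/plugins/jaws-epics2kafka/
/opt/kafka/bin/connect-distributed.sh /opt/kafka/config/connect-distributed.properties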
- Bump the version number in the VERSION file and commit and push to GitHub (using Semantic Versioning).
- The CD GitHub Action should run automatically invoking:
- The Create release GitHub Action to tag the source and create release notes summarizing any pull requests. Edit the release notes to add any missing details. A zip file artifact is attached to the release.
- The Publish docker image GitHub Action to create a new demo Docker image.