A demo for Red Hat OpenShift Streams for Apache Kafka, and Red Hat OpenShift Serverless. Provides a "cheat detection" service for the Shipwars game used at Red Hat Summit 2021.
This demo uses OpenShift Serverless (based on the upstream Knative project) to process events generated by a game server. These events are stored in a Kafka Topic. A Kafka Source, provided by OpenShift Serverless, is used to fetch events from the Kafka Topic and route them to a cheat detection function. The cheat detection function can scale to zero when no events are available, thanks to OpenShift Serverless.
The cheat detection function will emit a new event if it suspects a player is cheating. This new event is routed to an email serverless function that uses SendGrid to notify game admins about the suspected cheater.
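The events flowing through the Broker use the CloudEvents format. As a rough sketch, an event of the type that triggers the email alert might look like the following. The attribute names come from the CloudEvents 1.0 spec and the audit.fail.bonus type is named later in this walkthrough, but the data payload fields shown here are assumptions, not taken from the actual service:

```shell
# Illustrative CloudEvent (JSON structured mode). The "data" fields are
# made up for this sketch; only the "type" value is taken from the demo.
SAMPLE_EVENT='{
  "specversion": "1.0",
  "type": "audit.fail.bonus",
  "source": "cheat-detection",
  "id": "event-0001",
  "data": { "player": "player-123", "reason": "bonus value out of range" }
}'
echo "$SAMPLE_EVENT"
```

The email alerting service only receives events whose type attribute matches the filter on its Trigger, which is configured later in this walkthrough.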
- A free Red Hat account, to access console.redhat.com
- Access to an OpenShift cluster, e.g. via the free OpenShift DevSandbox.
- OpenShift Operators (these are pre-installed on DevSandbox):
- RHOAS 0.9.5
- OpenShift Serverless 1.19
- RHOAS CLI
- OpenShift CLI.
- JQ CLI.
- SendGrid Account with a verified sender address and API Key.
Note that all of these steps can be automated or scripted. For the purposes of this demo, they are performed manually.
OpenShift Streams for Apache Kafka provides access to hosted and managed Apache Kafka clusters. Limited trial instances are available for 48 hours.
- Navigate to console.redhat.com/application-services/streams/kafkas in your browser.
- Log in if you already have an account, or create a free account to access the service.
- Click the Create Kafka instance button and complete the dialog to provision a Kafka instance.
The Kafka instance will take a few minutes to provision. You can continue following these instructions while it is provisioning.
The OpenShift DevSandbox is a free, time-limited, OpenShift environment.
- Visit the DevSandbox Getting Started page.
- Follow the prompts, and log in using the same account you used to create your Kafka instance.
After completing the login process you'll have access to an OpenShift environment.
- Select Add + from the side-menu in OpenShift DevSandbox.
- Scroll down to the Managed Services section.
- Choose the Red Hat OpenShift Application Services option. It will have an Unlock with token label.
- Click the link in the on-screen instructions to obtain a token from console.redhat.com/openshift/token/, paste it into the dialog, then submit the form. You'll be returned to the Managed Services screen.
- On the Managed Services screen, choose the Red Hat OpenShift Streams for Apache Kafka tile and click Next.
- When prompted to Select a Kafka Instance, choose the instance you created at https://console.redhat.com/application-services/streams/kafkas/.
Your OpenShift project can now access the details for your OpenShift Streams for Apache Kafka instance!
A Service Account was created when you linked your Kafka instance to the OpenShift project. The Service Account provides a username (Client ID) and password (Client Secret) used for SASL authentication against your managed Kafka.
Service Accounts have limited access to Kafka instances by default. Update
the assigned permissions to allow produce/consume operations on the shots
topic.
The Service Account ID and Secret are stored in your OpenShift project in a Secret named rh-cloud-services-service-account.
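Kubernetes stores Secret values base64-encoded, which is why the extraction command later in this walkthrough pipes the output through base64 --decode. The decode step can be illustrated locally with a made-up value (a real Client ID comes from oc get secret):

```shell
# Simulate what the cluster stores: a base64-encoded Client ID.
# "srvc-acct-12345" is a fabricated sample value, not a real credential.
SAMPLE_ENCODED=$(printf 'srvc-acct-12345' | base64)

# Decoding recovers the original value, exactly as the oc jsonpath
# extraction does against the real Secret.
SAMPLE_CLIENT_ID=$(printf '%s' "$SAMPLE_ENCODED" | base64 --decode)
echo "$SAMPLE_CLIENT_ID"
```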
Use the RHOAS CLI to update the assigned permissions.
# Login to your OpenShift cluster
oc login --token=<your-token> --server=<your-cluster-api-url>
# Login to RHOAS
rhoas login
# Select the Kafka cluster that's connected to your OpenShift environment
rhoas kafka use
# Obtain the service account ID from the OpenShift cluster
export CLIENT_ID=$(oc get secret rh-cloud-services-service-account -o jsonpath='{.data.client-id}' | base64 --decode)
# Provide consume permissions to this service account for applications
# in the "knative-consumer" consumer group
rhoas kafka acl grant-access --consumer \
--service-account $CLIENT_ID --topic-prefix shipwars --group knative-consumer
Now create a topic named shipwars-bonuses with 3 partitions:
rhoas kafka topic create --name shipwars-bonuses --partitions 3
A Broker is required to transport events within the OpenShift cluster.
The cheat detection service uses the broker URL to emit events that contain the
results of the auditing rules it applies. It emits CloudEvent format messages
to the Broker. Downstream services can register their interest in these events
using a Trigger, and process them. In this example, an email alerting service
will be subscribed to events of the type audit.fail.bonus.
Create the Broker by applying the broker.yml:
oc apply -f openshift/broker.yml
You can confirm the Broker was created and entered the READY state using the oc get brokers command.
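For reference, a Knative Broker manifest is minimal. The broker.yml in this repository likely resembles the following sketch (the name default is an assumption):

```yaml
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  name: default
```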
Deploy the Knative Serving functions. These process the events generated by the game server, as well as events emitted by other functions.
The source code for both of these Serverless Functions is included in this repository. Pre-built images are deployed to save time.
# The cheat detection service will HTTP POST events to this URL
export BROKER_URL=$(oc get brokers -o jsonpath='{.items[0].status.address.url}')
# If the cheat detection service detects a potential cheating player, a
# notification can be sent via email, using SendGrid, to an address of your choice
export SENDGRID_API_KEY='replace-with-your-free-api-key'
export EMAIL_FROM=audit-alerts@foobar.com
export EMAIL_TO=audit-department@foobar.com
# Deploy the serverless functions
oc process -f openshift/knative.service.cheats.yml \
-p BROKER_URL=$BROKER_URL | oc create -f -
oc process -f openshift/knative.service.alerts.yml \
-p SENDGRID_API_KEY=$SENDGRID_API_KEY \
-p EMAIL_FROM=$EMAIL_FROM \
-p EMAIL_TO=$EMAIL_TO | oc create -f -
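The templates above render Knative Services. A minimal sketch of what the processed cheat-detection manifest might look like is shown below; the image reference and broker URL are placeholders, not the actual values from this repository:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: cheat-detection
spec:
  template:
    metadata:
      annotations:
        # minScale "0" is the Knative default; shown here to highlight
        # that the function scales to zero when no events arrive
        autoscaling.knative.dev/minScale: "0"
    spec:
      containers:
        - image: <pre-built-cheat-detection-image>
          env:
            - name: BROKER_URL
              value: <broker-url>
```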
- Select Add + from the side-menu in OpenShift DevSandbox.
- Find the KafkaSource using search, or under the Event Sources section. Select it and choose Create.
- Using the Form view, set the following options for the KafkaSource:
  - Bootstrap Servers: this can be auto-completed to your linked managed Kafka bootstrap URL.
  - Topics: enter the name shots.
  - SASL: enable SASL and use the rh-cloud-services-service-account. Use the client-id for User, and the client-secret as the Password.
  - TLS: enable TLS. Leave the TLS options at the defaults.
  - Sink: choose the cheat-detection Knative Service.
- Click Create to deploy the KafkaSource.
Finally, apply a Trigger that will subscribe the email alerting service to any audit failure events.
oc apply -f openshift/audit.trigger.yml
This Trigger will cause events of type audit.fail.bonus in the Broker to be sent to the email alerting service.
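As a sketch, audit.trigger.yml likely resembles the following. The filter type comes from this walkthrough; the trigger, broker, and subscriber service names are assumptions:

```yaml
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: audit-fail-trigger
spec:
  broker: default
  filter:
    attributes:
      # Only deliver events whose CloudEvent "type" attribute matches
      type: audit.fail.bonus
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: email-alerts
```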
Instead of running the entire Shipwars game, use the bonus-producer included in this repository.
This requires Node.js 14 or later and a Service Account with producer permissions:
# Login to RHOAS
rhoas login
# Create a service account and store details in /tmp/producer-sa file
rhoas service-account create \
--short-description bonus-producer \
--output-file /tmp/producer-sa \
--file-format env
# Get the Client ID
export CLIENT_ID=$(cat /tmp/producer-sa | grep CLIENT_ID | awk -F '=' '{print $2}')
# Apply produce and consume permissions to the service account
rhoas kafka acl grant-access --producer --consumer \
--service-account $CLIENT_ID --topic shots --group all
# Provide credentials to the Node.js application's config dir
cat /tmp/producer-sa | grep CLIENT_ID | awk -F '=' '{print $2}' > bonus-producer/.bindings/kafka/user
cat /tmp/producer-sa | grep CLIENT_SECRET | awk -F '=' '{print $2}' > bonus-producer/.bindings/kafka/password
rhoas kafka describe | jq .bootstrap_server_host -r > bonus-producer/.bindings/kafka/bootstrapServers
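The env-format file written by rhoas contains KEY=value lines, which is what the grep and awk pipeline above extracts from. The technique can be sanity-checked locally with a fabricated file; the key names and values below are illustrative, not the exact keys rhoas writes:

```shell
# Fabricated stand-in for the env-format credentials file (fake values).
cat > /tmp/demo-sa <<'EOF'
RHOAS_SERVICE_ACCOUNT_CLIENT_ID=demo-client-id
RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET=demo-client-secret
EOF

# Same extraction technique as above: match the key, split on "=",
# keep the value. "CLIENT_ID" does not match the CLIENT_SECRET line.
DEMO_ID=$(grep CLIENT_ID /tmp/demo-sa | awk -F '=' '{print $2}')
DEMO_SECRET=$(grep CLIENT_SECRET /tmp/demo-sa | awk -F '=' '{print $2}')
echo "$DEMO_ID $DEMO_SECRET"
```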
Start the producer locally:
cd bonus-producer
npm install
npm run dev:with-bindings
You can now send payloads to Kafka for cheat detection by sending HTTP GET requests to http://localhost:8080/bonus.