2 changes: 2 additions & 0 deletions logging/log_collection_forwarding/log-forwarding.adoc
@@ -34,6 +34,8 @@ include::modules/cluster-logging-collector-log-forward-fluentd.adoc[leveloffset=

include::modules/cluster-logging-collector-log-forward-syslog.adoc[leveloffset=+1]

include::modules/cluster-logging-collector-log-forward-kafka.adoc[leveloffset=+1]

include::modules/cluster-logging-collector-log-forward-cloudwatch.adoc[leveloffset=+1]

=== Forwarding logs to Amazon CloudWatch from STS enabled clusters
20 changes: 13 additions & 7 deletions modules/cluster-logging-collector-log-forward-kafka.adoc
@@ -1,7 +1,13 @@
// Module included in the following assemblies:
//
// logging/log_collection_forwarding/log-forwarding.adoc

:_content-type: PROCEDURE

[id="cluster-logging-collector-log-forward-kafka_{context}"]
= Forwarding logs to a Kafka broker

You can forward logs to an external Kafka broker in addition to, or instead of, the default Elasticsearch log store.
You can forward logs to an external Kafka broker in addition to, or instead of, the default log store.

To configure log forwarding to an external Kafka instance, you must create a `ClusterLogForwarder` custom resource (CR) with an output to that instance, and a pipeline that uses the output. You can include a specific Kafka topic in the output or use the default. The Kafka output can use a TCP (insecure) or TLS (secure TCP) connection.
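
A minimal sketch of such a CR is shown below for orientation; the full annotated example in the module is collapsed in this diff view. The API version, metadata names, broker URL, and topic name here are placeholder assumptions, not values taken from the collapsed example:

[source,yaml]
----
apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance            # placeholder name
  namespace: openshift-logging
spec:
  outputs:
  - name: app-logs
    type: kafka
    # Use tls:// for a secure connection or tcp:// for an insecure one;
    # the topic can be given as the URL path.
    url: tls://kafka.example.com:9093/app-topic
  pipelines:
  - name: app-topic
    inputRefs:
    - application
    outputRefs:
    - app-logs
----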

@@ -75,11 +81,11 @@ spec:
** Optional: String. One or more labels to add to the logs.
<14> Optional: Specify `default` to forward logs to the internal Elasticsearch instance.

. Optional: To forward a single output to multiple Kafka brokers, specify an array of Kafka brokers as shown in this example:
. Optional: To forward a single output to multiple Kafka brokers, specify an array of Kafka brokers as shown in the following example:
+
[source,yaml]
----
...
# ...
spec:
outputs:
- name: app-logs
@@ -91,15 +97,15 @@ spec:
- tls://kafka-broker1.example.com:9093/
- tls://kafka-broker2.example.com:9093/
topic: app-topic <3>
...
# ...
----
<1> Specify a `kafka` key that has `brokers` and `topic` keys.
<2> Use the `brokers` key to specify an array of one or more brokers.
<3> Use the `topic` key to specify the target topic that will receive the logs.
<3> Use the `topic` key to specify the target topic that receives the logs.

. Create the CR object:
. Apply the `ClusterLogForwarder` CR by running the following command:
+
[source,terminal]
----
$ oc create -f <file-name>.yaml
$ oc apply -f <filename>.yaml
----