Remove deprecated java-modules kafka, http and elasticsearch #197

Merged 1 commit on May 29, 2025
5 changes: 0 additions & 5 deletions _data/navigation.yml
@@ -379,11 +379,6 @@ admin-guide-nav:
url: /admin-guide/070_Destinations/081_http/002_Python_http_header_plugin
- title: "The Azure auth header plugin"
url: /admin-guide/070_Destinations/081_http/003_Azure_auth_header_plugin
- title: "http-java"
url: /admin-guide/070_Destinations/085_http-java/README
subnav:
- title: "HTTP Java destination options"
url: /admin-guide/070_Destinations/085_http-java/000_http_java_options
- title: "kafka"
url: /admin-guide/070_Destinations/100_Kafka-c/README
subnav:
@@ -33,8 +33,8 @@ and other platforms, see {{ site.product.name }} installation packages.
- The {{ site.product.short_name }} application now uses PCRE-type regular
expressions by default. It requires the libpcre library package.

- If you want to use the Java-based modules of {{ site.product.short_name }} (for
example, the Elasticsearch, HDFS, or Kafka destinations), you
- If you want to use a Java-based module of {{ site.product.short_name }} (for
example, the HDFS destination), you
must compile {{ site.product.short_name }} with Java support.

- Download and install the Java Runtime Environment (JRE), 1.7
@@ -59,8 +59,7 @@ and other platforms, see {{ site.product.name }} installation packages.
3. If you want to post log messages as HTTP requests using the http()
destination, install the development files of the *libcurl* library.
This library is not needed if you use the \--disable-http compile
option. Alternatively, you can use a Java-based implementation of
the HTTP destination.
option.

4. If you want to use the spoof-source function of {{ site.product.short_name }}, install
the development files of the libnet library.
@@ -54,7 +54,7 @@ compiling options.

- *\--enable-ipv6* Enable IPv6 support.

- *\--enable-java* Enable support for Java-based modules. For other
- *\--enable-java* Enable support for the Java-based module. For other
requirements, see the description of the Java-based module (for
example,
[[HDFS prerequisites|adm-dest-hdfs-pre]]) that you want to use.
@@ -134,7 +134,7 @@ compiling options.

- *\--with-libcurl* Specifies the path to the libcurl library. For
details on using this destination, see
[[http: Posting messages over HTTP without Java]].
[[http: Posting messages over HTTP]].

- *\--with-libhiredis* Specifies the path to the libhiredis library
(0.11 or newer). For details on using this destination, see
@@ -5,8 +5,8 @@ description: >-
To send messages from {{ site.product.short_name }} to HDFS, complete the following steps.
---

1. If you want to use the Java-based modules of {{ site.product.short_name }} (for
example, the Elasticsearch, HDFS, or Kafka destinations), you must
1. If you want to use the Java-based module of {{ site.product.short_name }} (for
example, the HDFS destination), you must
compile {{ site.product.short_name }} with Java support.

- Download and install the Java Runtime Environment (JRE), 1.7 (or
2 changes: 1 addition & 1 deletion doc/_admin-guide/070_Destinations/081_http/README.md
@@ -4,7 +4,7 @@ short_title: http
id: adm-dest-http
description: >-
Version 3.8 of {{ site.product.short_name }} can directly post log messages to web
services using the HTTP protocol, without having to use Java.
services using the HTTP protocol.
---
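
For orientation, a minimal http() destination might look like the sketch below; the URL and the message template are placeholders, not defaults shipped with the product:

```config
destination d_http {
  http(
    url("http://127.0.0.1:8080/events")    # placeholder endpoint
    method("POST")
    body("${ISODATE} ${HOST} ${MESSAGE}")  # example message template
  );
};
```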

The current implementation has the following limitations:

This file was deleted.

55 changes: 0 additions & 55 deletions doc/_admin-guide/070_Destinations/085_http-java/README.md

This file was deleted.

@@ -30,8 +30,6 @@ description: >-
**Declaration**

```config
@define kafka-implementation kafka-c

kafka(
bootstrap-servers("1.2.3.4:9092,192.168.0.2:9092")
topic("{MYTOPIC}")
@@ -15,9 +15,7 @@ description: >-
- The kafka-bootstrap-servers() option has been renamed
bootstrap-servers().

- The properties-file() is a Java properties file with options that
are similar to, but not identical with, the options in the old, Java
implementation's properties-file(). For more information, click here. TODO
- The properties-file() option was replaced with the config() option of kafka-c, which is similar to, but not identical with, properties-file() (see the sketch after this list).

- The sync-send() option has been deprecated. Remove it from the
configuration file.
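
As a rough illustration of the renamed options above (a sketch with placeholder values, not a drop-in configuration), a former Java-based kafka() block maps onto the C implementation along these lines:

```config
destination d_kafka {
  kafka(
    bootstrap-servers("127.0.0.1:9092")  # was kafka-bootstrap-servers()
    topic("example-topic")               # placeholder topic name
    config(
      "compression.type" => "snappy"     # was set via properties-file()
    )
    # sync-send() is deprecated and is simply omitted
  );
};
```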
@@ -10,14 +10,7 @@ directly publish log messages to the Apache Kafka message bus, where subscribers

## Required options

The following options are required: bootstrap-servers(), topic(). Note
that to use the C implementation of the kafka() destination, you must
add the following lines to the beginning of your {{ site.product.short_name }}
configuration:

```config
@define kafka-implementation kafka-c
```
The following options are required: bootstrap-servers(), topic().
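
For reference, a minimal sketch with only the required options set and attached to a log path; the broker address, topic name, and source name are placeholders:

```config
destination d_kafka {
  kafka(
    bootstrap-servers("127.0.0.1:9092")  # placeholder broker address
    topic("example-topic")               # placeholder topic name
  );
};

log {
  source(s_local);        # assumed, previously defined source
  destination(d_kafka);
};
```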

{% include doc/admin-guide/notes/kafka-c.md %}

@@ -80,8 +73,7 @@ The programming language accepts this option for better compatibility.
| Type:| |
|Default:| |

*Description:* You can use this option to expand or override the options
of the properties-file().
*Description:* You can use this option to set the properties of the Kafka producer.
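
For example (a sketch; the property name and value are illustrative, not defaults), a librdkafka producer property can be passed like this:

```config
kafka(
  bootstrap-servers("127.0.0.1:9092")
  topic("example-topic")
  config(
    "queue.buffering.max.ms" => "1000"   # illustrative librdkafka property
  )
);
```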

{% include doc/admin-guide/notes/kafka-c.md %}

@@ -154,31 +146,6 @@ client about the amount of messages sent since the last poll-timeout ().
In case of multithreading, the first {{ site.product.short_name }} worker is responsible for
poll-timeout().

## properties-file()

| Type:| string (absolute path)|
|Default:| N/A|

*Description:* The absolute path and filename of the Kafka properties
file to load. For example,
properties-file(\"/opt/syslog-ng/etc/kafka_dest.properties\"). The
{{ site.product.short_name }} application reads this file and passes the properties to
the Kafka Producer.

The {{ site.product.short_name }} kafka destination supports all properties of the
official Kafka producer. For details, see the librdkafka documentation.

The bootstrap-servers option is translated to the bootstrap.servers
property.

For example, the following properties file defines the acknowledgment
method and compression:

```config
acks=all
compression.type=snappy.
```

{% include doc/admin-guide/notes/kafka-c.md %}

{% include doc/admin-guide/options/retries.md %}
@@ -109,9 +109,6 @@ An example output:

>center.received.stats.processed
>center.queued.stats.processed
>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.>dropped
>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.>processed
>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.>queued
>destination.d_elastic.stats.processed
>source.s_tcp.stats.processed
>source.severity.7.stats.processed
@@ -169,9 +166,6 @@ the query, and their values.
For example, the destination query lists the configured destinations,
and the metrics related to each destination. An example output:

>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.dropped=0
>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.processed=0
>destination.java.d_elastic#0.java_dst(ElasticSearch,elasticsearch-syslog-ng-test,t7cde889529c034aea9ec_micek).stats.queued=0
>destination.d_elastic.stats.processed=0

The syslog-ng-ctl query get command has the following options:
@@ -245,7 +245,7 @@ listed below.
|Name |Description
|---|---
|amqp()| Publishes messages using the AMQP (Advanced Message Queuing Protocol).
|elasticsearch2| Sends messages to an Elasticsearch server. The elasticsearch2 driver supports Elasticsearch version 2 and newer.
|elasticsearch-http| Sends messages to an Elasticsearch server.
|file()| Writes messages to the specified file.
|graphite()| Sends metrics to a Graphite server to store numeric time-series data.
|graylog2()| Sends syslog messages to Graylog.