
Unable to achieve high throughput using otel kafka exporter #35289

Closed
Praveen2099 opened this issue Sep 19, 2024 · 4 comments

Comments

@Praveen2099

Praveen2099 commented Sep 19, 2024

Component(s)

No response

What happened?

Description

I am using the following configuration:

processors:
  batch:
    send_batch_size: 800
    send_batch_max_size: 900
    timeout: 1s

exporters:
  kafka:
    brokers:
      - 'vip:nodePort'
    producer:
      max_message_bytes: 9000000
      compression: snappy
      required_acks: 1
      #flush_max_messages: 1000
    topic: testing.otlp.topic.new1
    sending_queue:
      enabled: true
      num_consumers: 10
      queue_size: 1000
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 120s
    #topic_from_attribute: k8s.namespace.name
    encoding: otlp_json  # raw, otlp_proto
    protocol_version: 2.0.0

Kafka topic

Partition:6
Replica:3

Expected Result

100000 record per minute

Actual Result

1500 record per minute

Can anyone help me with an OTel Collector configuration that achieves a higher throughput rate?
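Not an authoritative fix, but a sketch of the knobs that usually matter for throughput with this exporter, using the contrib kafka exporter's documented fields. The biggest single lever is usually the encoding: otlp_json is considerably more expensive to serialize than otlp_proto. The broker address, topic, and all tuned values below are placeholders/assumptions to adjust for your cluster:

```yaml
# Sketch only: field names follow the contrib kafka exporter docs;
# the specific values are assumptions to benchmark, not recommendations.
processors:
  batch:
    send_batch_size: 8192        # larger batches amortize per-produce overhead
    send_batch_max_size: 10000
    timeout: 1s

exporters:
  kafka:
    brokers:
      - 'vip:nodePort'           # placeholder from the original config
    topic: testing.otlp.topic.new1
    encoding: otlp_proto         # binary proto is much cheaper to encode than otlp_json
    protocol_version: 2.0.0
    producer:
      max_message_bytes: 9000000
      compression: snappy
      required_acks: 1
    sending_queue:
      enabled: true
      num_consumers: 50          # more concurrent senders draining the queue
      queue_size: 5000
```

Note also that with only 6 partitions, broker-side parallelism is capped; adding partitions may help once the collector itself is no longer the bottleneck.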

Collector version

v0.106.1

Environment information

Environment

OS: (e.g., "Ubuntu 20.04")
Compiler(if manually compiled): (e.g., "go 14.2")

OpenTelemetry Collector configuration

processors:
  batch:
    send_batch_size: 800
    send_batch_max_size: 900
    timeout: 1s

exporters:
  kafka:
    brokers:
      - 'vip:nodePort'
    producer:
      max_message_bytes: 9000000
      compression: snappy
      required_acks: 1
      #flush_max_messages: 1000
    topic: testing.otlp.topic.new1
    sending_queue:
      enabled: true
      num_consumers: 10
      queue_size: 1000
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 120s
    #topic_from_attribute: k8s.namespace.name
    encoding: otlp_json  # raw, otlp_proto
    protocol_version: 2.0.0

Log output

No response

Additional context

No response

@Praveen2099 Praveen2099 added bug Something isn't working needs triage New item requiring triage labels Sep 19, 2024
Pinging code owners for exporter/kafka: @pavolloffay @MovieStoreGuy. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@atoulme
Contributor

atoulme commented Nov 2, 2024

How much memory and CPU did you allocate to the process? Please try our troubleshooting guide, and in particular capture profiles with the pprof extension to find out what is slowing things down.
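For reference, a minimal sketch of enabling the pprof extension in the collector config (the endpoint shown is the extension's default; adjust as needed):

```yaml
# Minimal pprof extension setup; 1777 is the extension's default port.
extensions:
  pprof:
    endpoint: localhost:1777

service:
  extensions: [pprof]
```

With the collector running, a CPU profile can then be captured with `go tool pprof http://localhost:1777/debug/pprof/profile`.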

@atoulme atoulme removed the needs triage New item requiring triage label Nov 2, 2024
Contributor

github-actions bot commented Jan 2, 2025

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Jan 2, 2025
Contributor

github-actions bot commented Mar 3, 2025

This issue has been closed as inactive because it has been stale for 120 days with no activity.

@github-actions github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) Mar 3, 2025