
[receiver/k8s cluster receiver] Add support for K8s leader extension into k8s cluster receiver #38429

Draft
rakesh-garimella wants to merge 16 commits into main from k8s_leader_support_for_k8s_cluster_receiver
Conversation

rakesh-garimella (Contributor)

Description

  • Add support for the k8s leader elector extension in the k8s cluster receiver, so that multiple instances of the receiver can run concurrently.
  • Only the instance that holds the lease fetches metrics; the others stay idle until they acquire it.

Example config:

extensions:
  k8s_leader_elector:
    auth_type: kubeConfig
    lease_name: foo
    lease_namespace: default
receivers:
  k8s_cluster:
    k8s_leader_elector: k8s_leader_elector
    node_conditions_to_report: [Ready, MemoryPressure]
    allocatable_types_to_report: [cpu, memory]
exporters:
  debug:
    verbosity: detailed

service:
  extensions: [k8s_leader_elector]
  pipelines:
    metrics:
      receivers: [k8s_cluster]
      exporters: [debug]
  telemetry:
    logs:
      level: info

Link to tracking issue

Fixes

Testing

Documentation

@github-actions github-actions bot requested a review from povilasv March 6, 2025 15:00
@rakesh-garimella rakesh-garimella marked this pull request as draft March 6, 2025 15:00
@rakesh-garimella rakesh-garimella changed the title from "[receiver/k8s leader elector] add support for K8s leader extension into k8s cluster receiver" to "[receiver/k8s cluster receiver] add support for K8s leader extension into k8s cluster receiver" Mar 6, 2025
@rakesh-garimella rakesh-garimella force-pushed the k8s_leader_support_for_k8s_cluster_receiver branch from 0eacdf0 to bc5304c March 6, 2025 15:14