Can't get metrics despite following instructions #117

Open
khachou opened this issue Jun 17, 2022 · 5 comments

Comments

khachou commented Jun 17, 2022

I followed all the steps but can't get the metrics. Any help @danihodovic?

@danihodovic (Owner)

🤷

hammady commented Mar 9, 2023

No, seriously, a troubleshooting guide would be helpful. I'm using the latest Helm chart.
For me, I made sure the Celery workers send task events:

celery -A <app> control enable_events
Prod Environment
Debug: False
Using prod DB.
->  celery@app-8a0f19073c2f58c1-74b98d5d65-tzhk5: OK
        task events already enabled
->  celery@app-8a0f19073c2f58c1-74b98d5d65-lqbmw: OK
        task events already enabled
->  celery@app-8a0f19073c2f58c1-74b98d5d65-97kw7: OK
        task events already enabled
->  celery@app-8a0f19073c2f58c1-74b98d5d65-9hj9d: OK
        task events already enabled
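
Note that control enable_events only takes effect on workers that are already running and does not persist across worker restarts. The same thing can be set permanently in the app configuration; a minimal sketch, assuming the Celery application object is called app and the app name and broker URL are illustrative:

from celery import Celery

app = Celery('myapp', broker='redis://localhost:6379/0')  # hypothetical app name and broker URL
app.conf.worker_send_task_events = True  # workers publish task events to the broker
app.conf.task_send_sent_event = True     # clients also publish task-sent events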

The broker is Redis, the exporter can connect to it, and tasks are being run. When running the command below inside the exporter container:

curl localhost:9808/metrics

All metrics are empty (not even 0), except the queue size, which returns 0:

# HELP celery_task_sent_total Sent when a task message is published.
# TYPE celery_task_sent_total counter
# HELP celery_task_received_total Sent when the worker receives a task.
# TYPE celery_task_received_total counter
# HELP celery_task_started_total Sent just before the worker executes the task.
# TYPE celery_task_started_total counter
# HELP celery_task_succeeded_total Sent if the task executed successfully.
# TYPE celery_task_succeeded_total counter
# HELP celery_task_failed_total Sent if the execution of the task failed.
# TYPE celery_task_failed_total counter
# HELP celery_task_rejected_total The task was rejected by the worker, possibly to be re-queued or moved to a dead letter queue.
# TYPE celery_task_rejected_total counter
# HELP celery_task_revoked_total Sent if the task has been revoked.
# TYPE celery_task_revoked_total counter
# HELP celery_task_retried_total Sent if the task failed, but will be retried in the future.
# TYPE celery_task_retried_total counter
# HELP celery_worker_up Indicates if a worker has recently sent a heartbeat.
# TYPE celery_worker_up gauge
# HELP celery_worker_tasks_active The number of tasks the worker is currently processing
# TYPE celery_worker_tasks_active gauge
# HELP celery_task_runtime Histogram of task runtime measurements.
# TYPE celery_task_runtime histogram
# HELP celery_queue_length The number of message in broker queue.
# TYPE celery_queue_length gauge
celery_queue_length{queue_name="celery"} 0.0
# HELP celery_active_consumer_count The number of active consumer in broker queue.
# TYPE celery_active_consumer_count gauge

Container logs:

2023-03-09 04:27:49.285 | INFO     | src.exporter:run:208 - Setting celery accept_content ['json', 'pickle']
2023-03-09 04:27:49.489 | INFO     | src.http_server:start_http_server:66 - Started celery-exporter at port='9808'

It is unclear to me which settings should be set on the exporter other than the ones below:

- env:
  - name: CE_BROKER_URL
    value: redis://*****:6379/0
  - name: CE_ACCEPT_CONTENT
    value: json,pickle

hammady commented Mar 10, 2023

@danihodovic It was caused by the serializer for events (tasks & results) being pickle. Once changed to json, the exporter started reporting right away. Is this a bug/limitation of the exporter, or just missing documentation?
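
For reference, the change boils down to something like the following (a sketch, assuming the Celery app object is named app; the exact settings depend on the project):

app.conf.task_serializer = 'json'             # tasks serialized as json instead of pickle
app.conf.result_serializer = 'json'           # results serialized as json
app.conf.event_serializer = 'json'            # events are what the exporter consumes
app.conf.accept_content = ['json', 'pickle']  # matches CE_ACCEPT_CONTENT above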

abiwill commented Jun 16, 2023

@hammady Can you please share the changes? I am facing a similar issue with the latest release of the exporter.

@mihirego

Maybe late to comment, but this worked for me:

app.conf.worker_send_task_events = True  # workers publish task events to the broker
app.conf.task_send_sent_event = True     # also publish task-sent events from the client
app.conf.task_serializer = 'json'        # serialize tasks as json rather than pickle
app.conf.event_serializer = 'json'       # events are what the exporter consumes; json worked here where pickle did not
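
With these settings in place and the workers restarted, the exporter should start reporting task metrics on the next task run.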
