
All Argo Workflows logs are marked as error in stackdriver (GCP) #5135

Closed
klaucos opened this issue Feb 18, 2021 · 5 comments

klaucos commented Feb 18, 2021

Summary

Similar problem to the closed, unsolved issue #4471.
On Google Cloud, none of the logs generated by argo-server and workflow-controller can be parsed properly, so they all show up as ERROR logs in Google Stackdriver.
One reason might be that all the logs are written to stderr, so Stackdriver treats them as errors.
Stackdriver also reads the logging level from a severity keyword, not from the level=info field that Argo emits.
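For illustration (this is my reading of GKE's structured-logging behaviour, not an existing Argo option): an unstructured line written to stderr is given ERROR severity by default, whereas a JSON line carrying a severity field would be parsed with the correct level, e.g.:

{"severity":"INFO","message":"Enforcing history limit for 'workflow'","time":"2021-02-18T13:05:00.629Z"}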

Is there a way to define a custom logging format for argo-server and workflow-controller, or to change the default logging output to stdout?

Another problem is the logging frequency: the 'Enforcing history limit' and lease messages are logged far too often and are polluting our logs.
Can I somehow configure argo-server and workflow-controller to log only warnings or higher?
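Something like the following is what I have in mind, assuming argo-server and workflow-controller expose a --loglevel flag (I have not verified this for the versions above), appended to the container args of the installed Deployments:

kubectl -n argo patch deployment workflow-controller --type=json \
  -p='[{"op": "add", "path": "/spec/template/spec/containers/0/args/-", "value": "--loglevel=warn"}]'
kubectl -n argo patch deployment argo-server --type=json \
  -p='[{"op": "add", "path": "/spec/template/spec/containers/0/args/-", "value": "--loglevel=warn"}]'

Even if that works, it only raises the log level; it would not change the stderr/severity behaviour described above.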

Diagnostics

What Kubernetes provider are you using?
Google Cloud Platform, Kubernetes 1.17.15

What version of Argo Workflows are you running?
Tested with Argo Workflows installed from release v2.12.9 and also with v3.0.0-rc2.

It can be reproduced by installing Argo on GCP with:
kubectl apply -n argo -f https://raw.githubusercontent.com/argoproj/argo/anyversion/manifests/install.yaml

The following logs keep polluting Stackdriver every 5s or so.

time="2021-02-18T13:05:00.629Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:00.630Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:00.630Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:00.632Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:04.345Z" level=info msg="Get leases 200"
time="2021-02-18T13:05:04.361Z" level=info msg="Update leases 200"
time="2021-02-18T13:05:09.373Z" level=info msg="Get leases 200"
time="2021-02-18T13:05:09.379Z" level=info msg="Update leases 200"
time="2021-02-18T13:05:10.641Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:10.642Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:10.643Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:10.643Z" level=info msg="Enforcing history limit for 'workflow'" namespace=default workflow=workflow
time="2021-02-18T13:05:14.384Z" level=info msg="Get leases 200"
time="2021-02-18T13:05:14.392Z" level=info msg="Update leases 200"

Message from the maintainers:

Impacted by this bug? Give it a 👍. We prioritise the issues with the most 👍.


stale bot commented Apr 21, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the wontfix label Apr 21, 2021
stale bot closed this as completed Apr 30, 2021

afoltzm commented May 10, 2021

@klaucos I'm encountering this issue as well, but felt it made more sense to raise it as a discussion instead of re-opening these issues. If you'd like to add your thoughts, you can do so here: #5871


seryl commented Jan 17, 2022

Please consider reopening this; these types of logs should only be emitted at debug level.

Having them show up as errors is also definitely a surprise; please allow redirecting to stdout via some sort of config.

@syedashrafulla

I would also like this to be re-opened, as the logs are all sent to the cloud logging tool of choice, so they add extra cost to the deployment. I think the cost grows linearly with usage, too.

@tooptoop4 (Contributor)

@klaucos did you solve the 'frequency of logging Enforcing history limit' problem?
