
Cleaner Scalyr logging support in latest scalyr-agent release #620

Closed
cyburgee opened this issue Oct 2, 2017 · 6 comments


cyburgee commented Oct 2, 2017

The latest release of the scalyr agent (https://github.com/scalyr/scalyr-agent-2/tree/v2.0.29) has better support for the JSON logs that come out of Docker.

In their words:

Support to parse log lines written as a JSON object to extract line content before sending to Scalyr. You may turn this on by setting parse_lines_as_json to true in your log configuration stanza for a particular file. This is useful when uploading raw Docker logs.

It would be awesome if the images used by the logging-agent were updated to use that version. Also it would be nice if I could access the source for registry.opensource.zalan.do/eagleeye/scalyr-agent so I could help out myself.
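
For reference, here is a minimal sketch of what such a stanza might look like in the agent's /etc/scalyr-agent-2/agent.json, going only by the release note quoted above; the API key, log path, and parser name are placeholders, not whatever the eagleeye image actually uses:

```
{
  // Placeholder; use a real "Write Logs" API key.
  "api_key": "YOUR_SCALYR_WRITE_LOGS_KEY",
  "logs": [
    {
      // Raw output of Docker's json-file logging driver.
      "path": "/var/lib/docker/containers/*/*.log",
      "attributes": { "parser": "dockerLog" },
      // New in v2.0.29: extract the line content from each JSON object
      // before uploading, instead of shipping the whole JSON blob.
      "parse_lines_as_json": true
    }
  ]
}
```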


szuecs commented Oct 3, 2017

Thanks for your suggestion; we will check the new version.

@mohabusama, @apfeiffer85, @fmueller can you have a look?
Thanks!


cyburgee commented Oct 3, 2017

In case losing data is a cause for concern, here is what one of the support reps told me in an email:

By default, that will extract out the "log" field from the JSON objects in the log file and upload that as the log line content. It also adds in whatever attributes are included in the JSON object, such as the stream field which indicates if the message was written to stdout or stderr.

Anyway, there's no need to add parse_lines_as_json to the config by default, but I thought you might want that bit of information if you're debating whether to do it.
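
To make that concrete, here is an invented illustration of the behavior the support rep describes (the log line itself is made up):

```
# One raw line as Docker's json-file driver writes it to disk:
{"log":"GET /healthz 200\n","stream":"stdout","time":"2017-10-03T12:34:56.789Z"}

# With parse_lines_as_json set to true, the agent uploads the "log"
# value as the line content:
GET /healthz 200

# ...and keeps the remaining fields (stream=stdout, time=...) as
# attributes on the uploaded event, so nothing is lost.
```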

mikkeloscar (Contributor) commented

In case losing data is a cause for concern

What do you mean by this? Just that we would lose the surrounding JSON structure? This would not be a problem IMO, as it just makes it simpler to create custom parsers when you don't have to worry about parsing JSON first.

Thanks for letting us know about this!


cyburgee commented Oct 3, 2017

I just meant that the description in the release notes didn't indicate that you would still collect all the fields from the JSON log, like "stream".

mohabusama (Contributor) commented

Thanks @cyburgee

Already created an issue here: zalando-incubator/kubernetes-log-watcher#49

@mikkeloscar @szuecs


szuecs commented Oct 13, 2018

Closing because this repository does not own this component. Please check the referenced issue: zalando-incubator/kubernetes-log-watcher#49

szuecs closed this as completed Oct 13, 2018