Duplicate keys from JSON encoder #81

Closed
campbellr opened this issue Jun 16, 2016 · 6 comments

@campbellr

I noticed that if I do something like this:

logger := zap.NewJSON()
logger.Info("dupe fields", zap.String("foo", "bar"), zap.String("foo", "baz"))

I end up getting a JSON record like this:

{"msg":"dupe fields","level":"info","ts":1466086541235126983,"fields":{"foo":"bar","foo":"baz"}}

The example above is admittedly contrived, but the situation becomes realistic in an application that passes loggers around, with various layers adding context via zap.Logger.With.
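
For example, here's a contrived sketch of how two layers could collide (the component names are made up):

base := logger.With(zap.String("component", "server"))
child := base.With(zap.String("component", "handler"))
// Both "component" fields end up in the encoded output.
child.Info("request handled")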

I can't find anything that says duplicate fields violate the spec, and they don't seem to break json.Unmarshal or Python's json.loads (both end up taking the last "foo":"baz"), but it still might be worth handling, or at least documenting.
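
As a minimal sketch of that last-value-wins behavior with Go's encoding/json (the values here are arbitrary):

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	var m map[string]string
	// For duplicate keys, encoding/json keeps the last value it sees.
	_ = json.Unmarshal([]byte(`{"foo":"bar","foo":"baz"}`), &m)
	fmt.Println(m["foo"]) // prints "baz"
}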

@akshayjshah
Contributor

Yep, this is definitely something we should document. All the unmarshalers that I've been able to get my hands on do exactly what you've described and keep the last value.

@akshayjshah akshayjshah added this to the Release 1.0 milestone Jun 29, 2016
@prashantv
Collaborator

There's a good article about this on Stack Overflow:
http://stackoverflow.com/questions/21832701/does-json-syntax-allow-duplicate-keys-in-an-object

The tl;dr: the JSON RFC says object names "SHOULD be unique", and per RFC 2119 that means it's strongly recommended but not a "MUST". However, implementations are still free to throw errors on duplicate fields.

We should definitely document that the user is responsible for avoiding duplicate fields.

Throwing out an idea regarding passing loggers around: currently a logger always has a reference to the top-level fields. What if you passed a logger that only had a reference to a nested object?

logger := New(...)
_ = newConnection(logger.Nested("conn"))
_ = newRequest(logger.Nested("request"))

The nested logger would not be able to add anything to the top-level fields, so you could pass the logger to other components (internal or external) without having to worry about clashes.
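
Roughly, the encoded output might then look something like this (just a sketch of the idea, with made-up field names, not a finalized format):

{"msg":"accepted","level":"info","ts":1466086541235126983,"fields":{"conn":{"remote_addr":"10.0.0.1:1234"}}}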

@akshayjshah
Contributor

Added a more explicit comment about this in #120, but leaving this issue open while we consider @prashantv's suggestion.

@akshayjshah akshayjshah changed the title jsonEncoder and duplicate keys Duplicate keys from JSON encoder Aug 2, 2016
@prashantv
Collaborator

Created a separate issue for the nested logger: #132

Closing this issue since we don't have plans to enforce unique field names in the logger. Nested logging can make this easier, but the user is still responsible for ensuring there are no field clashes.

@campbellr
Author

Thanks. The documentation helps, and nested loggers sound like a good compromise.

@taylorfutral

taylorfutral commented Feb 9, 2023

For anyone who was confused by this like I was: the parent logger is unaffected by a child logger created with .With(). So you can create a child logger, write a log entry (e.g. .Info(), .Error()), and then start over with a new child logger from the parent to prevent duplicates. Example below:
https://go.dev/play/p/DLwIdXLgmco
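
Something along these lines (a rough sketch; the field names are arbitrary):

package main

import "go.uber.org/zap"

func main() {
	logger, _ := zap.NewProduction()
	defer logger.Sync()

	// The child carries its own copy of the context; the parent is untouched.
	reqLogger := logger.With(zap.String("request_id", "abc"))
	reqLogger.Info("handling request")

	// Start again from the parent for the next unit of work, so
	// "request_id" isn't added twice to the same logger.
	nextLogger := logger.With(zap.String("request_id", "def"))
	nextLogger.Info("handling another request")
}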
