std.json doesn't parse duplicate keys #3992

Open
dlangBugzillaToGithub opened this issue Dec 20, 2021 · 6 comments

@dlangBugzillaToGithub

Răzvan Ștefănescu reported this on 2021-12-20T07:07:00Z

Transferred from https://issues.dlang.org/show_bug.cgi?id=22612

CC List

  • basile-z
  • Salih Dincer

Description

import std.json, std.stdio;
auto j = parseJSON(`{ "key": 1, "key" : 2 }`);
writeln(j); // outputs only {"key":2}

Of course, j["key"] contains 2; the value 1 is silently lost.

According to ECMA-404:

"The JSON syntax does not impose any restrictions on the strings used as names, does not require that name strings be unique, and does not assign any significance to the ordering of name/value pairs"
@dlangBugzillaToGithub

salihdb commented on 2021-12-20T09:14:40Z

Probably, but the 2nd value with the same name overwrites the 1st. In summary, data ends up with a single value for that name.

NodeJS has the same effect:

const jsonStr ='{"name":"Salih","age":42,"name":"SALIH"}';
const data = JSON.parse(jsonStr);

var assert = require('assert');
assert(data.name != "Salih");

@dlangBugzillaToGithub

b2.temp commented on 2021-12-20T10:48:42Z

The spec quoted is clearly only about syntax, not about the semantics of conflicting keys. Off-topic, but YAML is better in that respect.

@dlangBugzillaToGithub

rumbu commented on 2021-12-20T15:03:57Z

(In reply to Salih Dincer from comment #1)
> Probably, but the 2nd value with the same name overwrites the 1st.
> In summary, data ends up with a single value for that name.
> 
> NodeJS has the same effect:
> 
> const jsonStr ='{"name":"Salih","age":42,"name":"SALIH"}';
> const data = JSON.parse(jsonStr);
> 
> var assert = require('assert');
> assert(data.name != "Salih");

I don't expect anything else from JavaScript :)

I would never have posted this if I hadn't encountered such JSON in the wild. Initially I had the same reaction, but the client providing that JSON stream pointed me to the ECMA standard and I lost all my arguments. I know the corresponding RFC says that you SHOULD not have duplicate keys, but SHOULD is interpreted by some people as "CAN".

There are even npm packages for Node.js that can handle duplicate keys. The idea is that, as long as duplicate keys are allowed by the standard, at least std.json can state in its documentation that the last value wins (there are other approaches: first value wins, raise an error, silently transform duplicate keys into arrays).
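A minimal sketch of the alternative policies mentioned above (DupPolicy and mergePairs are hypothetical names, not part of std.json; the array-valued variant is omitted since it needs a different result type):

import std.typecons : Tuple, tuple;

enum DupPolicy { lastWins, firstWins, error }

// Hypothetical helper: fold (key, value) pairs into an object under a policy,
// the way a parser could when it encounters duplicate names.
long[string] mergePairs(Tuple!(string, long)[] pairs, DupPolicy policy)
{
    long[string] result;
    foreach (p; pairs)
    {
        if (p[0] !in result)
        {
            result[p[0]] = p[1];
            continue;
        }
        final switch (policy)
        {
            case DupPolicy.lastWins:  result[p[0]] = p[1]; break;
            case DupPolicy.firstWins: break; // keep the first value
            case DupPolicy.error:     throw new Exception("duplicate key: " ~ p[0]);
        }
    }
    return result;
}

unittest
{
    auto pairs = [tuple("key", 1L), tuple("key", 2L)];
    assert(mergePairs(pairs, DupPolicy.lastWins)["key"] == 2);
    assert(mergePairs(pairs, DupPolicy.firstWins)["key"] == 1);
}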

@dlangBugzillaToGithub

dfj1esp02 commented on 2021-12-20T15:29:23Z

A documentation issue, right?

@dlangBugzillaToGithub

rumbu commented on 2021-12-20T16:34:11Z

(In reply to anonymous4 from comment #4)
> A documentation issue, right?

Yes, if we assume that std.json is not ECMA compliant.

I don't understand why stdx.data.json hasn't already replaced the old std.json. It can also handle my case of duplicate keys, since the parser is public.

@dlangBugzillaToGithub

salihdb commented on 2021-12-20T17:17:50Z

> ... at least std.json can state in its documentation that the last value wins ...
You're right...
