Add setting to throw errors when duplicate keys are detected #228
petey wants to merge 1 commit into nodeca:master
Conversation
|
Good to merge, but I'm not sure about the option name. Any ideas, @puzrin? By the way, I think this check was left out initially because of the tests we had. The YAML 1.2 spec clearly states that duplicate keys MUST cause an error. The confusion must have come from the YAML 1.1 spec, which (AFAIK) does not specify what to do with duplicate keys, and from YAML 1.1 implementations (Ruby's, for example), which treat duplicate keys just like JS-YAML currently does. I think we should actually fix our tests to conform to the YAML 1.2 rules and make the change this pull request adds the default.
|
I'm more than happy to make whatever changes are required to make throwing errors the default behavior. A number of tests in the loader test suite will currently fail.
|
No ideas about a better name. I'd like this mode to be on by default, but that seems to break compatibility with JSON. Can we turn it off automatically for `{ }` children?
|
My suggestion is to investigate whether we can solve this problem without introducing a new option: disable dupes for YAML but allow them for JSON. That would be the best solution for users, IMHO.
|
In JSON, you cannot have duplicate keys at the same level of an object. For example, this is invalid: There should be no difference between JSON and YAML in this respect.
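(The inline snippet appears to have been lost in the export; an object with a repeated key, along these lines, is presumably what was meant:)

```json
{ "a": 1, "a": 2 }
```

Strictly, the JSON RFC only says names SHOULD be unique, so many parsers accept this and keep the last value rather than rejecting it.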
|
So it would be correct to disable dupes completely and bump the version to 3.5.0?
|
@puzrin, YAML 1.2 is explicitly incompatible with JSON on this; it's stated in the spec. Actually, the JSON RFC also has a statement about key uniqueness, but it says that keys SHOULD be unique (i.e. it's a recommendation for implementations, not a requirement), while the YAML spec says that keys MUST be unique. In my opinion the best thing to do is to restrict duplicate keys globally by default, and add a flag to relax this.
I've tested:

```
$ node -e 'console.log(JSON.parse("{\"a\":1,\"a\":2}"))'
{ a: 2 }
```

Ok, agree that it's better to disable dupes by default and add the flag. @petey, could you update the PR? Also make sure the moved tests work with `{ json: true }`.
|
I've added the `json` flag and updated the tests.
|
Squash to a single commit, please.
|
LGTM after description update.
Force-pushed from 1520f8e to 470ea9e
|
OK, last thing. The construct-merge test should not be removed; it's the only test for the merge "type"! This test should also ensure that mapping values are overridden without errors if both sides share some keys.
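For reference, a minimal merge-key document of the kind that test exercises might look like this (a sketch based on the YAML `!!merge` type spec, not the actual test fixture):

```yaml
- &CENTER { x: 1, y: 2 }
- &LEFT   { x: 0, y: 2 }

# Merge one map: x and y come from CENTER, r is local.
- <<: *CENTER
  r: 10

# Merge several maps: keys from earlier maps in the list win, and an
# explicit key (label) overrides merged ones without raising an error.
- <<: [ *CENTER, *LEFT ]
  r: 10
  label: center/left
```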
|
I still have that test here: https://github.com/petey/js-yaml/blob/strictKeys/test/issues/0166.js#L37-L54 It will fail by default in the loader due to duplicate keys.
|
Ah, I see. Actually, you shouldn't do this. Just move
UPD: I mean it is NOT a failing case. That test must pass with the default loader settings.
|
@dervus I tried that.
|
Sorry, I was wrong. I've just refreshed my memory on the !!merge type. This type is implemented as a hack at the storeMappingPair level (there were reasons for that), and it needs to be changed slightly to handle both the unique-key policy and the !!merge behaviour. So, the rules are:
To implement this we need to keep a list of the keys merged in from other mappings, to distinguish them from "normal" keys. I think we'll have to keep this list at the readBlockMapping/readFlowSequence level. The sample file must work without changes. As you can see, that sample was taken from the type spec.
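A rough sketch of that bookkeeping (hypothetical code, not the actual js-yaml internals; the function and variable names are invented for illustration):

```javascript
// Hypothetical sketch of the rule described above, NOT the real
// js-yaml storeMappingPair implementation: track which keys arrived
// via << so an explicit key may override a merged one, while two
// explicit keys at the same level raise an error.
function storePair(result, mergedKeys, key, value, fromMerge) {
  if (fromMerge) {
    // A merged key never overrides an existing key (explicit keys win,
    // and among several merge sources the earlier one wins).
    if (!Object.prototype.hasOwnProperty.call(result, key)) {
      result[key] = value;
      mergedKeys[key] = true;
    }
  } else {
    // Two explicit keys at the same level are an error, but an explicit
    // key is allowed to override a key that arrived via <<.
    if (Object.prototype.hasOwnProperty.call(result, key) && !mergedKeys[key]) {
      throw new Error('duplicated mapping key: ' + key);
    }
    delete mergedKeys[key];
    result[key] = value;
  }
  return result;
}
```

A usage sketch: storing `x` via a merge and then storing `x` explicitly succeeds (the explicit value wins), while storing `x` explicitly a second time throws.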
…lt. Allow JSON to be parsed with duplicate keys, and also YAML when the `json` flag is true.
|
I'm afraid I've never been much good at language parsers, and I can't make heads or tails of this code. I added a gross hack to make the !!merge test work, but it won't do what you're suggesting it should. If this isn't enough, I'm throwing in the towel: I'm not able to spend any more time trying to figure this out, and you should feel free to close this PR.
|
@petey No problem, I'll fix !!merge myself next weekend, then.
|
Resolved. |
It is a really poor experience for my users when their YAML is parsed and, because duplicate keys silently override earlier values, they do not get what they expect. This PR intends to add a flag that forces an error to be thrown when duplicate keys are detected at the same depth of an object. The current default behavior is not changed.
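As an illustration of that surprise (a hypothetical document, not one from the PR):

```yaml
# With the current default behavior the second value silently wins,
# so "production" is lost without any warning or error.
environment: production
environment: staging
```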
Addresses issue #166