Use safe_merge to handle multiple definitions of a single field #821
Conversation
Actually the goal is specifically to allow that. A few ways this can be used are:

```yaml
- name: file
  reusable:
    expected:
      - my_custom_stuff
```
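As a concrete illustration, here is a minimal sketch of loading the snippet above, assuming PyYAML as the loader; the YAML itself is taken from the comment, everything else is illustrative:

```python
# Sketch: parsing the custom schema snippet above with PyYAML
# (an assumption about the loader; the YAML is from the comment).
import yaml

custom_schema = yaml.safe_load("""
- name: file
  reusable:
    expected:
      - my_custom_stuff
""")

# The `file` field set declares it should also be nested under
# the (hypothetical) `my_custom_stuff` field set.
print(custom_schema[0]["reusable"]["expected"])  # -> ['my_custom_stuff']
```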
Hmm, I think we might be talking about different things. I believe both of the cases you mentioned should still work correctly even with this change. The scenario I'm talking about is this: Let's say I have a custom schema for `file`.

Intentional custom schema:

Accidental custom schema:

subset file:

If I run the generator script without the change, this template will be produced:

It chooses whichever definition was read last. What the change is trying to prevent is a user (or ECS core) accidentally defining the top-level `file` schema twice.
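A minimal sketch of the failure mode described here, assuming the generator collects parsed schemas into a dict keyed by field set name; the schema contents and field types below are hypothetical:

```python
# Hypothetical sketch of the order-dependence described above; the
# field definitions are invented for illustration.
intentional = {"file": {"my_field": {"type": "keyword"}}}   # intentional custom schema
accidental = {"file": {"my_field": {"type": "wildcard"}}}   # accidental redefinition

merged = {}
for schema in (intentional, accidental):   # order = order the files are read
    merged.update(schema)                  # the last definition silently wins

print(merged["file"]["my_field"]["type"])  # -> wildcard, only because it was read last
```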
Ok, I misunderstood. Could you add unit tests for the following cases, please?
Yep, will do!
@jonathan-buttner I haven't forgotten this. However, I'm working on a massive PR that touches some of this code. I may just get this change in my PR. This is a good additional safeguard.
Thanks @webmat, sounds good!
@elasticmachine, run elasticsearch-ci/docs
Looks like these changes were merged as part of the larger PR mentioned above.
This PR addresses a potential issue where a user defining their own schema files accidentally defines a custom extension for the same field twice. For example, in Endpoint we extend the `file` schema:

If we had another file that redefines fields for `file`, the actual type used would depend on which schema file was read last by the ECS generator. This probably won't happen in practice, but I figured it was worth pointing out just in case.

With this change, it will throw an error indicating that `file` was found more than once:
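To illustrate the behavior the PR describes, here is a minimal sketch of a merge that refuses duplicate field set definitions. The function name `safe_merge_dicts` and the exact error wording are assumptions for illustration, not the actual generator code:

```python
# Hypothetical sketch of the duplicate check described above; not the
# actual ECS generator implementation.
def safe_merge_dicts(a, b):
    """Merge b into a copy of a, raising if any key is defined twice."""
    merged = dict(a)
    for key, value in b.items():
        if key in merged:
            raise ValueError(f"Schema {key} found more than once")
        merged[key] = value
    return merged

schemas_from_file_1 = {"file": {"name": "file"}}
schemas_from_file_2 = {"file": {"name": "file"}}

# Raises: ValueError: Schema file found more than once
safe_merge_dicts(schemas_from_file_1, schemas_from_file_2)
```

The key design choice is to fail loudly at generation time rather than silently keeping whichever definition happened to be read last.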