cannot set w/ numeric key on an undefined nested object #59
Comments
With a deep set, the only options I can think of are either to create the nested object as desired so that inference doesn't need to happen, or to have a post process that converts the array to an object.
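The post-process workaround could look something like this minimal sketch. `arrayToObject` is an illustrative helper name, not part of the unchanged API: it converts the array that numeric-key inference produced into a plain object keyed by index.

```javascript
// Hypothetical post-process helper (illustrative, not the unchanged API):
// turn an array produced by numeric-key inference into a plain object,
// keeping only the slots that actually hold a value.
const arrayToObject = (arr) =>
  arr.reduce((acc, item, index) => {
    if (item !== undefined) {
      acc[index] = item;
    }
    return acc;
  }, {});

// A deep set with a numeric key may yield e.g. [undefined, 'foo'];
// post-processing recovers the object shape: { '1': 'foo' }.
console.log(arrayToObject([undefined, 'foo']));
```

This keeps the library call unchanged and only reshapes its result, at the cost of the extra low-level code mentioned below.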
@planttheidea thanks for the tip - the workaround you described definitely works, but I end up with a bit more low-level code than I'd like. I could potentially see this kind of path with dot notation being an intuitive way to differentiate a deep set on an object with a numeric key: UC.set('0.1', 'foo', {}). Understandably, though, such a change could end up breaking other users' code. What are your thoughts on this?
I did think of something else that may satisfy this: use an array of string keys instead of dot notation. UC.set(['0', '1'], 'foo', {}); When the path is an array, no parsing is done, so the key remains a string. This should cue the underlying "new object inferrer" to use an object. I consider this an extreme edge case, and I'd rather not introduce a breaking change to far more common use cases to satisfy it.
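To illustrate the inference being described, here is a minimal, hypothetical sketch of a deep set (not the unchanged source): a dot-notation path is parsed so numeric segments become numbers, while an array path is used as-is, and the type of the next key decides whether a missing intermediate level becomes an array or an object.

```javascript
// Hypothetical sketch of key-type-based container inference
// (illustrative only; the real unchanged implementation differs).
const isNumericString = (key) => /^\d+$/.test(key);

function deepSet(path, value, obj) {
  // Dot-notation strings are parsed, and numeric segments become numbers;
  // an array path skips parsing entirely, so string keys stay strings.
  const keys = Array.isArray(path)
    ? path
    : path.split('.').map((k) => (isNumericString(k) ? Number(k) : k));

  const root = Array.isArray(obj) ? [...obj] : { ...obj };
  let node = root;

  for (let i = 0; i < keys.length - 1; i++) {
    const key = keys[i];
    if (node[key] == null) {
      // Infer the container for the missing level from the NEXT key:
      // a numeric key yields an array, anything else an object.
      node[key] = typeof keys[i + 1] === 'number' ? [] : {};
    }
    node = node[key];
  }

  node[keys[keys.length - 1]] = value;
  return root;
}

// Dot notation: '0.1' parses to numeric keys, so an array is created.
console.log(JSON.stringify(deepSet('0.1', 'foo', {})));
// Array of string keys: no parsing, so a plain object is created.
console.log(JSON.stringify(deepSet(['0', '1'], 'foo', {})));
```

Under this sketch, the dot-notation call yields an array at the nested level, while the array-of-string-keys call yields the nested object the original report expected.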
I'm trying to use set with a numeric key on a nested object that does not exist (i.e. it's undefined in the parent object).

expected result:

actual result:

set seems to yield an array by default when the key is numeric? Is there a way to specify the desired behavior and get the expected result?

v: unchanged@2.2.1