Add configurable maximum depth in ConvertFrom-Json with -Depth #8199
Wondering if a custom type SerializerSettings to encapsulate the default maxDepth of 1024 and returnHashtable of false here would be better than adding overloads.
If there are other settings we would want in the future, then that would be a better approach.
I would consider the option to set -Depth to less than 0 for no maximum depth. 0 or $Null are often the result of a wrong calculation. Besides, 0 could technically still be a desired "depth".
@iRon7 How so? Wouldn't that mean you want the deserialization to always fail? Is that really a use case? Also, the underlying Newtonsoft.Json does not allow 0 or less as MaxDepth; it throws at runtime.
For a sample recurring hash table like:

A

$ht | ConvertTo-Json -Depth 1

results in:

I was expecting

$ht | ConvertTo-Json -Depth 0

to result in:

But I indeed get an error:

Anyway, wouldn't it be better to get either a completely flat output or an error in the case of a Depth of 0 or $Null (from an incorrectly calculated, unassigned, or mistyped variable)? I think a negative Depth would just be more knowingly chosen. In other words, I do not really expect this to be a desired use case, but in the case of a scripting failure, I think a flat result or an error is clearer than a possible performance issue.

Just a thought, the decision is up to you...
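For context, here is a minimal sketch of the kind of self-referencing hash table under discussion. The variable name $ht matches the commands above, but the table's actual contents are an assumption, since the original snippet did not survive in the quoted thread:

```powershell
# Hypothetical recurring (self-referencing) hash table
$ht = @{ Name = 'Root' }
$ht.Child = $ht    # the table now contains itself, so its logical depth is unbounded

# With a small -Depth, ConvertTo-Json stops descending at the limit and
# warns that the resulting JSON was truncated at the set depth
$ht | ConvertTo-Json -Depth 1
```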
I'm still of the opinion that unlimited is not needed. 1 to Int32.MaxValue is sufficient. Any JSON deeper than Int32.MaxValue is either broken or contrived.
In the real world, I expect the default of 1024 to be sufficient, so I would be ok with not needing unlimited. -Depth 0 for only the top level makes sense and is consistent with Get-ChildItem -Depth 0.
@iRon7 after thinking it through, I agree a negative value might be better than 0.

@markekraus I understand PowerShell needs to be secure by default, but I think this is an opportunity to provide the option for people to just say "I know what I'm doing, get out of my way". Sure, you could have them type -Depth:([int]::MaxValue), but that feels less convenient in my opinion, and it would still have the overhead (however small) of the serializer repeatedly checking that the max depth isn't exceeded (not the case if you pass null).

What do you guys think of having -Depth -1 for unlimited and maybe throwing an error in BeginProcessing() if -Depth is 0?
It's not about security; it's about absurdity. The only JSON objects that are that deep are contrived (made deep for fun or specifically to test depth limits) or broken (whatever produced the JSON had some issue). 1024 is already absurdly deep for a JSON object. We already catch outliers with the default setting. Between 1024 and Int32.MaxValue is the realm of intrigue only. We don't need special-case logic for that. We don't even really need infinite depth. What is the target audience for such a feature? Who really has objects that deep?

Just because we can doesn't mean we should. I still stand by my statement that 1 to Int32.MaxValue is sufficient.
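For anyone wanting to see the existing limit trip, here is a sketch that builds JSON nested past the default maximum depth. The chosen nesting count and the exact error text are assumptions based on the 1024 default discussed above:

```powershell
# Construct JSON nested 2000 levels deep, i.e. {"a":{"a":{ ... 1 ... }}}
$levels = 2000
$deep = ('{"a":' * $levels) + '1' + ('}' * $levels)

# Deserialization fails once the reader exceeds its maximum depth;
# the -Depth parameter proposed here would let callers raise that limit
$deep | ConvertFrom-Json
```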