
Add configurable maximum depth in ConvertFrom-Json with `-Depth` #8199

Merged: 8 commits, Feb 20, 2019

Conversation

louistio (Contributor) commented Nov 7, 2018

PR Summary

The ConvertFrom-Json cmdlet is currently limited to deserializing JSON input with a maximum depth of 1024. An exception is thrown if the input's depth exceeds that maximum.

This PR:

  • Adds an optional -Depth parameter to the cmdlet, which lets the user specify the maximum depth allowed for deserialization, overriding the default maximum of 1024.

Closes #3182
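
A quick usage sketch of the new parameter (illustrative; the generated input is a placeholder):

# Build JSON nested deeper than the default maximum of 1024 levels
$deepJson = 'true'
for ($i = 0; $i -lt 1100; $i++) { $deepJson = '{"a":' + $deepJson + '}' }

$deepJson | ConvertFrom-Json                # fails: input exceeds the default maximum depth
$deepJson | ConvertFrom-Json -Depth 2048    # succeeds with an explicitly raised maximum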

PR Checklist

iSazonov (Collaborator) left a comment

We need PowerShell Committee approval:

  • Add a new parameter -MaxDepth: should it be int or int? (nullable)?
  • Add a new overload:
public static object ConvertFromJson(string input, bool returnHashtable, int? maxDepth, out ErrorRecord error)

/cc @SteveL-MSFT

/// </summary>
[Parameter]
[ValidateRange(ValidateRangeKind.Positive)]
public int? MaxDepth { get; set; }

iSazonov (Collaborator) commented Nov 7, 2018

I think it should be:

Suggested change:
- public int? MaxDepth { get; set; }
+ public int MaxDepth { get; set; } = int.MaxValue;

vexx32 (Collaborator) commented Nov 7, 2018

Don't think we have any default cmdlets with nullable input types, so I don't see any need for that. I would think that since it's a MaxDepth parameter we could either:

  1. Opt for uint (because negative depth is insensible), with a maximum perhaps set at something comparable to the ConvertTo-Json command, which has 2 as its default -Depth, or
  2. Simply set the default to -1 and have the code paths treat that as "no maximum depth".

louistio (Author, Contributor) commented Nov 7, 2018

Thank you for the feedback; as this is my first feature contribution, I wasn't sure how to implement everything 😅

@iSazonov your suggestion would mean there would be no way to differentiate between no maximum depth and the user wanting a maximum depth of 2147483647. I think the value representing no maximum depth should really be outside the user-input range for maximum depth (1 - 2147483647)!

@vexx32 I could use uint with a range of 1 - int.MaxValue and use 0 as the default value representing no maximum. I could also use int with -1 as the default value representing no maximum. If that's what the team prefers, I don't mind; I was just using null as a way to represent the parameter not being specified. 😄

vexx32 (Collaborator) commented Nov 7, 2018

Yeah, you have a point there; a maximum depth of 0 is also pretty silly. However, given that the underlying JSON API uses int for these parameters, as @iSazonov pointed out, we can't safely use uint, as it would allow the user to specify values that the underlying API can't handle.

I think we need some clarification on whether the JSON API itself has a defined specification for "no maximum depth" and what that would look like.

@markekraus I know you worked on Invoke-RestMethod a good bit; do you have any insight to offer on the JSON API that might help here?

louistio (Author, Contributor) commented Nov 7, 2018

@vexx32 I meant I could do uint with [ValidateRange(1, int.MaxValue)], effectively not allowing user input values outside the underlying API's range (but we'd have to cast the value, so it might just be better to use int directly?).

Also, I was proposing 0 because the underlying API doesn't allow 0 for MaxDepth, only 1 - int.MaxValue (it throws an exception at runtime; see here).
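
For reference, this can be seen from pwsh (a sketch, assuming the Newtonsoft.Json assembly that ships with PowerShell is loaded):

$settings = [Newtonsoft.Json.JsonSerializerSettings]::new()
$settings.MaxDepth        # $null by default, which Json.NET treats as "no depth limit"
$settings.MaxDepth = 0    # throws: Json.NET requires MaxDepth to be positive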

vexx32 (Collaborator) commented Nov 7, 2018

Oh, that's just funny at that point. If you're going to throw that early... why not just make it uint haha!

markekraus (Collaborator) commented Nov 9, 2018

The underlying type is an int, so we should probably accept an int with [ValidateRange(1, int.MaxValue)]. Realistically, JSON with a depth greater than Int32.MaxValue is probably bad JSON to begin with... A nullable type for this would be weird. Accepting -1 is also weird.

SteveL-MSFT (Member) commented Nov 7, 2018

@PowerShell/powershell-committee reviewed this; the design should be a -Depth parameter (which aligns with ConvertTo-Json) that defaults to 1024 and that the user can change as needed. This is to preserve existing behavior.

louistio (Author, Contributor) commented Nov 8, 2018

@SteveL-MSFT with all due respect, may I ask you to reconsider this decision? The -Depth parameter in ConvertTo-Json does not have the same behavior as the -MaxDepth I implemented here:

-Depth in ConvertTo-Json quietly suppresses conversion past the specified depth level (2 by default), for example:

PS C:\> ConvertTo-Json (ConvertFrom-Json '{"1":{"2":{"3":{"4":{"5":1}}}}}')
{
  "1": {
    "2": {
      "3": "@{4=}"
    }
  }
}

-MaxDepth as implemented here strictly fails the conversion if the depth of the json exceeds the value, the same way ConvertFrom-Json currently fails if the depth is higher than 1024 and ConvertTo-Json currently fails if the depth is higher than 100. With this in mind, -MaxDepth could also be implemented for ConvertTo-Json and is not equivalent to its -Depth parameter.

Furthermore, I believe the current -Depth parameter of ConvertTo-Json is not intuitive and has caused a lot of confusion for users in the past (there are lots of threads about it on StackOverflow, for example). What JSON serializer would you expect to stop serializing at the third depth level by default? This forces the user to know beforehand what depth their input has in order for conversion to work properly (an unrealistic thing to expect). I don't think we should model new parameters on it.

I also think making 1024 the default is a bad idea. It would leave no way to say "I don't want a maximum depth" (which is the most common scenario). It would also lead to the same problem ConvertTo-Json currently has: we expect the user to know the depth of their input beforehand.

I know it would be a breaking change to make "no maximum" the new default, but is it really a common scenario that someone relies on the default behavior failing on input deeper than 1024? I think handling higher depths with this new parameter is the perfect opportunity to change the default behavior to something more intuitive: the cmdlets should handle whatever is thrown at them, and if the user worries the input could cause a DoS on their system (which I assume was the reason for this default maximum depth), they can optionally specify a maximum depth.

iSazonov (Collaborator) commented Nov 8, 2018

So we should discuss:

  • Depth vs MaxDepth
  • Defaults
  • Nullable type for the parameter

My thoughts:

  • Depth name looks good to be consistent with ConvertTo-Json.
  • The current default in ConvertTo-Json is 2. What is the history behind that value? It seems the default for ConvertFrom-Json should be larger, and the defaults for both cmdlets should be the same for better UX: $source | ConvertFrom-Json | ConvertTo-Json is expected to return $source (see the example below).
  • I believe a nullable type makes no sense in scripts.
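
To make the round-trip expectation concrete (illustrative, with the defaults at the time):

$source = '{"a":{"b":{"c":{"d":1}}}}'
$source | ConvertFrom-Json | ConvertTo-Json
# With ConvertTo-Json's default -Depth of 2, "c" comes back as "@{d=1}"
# instead of a nested object, so the round trip does not return $source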

/cc @markekraus @mklement0 What do you think?

SteveL-MSFT (Member) commented Nov 8, 2018

@louistio to be clear, the decision regarding -Depth is simply about the name of the parameter, not the semantics. At the time of discussion, we were not aware that the "maxdepth" limit would fail the entire operation rather than serialize up to 1024. This (to me) seems broken, but changing it would be a breaking change. In light of that information, it's worth discussing again with the committee whether there should be two parameters (which would be confusing to the user) or whether to accept a breaking change at the limit.

It would also be useful if you can provide examples where the current limit of 1024 is insufficient.

louistio (Author, Contributor) commented Nov 8, 2018

@SteveL-MSFT Thank you! To be clear, the naming is not a big concern to me; it's more about the bad user experience of the default, which I described later in my post. 😄

joeyaiello (Member) commented Nov 15, 2018

@PowerShell/powershell-committee had a very bikeshed conversation about this, and we understand the parallel-concepts argument, with this rough visualization:

ConvertTo-Json      -Depth (produces partial output)     MaxDepth is internal, fixed at 100
ConvertFrom-Json    [-Depth] (a cut-off doesn't make     MaxDepth is internal at 1024 and settable
                    sense, since we don't have types)    (could be called -MaxDepth, or -Depth for PS consistency)

Does anyone else actually care?

rjmholt (Member) commented Nov 15, 2018

Two cents:

  • The MaxDepth internal value in ConvertTo-Json must have a low default because PowerShell must be secure by default: omitting a parameter shouldn't open up a DoS attack, but if you set it to a high (or negative, to turn it off) value, you're on your own (see the illustration below)
  • MaxDepth makes sense as a settable parameter that a user can specify to avoid errors with large objects
  • Exposing that, it should be symmetric with anything on ConvertFrom-Json where symmetry makes sense
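
To make the first bullet concrete, a hostile input is cheap to construct (illustrative):

# ~700 KB of text yields 100,000 nesting levels; without a depth cap,
# a recursive deserializer could exhaust the stack on input like this
$hostile = ('{"a":' * 100000) + '1' + ('}' * 100000)
$hostile | ConvertFrom-Json   # the default maximum depth rejects this instead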
iSazonov (Collaborator) commented Nov 15, 2018

@markekraus @mklement0 Do you have any thoughts about the consistency of the Depth parameters?

iSazonov added the CL-General label Nov 16, 2018
markekraus (Collaborator) commented Nov 18, 2018

I agree with @rjmholt

iSazonov (Collaborator) commented Nov 18, 2018

@joeyaiello @SteveL-MSFT Seems we can draw a final conclusion and continue the PR.

mklement0 (Contributor) commented Nov 18, 2018

In terms of naming, -Depth is currently consistently being used to denote a maximum recursion depth.

Regrettably, exceeding that maximum depth doesn't cause an error in ConvertTo-Json, but results in near-useless .psobject.ToString() serialization of property values beyond the depth.
In combination with the low default -Depth value of 2, that makes for frequent user frustration; as @louistio notes, the current ConvertTo-Json behavior frequently amounts to quiet de-facto failure.

In a utopian world, unencumbered by backward-compatibility concerns:

  • Having a high-enough default that accommodates most use cases, yet prevents DoS attacks, as @rjmholt suggests, makes sense.

  • Exceeding that default must fail.

  • You should be able to opt out of the default fail-if-exceeded limit with an explicit -MaxDepth value, again with failure if exceeded; to avoid the awkwardness of -MaxDepth ([int]::MaxValue) or -MaxDepth -1, an additional -NoMaxDepth switch could be considered, if really needed.

  • You should be able to intentionally cut off recursion with -Depth, similar to how Get-ChildItem uses the parameter: while there may be additional levels, you choose to ignore them.

louistio (Author, Contributor) commented Nov 18, 2018

@rjmholt Fair point about PowerShell being secure by default.

@mklement0 I agree with everything you're saying; I think you basically put into words what I was struggling to convey. I like the idea of a -NoMaxDepth switch, more elegant than another magic value!

I would like to implement an equivalent to -MaxDepth for ConvertTo-Json too, to allow users to change the current default value of 100. If we're naming this parameter -Depth, it would conflict with the current ConvertTo-Json parameter, unless the team decides to break its backwards compatibility and make it explicitly fail from now on. In my opinion this is the best option, because the current behavior of that parameter is a pain for PowerShell users for the reasons I explained above, but I understand the backward-compatibility concerns.

markekraus (Collaborator) commented Nov 19, 2018

@louistio What would be the difference between -Depth and -MaxDepth on ConvertTo-Json?

louistio (Author, Contributor) commented Nov 19, 2018

@markekraus As mentioned above, the behavioral difference lies in whether the cmdlet fails on input exceeding the maximum depth: -Depth silently suppresses exceeding depth (soft), while -MaxDepth fails the execution (hard).

Currently ConvertTo-Json has a default soft depth of 2 (configurable with -Depth) and a default hard max depth of 100 (non-configurable). ConvertFrom-Json has no concept of a soft max depth and a default hard max depth of 1024 (non-configurable today; making it configurable is what this PR adds).

What I'm saying is I'd be fine with changing ConvertTo-Json's -Depth into a hard max depth, but I know this is compatibility-breaking. (See the illustration below.)
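
A quick illustration of the soft/hard distinction (illustrative):

# Soft: ConvertTo-Json quietly stringifies anything past the cut-off
@{ a = @{ b = @{ c = 1 } } } | ConvertTo-Json -Depth 1
# => {"a":{"b":"System.Collections.Hashtable"}}

# Hard: ConvertFrom-Json fails outright once input exceeds its cap
$tooDeep = 'true'
for ($i = 0; $i -lt 1025; $i++) { $tooDeep = '{"x":' + $tooDeep + '}' }
$tooDeep | ConvertFrom-Json    # error: the maximum depth of 1024 is exceeded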

markekraus (Collaborator) commented Nov 19, 2018

I think it would be confusing to have both Depth and MaxDepth. We should have one parameter for the depth and one to adjust the behavior at that depth.

mklement0 (Contributor) commented Nov 19, 2018

True, having both -MaxDepth and -Depth could be confusing; on further reflection, all we need is -Depth:

-Depth can be the opt-in, I-know-that-many-levels-is-all-I-need-or-I-really-want-that-many-levels parameter.

(What is conceptually) -MaxDepth needn't be user-facing (but needs to be documented): it can be a fixed, documented limit that is chosen high enough to (a) accommodate most use cases while (b) preventing DoS attacks, and exceeding that limit causes an error.

-Depth then does double duty:

  • You use it for data whose depth falls below the documented, fixed max. depth, in case you intentionally want to cut it off at the specified depth.

  • You also use it for data whose depth happens to exceed the internal (but documented) max. depth and that you nonetheless want to serialize/deserialize in full.

I personally then don't see a need for any other parameters - I think opting into levels beyond the fixed limit with an explicit number or even with -Depth ([int]::MaxValue) (without the need for an abstract -NoMaxDepth switch) strikes me as acceptable, because I presume this to be a rare use case.

In the absence of using -Depth in a given command:

  • You either live happily ever after, given that most commands won't hit the fixed limit.

  • You are alerted to a problem through the error that is reported if your data unexpectedly exceeds the fixed limit (in which case you'll need to use -Depth, if using that many levels is really your intent).

As for required changes and backward compatibility:

  • Remove the default -Depth of 2 from ConvertTo-Json (from what I understand, it already has a fixed, internal limit of 100 beyond which it fails).

  • Document the fixed limits.

  • The only way existing code could be impacted is that payloads generated with ConvertTo-Json could now increase, if users previously relied on the implicit cut-off at depth 2; that strikes me as a Bucket 3: Unlikely Grey Area change.

markekraus (Collaborator) commented Nov 19, 2018

There are two things: 1) the maximum depth you want to convert, and 2) the behavior of that conversion. They both need to be configurable, IMO, and relying on a single param for both would be a pain.

There are a very large number of objects passed to ConvertTo-Json for which only having a max depth that errors would make them impossible to convert. We need the ability to squash. This includes object types that have parent-child mappings; there would be no way to convert those without squashing.

Consider the Current behavior:

cmdlet            Max depth  Default Depth  Depth Configurable  Default Behavior  Behavior Configurable
ConvertTo-Json    100        2              yes                 Silently Squash   no
ConvertFrom-Json  1024       1024           no                  Error             no

I would ultimately like to see it as this:

cmdlet            Max depth       Default Depth  Depth Configurable  Default Behavior  Behavior Configurable
ConvertTo-Json    Int32.MaxValue  2              yes                 Error             yes
ConvertFrom-Json  Int32.MaxValue  1024           yes                 Error             yes

Ideally we would want a -Depth param that sets the max depth and a -Squash switch (or something). We still need default depths (2 for ConvertTo-Json and 1024 for ConvertFrom-Json). It would be nice to add the ability to change the behavior of ConvertTo-Json, but that's a bit out of scope for this PR. It would also be nice to be able to squash with ConvertFrom-Json, but that is also a bit out of scope for this PR (or maybe it is easier than I think?).

But, keeping in mind that we could add parameters to control the behavior later, for this PR all we need is to add -Depth to ConvertFrom-Json with a default of 1024, of type int, with a ValidateRange of 1 to Int32.MaxValue. I don't see a pressing need to change the behavior of ConvertTo-Json in the same PR, but if we did, I would recommend changing the internal max depth from 100 to Int32.MaxValue (keeping -Depth at a default of 2). That way we are safe from DoS by default, but let users hold their fate in their own hands should they wish to try an absurdly deep object.

A bit of flavor text: a JSON object over 1024 levels deep is an outlier. Other than abused/broken APIs or contrived examples, I have rarely seen a legit JSON object deeper than ~30. On the flip side, infinitely deep objects are very commonly thrown at ConvertTo-Json (based on my experience, YMMV; see the example below).
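
An example of the "infinitely deep" case (illustrative):

# A self-referencing object has unbounded depth; only a soft cut-off makes it convertible
$a = @{ Name = 'a' }
$a.Self = $a
$a | ConvertTo-Json -Depth 5   # truncates at depth 5 instead of recursing forever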

iSazonov (Collaborator) commented Nov 19, 2018

I don't see a pressing need to change the behavior of ConvertTo-Json in the same PR, but if it was I would recommend

I think it is better to make both Depth changes in this PR, to get a clean history, and continue with the rest of the ideas in follow-up PRs.

mklement0 (Contributor) commented Nov 26, 2018

Thanks for the analysis, @markekraus, but I still think -Depth is enough, for both ConvertTo-Json and ConvertFrom-Json:

  • In the absence of -Depth, exceeding the documented, hard-coded max. depth will result in an error.

  • Use of -Depth never results in an error, and is used intentionally to:

    • truncate the object tree at the specified level (whether that level is below or above the hard-coded max. depth)
    • opt into object trees that are deeper than the hard-coded max. depth

To put it differently: the hard-coded max. depth is then no longer a default value for -Depth, it is a built-in safety mechanism that can be overridden with -Depth, if needed (which, with reasonably high defaults, will rarely be necessary).

If you have a "runaway" tree of infinite depth, then the hard-coded max. will save you.
If you actually do need more depth, you can opt in with -Depth.

SteveL-MSFT (Member) commented Nov 28, 2018

@PowerShell/powershell-committee appreciates the discussion, and we still suggest having a single -Depth defaulting to 1024 on ConvertFrom-Json to preserve existing behavior, and not making any changes to erroring out vs. truncation, as there seems to be no practical reason to force symmetry between the two while incurring a breaking change.

louistio force-pushed the louistio:convertfrom-json-max-depth branch from 260df49 to 3f6391d on Jan 9, 2019
louistio (Author, Contributor) left a comment

Rebased and updated according to the committee's comments. I took the liberty of adding the option to set -Depth to 0 to mean no maximum depth, as I think it's a good option to provide users, and the committee did not say anything against it.

@@ -50,6 +50,23 @@ public static object ConvertFromJson(string input, out ErrorRecord error)
/// if the <paramref name="returnHashtable"/> parameter is true.</returns>
[SuppressMessage("Microsoft.Naming", "CA1704:IdentifiersShouldBeSpelledCorrectly")]
public static object ConvertFromJson(string input, bool returnHashtable, out ErrorRecord error)
{
return ConvertFromJson(input, returnHashtable, maxDepth: 1024, out error);

louistio (Author, Contributor) commented Jan 9, 2019

Wondering if a custom SerializerSettings type encapsulating the default maxDepth of 1024 and returnHashtable of false would be better here than adding overloads.

SteveL-MSFT (Member) commented Jan 10, 2019

If there are other settings we would want in the future, then that would be a better approach.

iRon7 commented Jan 10, 2019

I would consider the option to set -Depth to less than 0 for no maximum depth.
0 or $Null are often the result of a wrong calculation. Besides, 0 could technically still be a desired "depth".

louistio (Author, Contributor) commented Jan 10, 2019

@iRon7

Besides, 0 could technically still be a desired "depth".

How so? Wouldn't that mean you want the deserialization to always fail? Is that really a use case? Also, the underlying Newtonsoft.Json does not allow a MaxDepth of 0 or less; it throws at runtime.

iRon7 commented Jan 10, 2019

For a sample recursive hashtable like:

$ht = @{
	Name = "Tree"
	Parent = @{
		Name = "Parent"
		Child = @{
			Name = "Child"
		}
	}
}
$ht.Parent.Child.Parent = $ht.Parent

Running $ht | ConvertTo-Json -Depth 1 results in:

{
    "Parent":  {
                   "Child":  "System.Collections.Hashtable",
                   "Name":  "Parent"
               },
    "Name":  "Tree"
}

I was expecting for $ht | ConvertTo-Json -Depth 0 to result in:

{
    "Parent":  "System.Collections.Hashtable",
    "Name":  "Tree"
}

But I indeed get an error:

ConvertTo-Json : Cannot validate argument on parameter 'Depth'. The 0 argument is less than the minimum allowed range
of 1. Supply an argument that is greater than or equal to 1 and then try the command again.
At line:1 char:29
+ $ht | ConvertTo-Json -Depth 0
+                             ~
    + CategoryInfo          : InvalidData: (:) [ConvertTo-Json], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationError,Microsoft.PowerShell.Commands.ConvertToJsonCommand

Anyway, wouldn't it be better to get either a completely flat output or an error in the case of an (incorrectly calculated, unassigned, or mistyped) Depth of 0 or $Null?
I think a negative Depth would be more knowingly chosen.
In other words, I do not really expect it to be a desired use case, but in the case of a scripting failure, I think a flat result or an error is clearer than a possible performance issue.

Just a thought, the decision is up to you...

markekraus (Collaborator) commented Jan 10, 2019

I'm still of the opinion that unlimited is not needed. 1 to Int32.MaxValue is sufficient. Any JSON deeper than Int32.MaxValue is either broken or contrived.

SteveL-MSFT (Member) commented Jan 11, 2019

In the real world, I expect the default of 1024 to be sufficient, so I would be OK with not needing unlimited. -Depth 0 for only the top level makes sense and is consistent with Get-ChildItem -Depth 0.
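
For reference, the Get-ChildItem analogy (illustrative):

Get-ChildItem -Path $HOME -Depth 0   # recurse zero extra levels: immediate children only,
                                     # the same meaning -Depth 0 would have here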

louistio (Author, Contributor) commented Jan 18, 2019

@iRon7 after thinking it through, I agree a negative value might be better than 0.

@markekraus I understand PowerShell needs to be secure by default, but I think this is an opportunity to give people the option to just say "I know what I'm doing, get out of my way". Sure, you could have them type -Depth:([int]::MaxValue), but that feels less convenient in my opinion, and it would still have the overhead (however small) of the serializer repeatedly checking that the max depth isn't exceeded (not the case if you pass null).

What do you guys think of having -Depth -1 for unlimited, and maybe throwing an error in BeginProcessing() if -Depth is 0?

markekraus (Collaborator) commented Jan 18, 2019

It's not about security; it's about absurdity. The only JSON objects that are that deep are contrived (made deep for fun or specifically to test depth limits) or broken (whatever produced the JSON had some issue). 1024 is already absurdly deep for a JSON object. We already catch outliers with the default setting. Between 1024 and Int32.MaxValue is the realm of intrigue only. We don't need special-case logic for that, and we don't really need infinite depth either. What is the target audience for such a feature? Who really has objects that deep?

Just because we can doesn't mean we should. I still stand by my statement that 1 to Int32.MaxValue is sufficient.

louistio changed the title from "Deserialize objects with greater depth than 1024 in ConvertFrom-Json" to "Add configurable maximum depth in ConvertFrom-Json with `-Depth`" on Jan 10, 2019
iSazonov (Collaborator) commented Jan 10, 2019

@markekraus @mklement0 @SteveL-MSFT Please update your review.

@louistio Please fix StyleCop issues.

MetadataPropertyHandling = MetadataPropertyHandling.Ignore
};

if (maxDepth != null)

SteveL-MSFT (Member) commented Jan 10, 2019

.MaxDepth == null means no limit, so it doesn't seem this check is necessary

@{ Depth = 2; AsHashtable = $false }
@{ Depth = 200; AsHashtable = $true }
@{ Depth = 200; AsHashtable = $false }
@{ Depth = 2000; AsHashtable = $true }

SteveL-MSFT (Member) commented Jan 10, 2019

How long does this take to run? If longer than 2 seconds, we should make these Feature tests instead of CI.


It 'Fails to convert an object of depth higher than 1024 by default with AsHashtable switch set to <AsHashtable>' -TestCases $testCasesWithAndWithoutAsHashtableSwitch {
Param($AsHashtable)
$nestedJson = GenerateNestedJson -Depth:1989

SteveL-MSFT (Member) commented Jan 10, 2019

Wouldn't 1025 be sufficient? No need to spend extra CPU here.

@{ Depth = 2000; AsHashtable = $false }
)

function GenerateNestedJson {

SteveL-MSFT (Member) commented Jan 10, 2019

prefer New-NestedJson
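
A minimal sketch of such a helper (hypothetical; the actual test helper in the PR may differ):

function New-NestedJson {
    Param([ValidateRange(1, [int]::MaxValue)] [int] $Depth)
    # Build a JSON string nested $Depth levels deep, e.g. 3 -> {"1":{"2":{"3":true}}}
    $json = 'true'
    for ($i = $Depth; $i -ge 1; $i--) {
        $json = '{"' + $i + '":' + $json + '}'
    }
    return $json
}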

louistio added 3 commits Jan 18, 2019
function Count-ObjectDepth {
Param([PSCustomObject] $InputObject)

for ($i=1; $i -le 100000; $i++)

SteveL-MSFT (Member) commented Feb 2, 2019

Since you have a max depth for checking here, you should probably enforce this limit in New-NestedJson. Also, I'm not sure we need to test this deep; perhaps 2048 is sufficient.

iSazonov self-assigned this Feb 2, 2019
louistio (Author, Contributor) commented Feb 9, 2019

@iSazonov Hey! Sorry, I wasn't aware of the colon convention 😅 I applied your suggestions through GitHub's integrated tool, but there seem to be errors when running the tests without the colons! I get errors like:

ConvertFrom-Json : The input object cannot be bound to any parameters for the command either because the command does not take pipeline input or the input and its properties do not match any of the parameters that take pipeline input.

I'm not super familiar with why removing the colons would cause this; could you point me in the right direction? Thanks!

iSazonov (Collaborator) commented Feb 9, 2019

@louistio Sorry, the colon is needed for AsHashtable (a switch parameter) but not for Depth.
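
The reason the colon matters (illustrative): without it, the value after a switch becomes a separate positional argument, which then conflicts with the pipeline input:

'{"a":1}' | ConvertFrom-Json -AsHashtable:$true -Depth 200   # $true is bound to the switch
'{"a":1}' | ConvertFrom-Json -AsHashtable $true -Depth 200   # $true binds positionally to -InputObject,
                                                             # so the piped input can no longer be bound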

iSazonov (Collaborator) commented Feb 10, 2019

@SteveL-MSFT @markekraus Please update your review.

markekraus (Collaborator) left a comment

Minor cleanup requested.

iSazonov (Collaborator) commented Feb 11, 2019

@SteveL-MSFT Please update your review.

iSazonov (Collaborator) commented Feb 20, 2019

@louistio Please update the PR description and we'll merge.

louistio (Author, Contributor) commented Feb 20, 2019

@iSazonov Description updated. I believe this still requires documentation changes; I'm not sure if that blocks merging.

iSazonov (Collaborator) commented Feb 20, 2019

@louistio Please open a new issue in the PowerShell-Docs repo and add a reference to it in the PR description.

louistio mentioned this pull request Feb 20, 2019
louistio (Author, Contributor) commented Feb 20, 2019

@iSazonov done!

iSazonov merged commit caf8ac6 into PowerShell:master on Feb 20, 2019
8 of 9 checks passed:

Codacy/PR Quality Review: Not up to standards. This pull request quality could be better.
CodeFactor: 29 issues fixed. 2 issues found.
PowerShell-CI-install-ps #PR-8199-20190211.01: succeeded
PowerShell-CI-linux #PR-8199-20190211.01: succeeded
PowerShell-CI-macos #PR-8199-20190211.01: succeeded
PowerShell-CI-static-analysis #PR-8199-20190211.01: succeeded
PowerShell-CI-windows #PR-8199-20190211.01: succeeded
WIP: Ready for review
license/cla: All CLA requirements met.
iSazonov (Collaborator) commented Feb 20, 2019

@louistio Thanks for the great work!
