
encoding/json: provide a way to limit recursion depth #31789

Open
mgritter opened this issue May 1, 2019 · 7 comments

Comments

@mgritter

commented May 1, 2019

What version of Go are you using (go version)?

$ go version
go version go1.12.4 linux/amd64

What did you do?

A JSON string of about 17MB, constructed by the linked code snippet, is enough to cause a stack overflow with the default 1GB stack limit. A different construction can produce this behavior using only about a 5MB string.

https://play.golang.org/p/MLn5clWEIOl
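
For reference, a minimal sketch of that kind of construction (not the playground snippet verbatim; the depth constant below is only illustrative):

package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	// Deeply nested arrays: "[[[[ ... ]]]]". Around 9 million levels gives an
	// input of roughly 18 MB, in the same ballpark as the figure above.
	const depth = 9000000
	data := strings.Repeat("[", depth) + strings.Repeat("]", depth)

	var v interface{}
	err := json.Unmarshal([]byte(data), &v)
	// On affected versions the decoder recurses once per nesting level and
	// overflows the 1 GB goroutine stack limit before this line is reached.
	fmt.Println(err)
}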

What did you expect to see?

Applications that use encoding/json to parse API calls can limit their maximum request size to avoid this resource exhaustion attack, but that limit may be too small for some applications, and the amount of stack used can still be a problem even if the application does not crash. It would be desirable for the package to offer some way of enforcing a maximum array or object nesting depth during parsing, so that applications using encoding/json can continue to accept large API calls.
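
In the meantime, an application can enforce such a limit itself by pre-scanning the token stream with json.Decoder.Token, which does not recurse, so the check itself is safe on hostile input. A sketch (the checkDepth helper and the limit of 3 are invented for illustration, not part of any proposal):

package main

import (
	"bytes"
	"encoding/json"
	"errors"
	"fmt"
	"io"
)

// checkDepth walks the token stream and reports an error if the nesting of
// arrays/objects exceeds maxDepth. It is an application-level guard, not a
// feature of encoding/json.
func checkDepth(data []byte, maxDepth int) error {
	dec := json.NewDecoder(bytes.NewReader(data))
	depth := 0
	for {
		tok, err := dec.Token()
		if err == io.EOF {
			return nil
		}
		if err != nil {
			return err
		}
		if d, ok := tok.(json.Delim); ok {
			switch d {
			case '[', '{':
				depth++
				if depth > maxDepth {
					return errors.New("json: maximum nesting depth exceeded")
				}
			case ']', '}':
				depth--
			}
		}
	}
}

func main() {
	fmt.Println(checkDepth([]byte(`[[[[1]]]]`), 3))   // json: maximum nesting depth exceeded
	fmt.Println(checkDepth([]byte(`{"a":[1,2]}`), 3)) // <nil>
}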

I'm mainly familiar with C++ implementations that do this, for example rippled:
https://github.com/ripple/rippled/blob/5214b3c1b09749420ed0682a2e49fbbd759741e5/src/ripple/protocol/impl/STParsedJSON.cpp#L730

Or bitshares:
https://github.com/bitshares/bitshares-fc/blob/8ebd99b786623bc8d55e89d42df82644a71a6885/src/io/json.cpp#L415

This C++ JSON library does not explicitly limit depth (although it notes that the RFC allows limiting the nesting depth), but it does provide a callback through which the depth could be checked and parsing presumably aborted:
https://github.com/nlohmann/json/blob/ee4028b8e4cf6a85002a296a461534e2380f0d57/include/nlohmann/json.hpp#L1078

This Rust implementation does limit the nesting depth but has a flag to disable such checking:
https://github.com/serde-rs/json/blob/5b5f95831d9e0d769367b30b76a686339bffd209/src/de.rs#L202

IBM's DataPower Gateways documentation describes nesting limits in its implementation:
https://www.ibm.com/support/knowledgecenter/en/SS9H2Y_7.6.0/com.ibm.dp.doc/json_parserlimits.html

@bradfitz bradfitz added this to the Go1.14 milestone May 1, 2019

@bradfitz bradfitz added the NeedsFix label May 1, 2019

@bradfitz

Member

commented May 1, 2019

Sounds like a reasonable json.Decoder option. We should also set the default to something reasonable.
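
For context, the package already exposes per-Decoder knobs as methods set before Decode is called (e.g. DisallowUnknownFields, UseNumber); a depth limit could presumably follow the same pattern. A quick, runnable illustration of that existing pattern, not of any proposed API:

package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

func main() {
	// Existing json.Decoder options are methods configured before Decode;
	// a depth limit would presumably be exposed the same way.
	dec := json.NewDecoder(strings.NewReader(`{"name":"x","extra":1}`))
	dec.DisallowUnknownFields() // existing option: reject fields not in the target struct
	var dst struct{ Name string }
	fmt.Println(dec.Decode(&dst)) // json: unknown field "extra"
}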

@mvdan

Member

commented May 3, 2019

I agree that this is a good idea; encoding/gob has similar checks to avoid runtime panics.

@jamdagni86

Contributor

commented May 10, 2019

Is anyone working on this? If not, I'd like to take it up.

@mvdan

Member

commented May 10, 2019

I believe @joshblakeley intends to work on it soon. There's no rush, as we're in the middle of a freeze right now.

@gopherbot

commented Aug 18, 2019

Change https://golang.org/cl/190658 mentions this issue: encoding/json: added recursion depth limit

@dsnet

Member

commented Aug 19, 2019

I have at least two concerns with this:

  • This issue is part of a more general problem where untrusted input can cause resource exhaustion, and a stack overflow is just one subset of all possible resource exhaustion problems (others include heap memory usage or causing the CPU to spin). If any API change is made to address this, I feel there should be a consistent approach, since the nature of the problem is not specific to encoding/json. See #20169 and #33036.
  • A decoder option does not interact well with the existing behavior of encoding/json, where it calls UnmarshalJSON if a value implements the Unmarshaler interface, because that interface provides no mechanism for plumbing top-level options down to custom unmarshalers (see the sketch below).
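
To make the second point concrete, here is a sketch; the Tree type is invented purely for illustration:

package main

import "encoding/json"

// Tree is an invented type used only to illustrate the plumbing problem.
type Tree struct {
	Children []Tree `json:"children"`
}

// UnmarshalJSON satisfies json.Unmarshaler, whose signature is simply
// UnmarshalJSON([]byte) error. The nested call below is a plain
// json.Unmarshal, so a depth limit configured on an outer json.Decoder
// has no way to reach it.
func (t *Tree) UnmarshalJSON(b []byte) error {
	type plain Tree // defined type without the method, to avoid infinite recursion
	return json.Unmarshal(b, (*plain)(t))
}

func main() {
	var t Tree
	_ = json.Unmarshal([]byte(`{"children":[{"children":[]}]}`), &t)
}
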
@ekalinin

Contributor

commented Aug 21, 2019

@dsnet

Thanks for the comments!

This issue is part of a more general problem where untrusted input can cause resource exhaustion, and a stack overflow is just one subset of all possible resource exhaustion problems (others include heap memory usage or causing the CPU to spin). If any API change is made to address this, I feel there should be a consistent approach, since the nature of the problem is not specific to encoding/json. See #20169 and #33036.

I like your idea with ioquota and the reader wrapper, but it's still not clear how we could use it if we want to restrict more than just memory size.
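
For context, here is a minimal sketch of the reader-wrapper idea using io.LimitReader (not the actual API proposed in #33036); it caps how many bytes are read, which bounds memory but, as noted, not nesting depth:

package main

import (
	"encoding/json"
	"fmt"
	"io"
	"strings"
)

func main() {
	const maxBytes = 1 << 20 // 1 MiB budget for the whole input stream
	body := strings.NewReader(`{"ok":true}`)

	// io.LimitReader stands in for the proposed quota-style wrapper; it is the
	// simplest way to cap how much the decoder can read.
	var v interface{}
	err := json.NewDecoder(io.LimitReader(body, maxBytes)).Decode(&v)
	fmt.Println(v, err)

	// Note: a 1 MiB byte budget still admits roughly 500,000 levels of "[...]"
	// nesting, so a byte limit alone does not address the stack overflow above.
}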

Besides that, I see that #33036 still has the Proposal-Hold and NeedsDecision labels, so there is no approved way of solving this kind of issue yet, right?

A decoder option does not interact well with the existing behavior of encoding/json, where it calls UnmarshalJSON if a value implements the Unmarshaler interface, because that interface provides no mechanism for plumbing top-level options down to custom unmarshalers.

Are you suggesting that we simply not change the current behavior of the json.Unmarshal API?
