
Conversation

@jeesou (Contributor) commented Dec 3, 2025

Impact:

With older versions of jackson-core, parsing an input document that contains deeply nested data could cause Jackson to throw a StackOverflowError if the nesting depth is particularly large.
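
As a hypothetical illustration (the class name, depth, and input are invented for this sketch), the failure mode can be reproduced with a plain jackson-databind ObjectMapper parsing a document whose nesting depth exceeds what the JVM stack can handle:

import com.fasterxml.jackson.databind.ObjectMapper;

public class DeepNestingRepro {
    public static void main(String[] args) throws Exception {
        // Build a JSON document that is nothing but very deep array nesting.
        int depth = 100_000;
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < depth; i++) sb.append('[');
        for (int i = 0; i < depth; i++) sb.append(']');

        // With jackson-core/databind older than 2.15.0 this recurses once per
        // nesting level and may end in a StackOverflowError; with 2.15.0+ it
        // fails fast with a StreamConstraintsException instead.
        new ObjectMapper().readTree(sb.toString());
    }
}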

Patches:

jackson-core 2.15.0 adds a configurable limit on how deeply Jackson will traverse an input document, defaulting to a maximum depth of 1000; the change is in FasterXML/jackson-core#943. jackson-core throws a StreamConstraintsException if the limit is exceeded.
jackson-databind also benefits from this change because it uses jackson-core to parse JSON input.
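
As a sketch of how the new limit is used (assuming jackson-core 2.15.0+; the class name and the chosen depth of 500 are only illustrative), the constraint is configured through StreamReadConstraints on the JsonFactory that backs the ObjectMapper:

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;
import com.fasterxml.jackson.core.exc.StreamConstraintsException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class NestingLimitExample {
    public static void main(String[] args) throws Exception {
        // Tighten the nesting-depth limit from the 2.15 default of 1000 to 500.
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxNestingDepth(500)
                        .build())
                .build();
        ObjectMapper mapper = new ObjectMapper(factory);

        // 600 levels of nesting exceeds the configured limit of 500.
        String tooDeep = "[".repeat(600) + "]".repeat(600);
        try {
            mapper.readTree(tooDeep);
        } catch (StreamConstraintsException e) {
            // Thrown once the configured depth is exceeded, instead of a StackOverflowError.
            System.err.println("Rejected overly nested input: " + e.getMessage());
        }
    }
}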

force "com.fasterxml.jackson.module:jackson-module-scala_${scalaVersion}:${libs.versions.jackson214.get()}"
force "com.fasterxml.jackson.core:jackson-databind:${libs.versions.jackson214.get()}"
force "com.fasterxml.jackson.core:jackson-core:${libs.versions.jackson214.get()}"
force "com.fasterxml.jackson.module:jackson-module-scala_${scalaVersion}:${libs.versions.jackson215.get()}"
Contributor:

Spark 3.4 depends on Jackson 2.14, as can be seen in https://github.com/apache/spark/blob/branch-3.4/pom.xml#L187-L188. Switching to Jackson 2.15 can have side effects and cause issues. What about addressing the CVE in Spark 3.4 first?

Member:

Spark no longer maintains 3.4.x, and we've already decided to deprecate Spark 3.4 support. We might remove it in the next release, so I think we can just leave it as is.

