Sub-quadratic decrease of throughput as the length of the JSON object increases
On contemporary CPUs, parsing such a JSON object with an additional numeric field of 1000000 decimal digits (~1Mb) can take ~15 seconds with the ScalikeJackson parser:
[info] REMEMBER: The numbers below are just data. To gain reusable insights, you need to follow up on
[info] why the numbers are the way they are. Use profilers (see -prof, -lprof), design factorial
[info] experiments, perform baseline and negative tests that provide experimental control, make sure
[info] the benchmarking environment is safe on JVM/OS/HW level, ask for reviews from the domain experts.
[info] Do not assume the numbers tell you what you want them to tell.
[info] Benchmark                             (size)   Mode  Cnt        Score   Error  Units
[info] ExtractFieldsReading.scalikeJackson        1  thrpt    2  1432398.714          ops/s
[info] ExtractFieldsReading.scalikeJackson       10  thrpt    2  1377622.290          ops/s
[info] ExtractFieldsReading.scalikeJackson      100  thrpt    2   684892.763          ops/s
[info] ExtractFieldsReading.scalikeJackson     1000  thrpt    2    50046.000          ops/s
[info] ExtractFieldsReading.scalikeJackson    10000  thrpt    2      660.185          ops/s
[info] ExtractFieldsReading.scalikeJackson   100000  thrpt    2        6.947          ops/s
[info] ExtractFieldsReading.scalikeJackson  1000000  thrpt    2        0.067          ops/s
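The slowdown appears to come from eagerly converting the long numeric literal into a BigDecimal, whose string constructor scales roughly quadratically with the digit count on the affected JDKs. Below is a minimal, standalone sketch (not the benchmark's actual code; the BigNumberSlowdown object and the payload shape are illustrative) that reproduces the shape of the slowdown using only the standard library:

```scala
import java.math.BigDecimal

// Illustrative sketch: shows that a tiny, cheap-to-produce payload with an n-digit
// numeric literal forces a conversion cost that grows roughly quadratically in n.
object BigNumberSlowdown {
  def main(args: Array[String]): Unit = {
    for (n <- Seq(1000, 10000, 100000, 1000000)) {
      val digits = "9" * n                // an n-digit numeric literal, cheap to produce
      val json   = s"""{"x":$digits}"""   // ~n bytes of attacker-controlled input
      val start  = System.nanoTime()
      new BigDecimal(digits)              // the eager conversion a parser would perform
      val ms     = (System.nanoTime() - start) / 1e6
      println(f"$n%8d digits (${json.length}%8d bytes) -> $ms%10.1f ms")
    }
  }
}
```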
BTW, Play-JSON doesn't have this kind of vulnerability. It throws the following error for numbers that are too big:
[info] java.lang.IllegalArgumentException: Number is larger than supported for field "x"
[info] at play.api.libs.json.jackson.JsValueDeserializer.parseBigDecimal(JacksonJson.scala:142)
[info] at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:164)
[info] at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:126)
[info] at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:121)
[info] at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3984)
[info] at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2276)
[info] at play.api.libs.json.jackson.JacksonJson$.parseJsValue(JacksonJson.scala:252)
[info] at play.api.libs.json.StaticBinding$.parseJsValue(StaticBinding.scala:12)
[info] at play.api.libs.json.Json$.parse(Json.scala:173)
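The guard on the Play-JSON side sits in JsValueDeserializer.parseBigDecimal (see the trace above). A minimal sketch of that kind of defensive check, assuming a simple digit-count limit (SafeNumberParsing and MaxDigits are illustrative names, not Play-JSON's actual implementation), could look like this:

```scala
// Illustrative sketch of a digit-limit guard; not Play-JSON's actual code.
object SafeNumberParsing {
  val MaxDigits = 308 // illustrative limit; a real parser would make this configurable

  def parseNumber(field: String, literal: String): java.math.BigDecimal = {
    val digits = literal.count(_.isDigit)
    if (digits > MaxDigits)
      throw new IllegalArgumentException(s"Number is larger than supported for field \"$field\"")
    // Only pay the (potentially quadratic) conversion cost for literals within the limit.
    new java.math.BigDecimal(literal)
  }
}
```

Rejecting oversized literals before the BigDecimal conversion turns the quadratic parsing cost into a constant-time check, which is why Play-JSON fails fast instead of stalling.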
Steps to reproduce
To run these benchmarks on your JDK:
1. Install the latest version of sbt and/or ensure that it is already installed properly.
2. Clone the jsoniter-scala repo.