Investigate possibilities for improving start-up performance #3466
@dev-chirag a good option is switching to Scala and using jsoniter-scala. It does reflection at compile time only and is more efficient at runtime. It can also be compiled to a GraalVM native image to eliminate the impact of the JIT compiler entirely.
@plokhotnyuk Switching to Scala is typically not a good idea for performance issues. :-p
@dev-chirag You can certainly "warm up" things by eagerly calling serialize/deserialize on types that are likely used, as long as you reuse the same ObjectMapper instance for the real traffic, since the (de)serializer caches live on the mapper.
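A minimal sketch of that idea, assuming a single shared `ObjectMapper`; `JacksonWarmup`, `MyRequest` and the sample JSON are illustrative stand-ins, not from this thread:

```java
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative sketch; MyRequest stands in for whatever payload types the app really uses.
public class JacksonWarmup {
    public static class MyRequest { public String id; }

    public static void warmUp(ObjectMapper mapper) throws Exception {
        // One read builds and caches the deserializer for MyRequest on this mapper instance.
        MyRequest sample = mapper.readValue("{\"id\":\"warmup\"}", MyRequest.class);
        // One write resolves and caches the serializer side as well.
        mapper.writeValueAsString(sample);
    }
}
```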
One note on class loading: class loading itself would be done by the JVM; Jackson does not really do any of that (with the exception of 3 modules: Afterburner and Blackbird generate optimized handlers, and Mr Bean can materialize interface/abstract class implementations).
We recently ran into this, or a very similar issue, with our Spring Boot app. After the service starts up and a load balancer marks it healthy, it starts receiving quite a bit of traffic from the clients. In my local testing this seems to cause issues in a CPU-constrained environment (e.g. 10 parallel nodes, 1 vCPU each) when parallel threads start making API calls to services using the same POJO types, causing severe lock contention. I tested creating a simple warmup sequence for Spring beans that use HTTP services and complex classes as bodies, e.g.:
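The snippet itself did not survive in this copy of the thread; a rough reconstruction of what such a warmup could look like, assuming Spring Boot with the auto-configured `ObjectMapper` and illustrative body types `OrderRequest`/`OrderResponse`:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.stereotype.Component;

// Reconstruction, not the original snippet: OrderRequest/OrderResponse stand in
// for the real request/response bodies used by the HTTP client beans.
@Component
public class JacksonWarmupRunner implements ApplicationRunner {

    public static class OrderRequest { public String id; }
    public static class OrderResponse { public String status; }

    private final ObjectMapper mapper;

    public JacksonWarmupRunner(ObjectMapper mapper) {
        this.mapper = mapper;
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        // Round-trip each complex body type once so its (de)serializers are built and
        // cached before the readiness probe flips and the load balancer sends traffic.
        for (Class<?> type : new Class<?>[] { OrderRequest.class, OrderResponse.class }) {
            Object value = mapper.readValue("{}", type);
            mapper.writeValueAsString(value);
        }
    }
}
```

Since `ApplicationRunner` beans run before `ApplicationReadyEvent` is published, a readiness probe wired to Spring Boot's availability state should only mark the instance ready after this warm-up has completed.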
This seems to basically remove the issue, as the deserializer caches are populated before accepting any traffic. Question is, which parts are actually necessary? Do I need to …? Also, is …?
I'd have to double-check this, but I think that for deserialization, cache pre-warming should work well for transitive dependencies, there not being much difference between root-level values and branch/leaf level. For serialization there is a bit more difference due to the more dynamic nature of handling. Still, there's probably most value in doing …
On doing …, I hope this helps.
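To make that read-side vs. write-side distinction concrete, a sketch under the assumption that both sides are warmed with a populated value; `Root` and `Item` are illustrative:

```java
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative types: Root has a nested Item, standing in for "transitive dependencies".
public class WarmBothSides {
    public static class Item { public String name; }
    public static class Root { public Item item; }

    public static void warm(ObjectMapper mapper) throws Exception {
        // Read side: resolving the Root deserializer also resolves Item's,
        // so one call on the root type covers the branch/leaf types too.
        Root root = mapper.readValue("{\"item\":{\"name\":\"x\"}}", Root.class);

        // Write side: serializer lookup is driven more by runtime types, so serialize
        // a populated instance rather than relying on the declared types alone.
        mapper.writeValueAsString(root);
    }
}
```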
Definitely helps, thanks. Will the deserialization cache be populated with the types of the members of the root type if the value to deserialize is e.g. …? Should both be cached via …?
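The example value did not survive the copy; presumably it was something along these lines (illustrative names), the question being whether a single `mapper.readValue(json, Envelope.class)` caches deserializers for both `Header` and `Payload`:

```java
// Illustrative reconstruction of the kind of value being asked about: does reading
// Envelope also populate the deserializer cache for Header and Payload?
public class Envelope {
    public static class Header  { public String id; }
    public static class Payload { public String data; }

    public Header header;
    public Payload payload;
}
```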
If you use a debugger, you can check it out yourself via …
@ptorkko For deserializers, yes, value deserializers are fetched eagerly and not on-demand. For serializers, fetching is dynamic unless the type is final.
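A small illustration of that eager-vs-dynamic point; the names are made up, and the final-type condition reflects my reading of Jackson's static vs. dynamic serializer resolution:

```java
// Illustrative sketch of the eager-vs-dynamic distinction above.
public class SerializerLookupExample {
    // final declared type: the property's serializer can be resolved statically, up front.
    public static final class Money { public long cents; }

    public static class Order {
        public Money total;      // static: Money is final, so the serializer is known ahead of time
        public Object metadata;  // dynamic: the serializer is looked up from the runtime class at write time
    }
}
```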
I have a microservice that contains a Jersey REST service that accepts a JSON body. I use Jersey's Jackson support to deserialize the request into POJOs. This is a complex JSON payload (sometimes up to 250 KB).
This microservice runs behind an Envoy load balancer on a Kubernetes cluster, so I can easily scale the number of instances.
Now the problem is, when I scale my application and Envoy starts distributing load (round-robin), the new instance has a really slow start-up. The new instance receives a burst of requests before the JVM and the service have warmed up, and during this time I see all requests on this instance fail with a 503.
Since this new instance experiences the burst without Jackson and Jersey warm-up, all threads are busy class loading. I can see in my CPU profiling that almost 100% of CPU is consumed by the C1 and C2 compiler threads, and the loaded-class count shoots up.
My assumption here is that all the requests are executed in parallel and all threads handling them are waiting for Jackson to load the POJO classes. Once the class loading is completed, the classes are cached, freeing up CPU and allowing threads to process the requests instead.
I understand from #1970 that this could be the problem. Is my assumption correct? Is there anything I can do to load these classes before the first request?
Any leads would be much appreciated. Thanks
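One way to picture "loading these classes before the first request" is a hedged sketch like the following (not from the thread): do a Jackson round-trip at servlet-context start-up and only report readiness afterwards. `WarmupListener`, `MyPayload` and the sample JSON are illustrative; the (de)serializer caches only benefit if the same `ObjectMapper` instance Jersey uses is warmed, though loading the POJO classes helps regardless of which mapper does it.

```java
import java.util.concurrent.atomic.AtomicBoolean;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical sketch: warm up Jackson before the instance reports itself ready,
// so Envoy/Kubernetes only routes traffic once the POJO classes are loaded.
@WebListener
public class WarmupListener implements ServletContextListener {

    public static final AtomicBoolean READY = new AtomicBoolean(false);

    public static class MyPayload { public String id; }  // stand-in for the real complex POJO

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        try {
            ObjectMapper mapper = new ObjectMapper();
            Object value = mapper.readValue("{\"id\":\"warmup\"}", MyPayload.class);
            mapper.writeValueAsString(value);
            READY.set(true);  // a readiness endpoint can now stop returning 503
        } catch (Exception e) {
            throw new IllegalStateException("Jackson warm-up failed", e);
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // nothing to clean up
    }
}
```

Envoy's health check (or the Kubernetes readiness probe) would then point at an endpoint that only returns 200 once `READY` is set, so the burst arrives after the Jackson-related class loading has happened; JIT warm-up of the rest of the request path is a separate concern.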