
Suboptimal performance of Jetty in Spring Web Reactive compared to Tomcat or Undertow [SPR-14945] #19512

Closed
spring-projects-issues opened this issue Nov 24, 2016 · 3 comments
Labels: in: web (Issues in web modules: web, webmvc, webflux, websocket)

Comments


spring-projects-issues commented Nov 24, 2016

Daniel Fernández opened SPR-14945 and commented

Scenario

This is the scenario:

  • A web application outputting a very large sequence of items produced by a Flux<Item>, serialized as JSON (a minimal sketch of such an endpoint is shown below).

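The benchmark code itself lives in the linked repository; as a hedged illustration only, an endpoint of this kind (class, field and path names here are made up, not taken from that repository) might look like:

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class ItemController {

    // Simple payload type; Jackson serializes it through its public getters.
    public static class Item {
        private final long id;
        private final String name;
        public Item(long id, String name) { this.id = id; this.name = name; }
        public long getId() { return id; }
        public String getName() { return name; }
    }

    // GET /items/{count} returns a Flux of `count` items, which WebFlux
    // renders as a single JSON array written progressively to the response.
    @GetMapping("/items/{count}")
    public Flux<Item> items(@PathVariable int count) {
        return Flux.range(1, count)
                   .map(i -> new Item(i, "item-" + i));
    }
}
```
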
Observed Results

When using Spring Web Reactive in a Spring Boot 2.0.0 (snapshot) application and comparing execution times between Jetty and other server options such as Tomcat or Undertow, Jetty is considerably slower than the other two, and the difference grows with the length of the sequence of items emitted by the Flux<Item> publisher returned by the @Controller method.

Compare Jetty (Enter pressed repeatedly during the curl execution to show the data transfer progressing):

$ curl http://localhost:8081/items/10000 > out.jetty
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  400k    0  400k    0     0  73855      0 --:--:--  0:00:05 --:--:-- 76756
100  752k    0  752k    0     0  74798      0 --:--:--  0:00:10 --:--:-- 75892
100 1136k    0 1136k    0     0  76268      0 --:--:--  0:00:15 --:--:-- 79320
100 1520k    0 1520k    0     0  76949      0 --:--:--  0:00:20 --:--:-- 79032
100 1905k    0 1905k    0     0  77142      0 --:--:--  0:00:25 --:--:-- 77921
100 2289k    0 2289k    0     0  77345      0 --:--:--  0:00:30 --:--:-- 78372
100 2421k    0 2421k    0     0  77823      0 --:--:--  0:00:31 --:--:-- 82026

With Tomcat:

$ curl http://localhost:8084/items/10000 > out.tomcat
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 1959k    0 1959k    0     0   334k      0 --:--:--  0:00:05 --:--:--  340k
100 2421k    0 2421k    0     0   336k      0 --:--:--  0:00:07 --:--:--  340k

Or with Undertow:

$ curl http://localhost:8085/items/10000 > out.undertow
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 1353k    0 1353k    0     0   232k      0 --:--:--  0:00:05 --:--:--  249k
100 2421k    0 2421k    0     0   245k      0 --:--:--  0:00:09 --:--:--  265k

Also note that this happens both when returning JSON arrays and when returning SSE (Server-Sent Events); a sketch of an SSE variant follows below.
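
For reference, and again only as a hedged sketch with made-up names (reusing the Item type from the sketch above), an SSE variant of the same endpoint would differ only in the produced media type:

```java
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class ItemSseController {

    // Same Flux<Item> payload, but produced as text/event-stream so each
    // element is written as a separate Server-Sent Event rather than as
    // one element of a JSON array.
    @GetMapping(value = "/items-sse/{count}", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<ItemController.Item> itemsSse(@PathVariable int count) {
        return Flux.range(1, count)
                   .map(i -> new ItemController.Item(i, "item-" + i));
    }
}
```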

Example applications

Example applications: https://github.com/danielfernandez/test-spring-boot-reactive-netty-output

The above applications replicate the scenario using Spring Boot 2.0.0 apps with Jetty, Netty, RxNetty, Tomcat and Undertow. Note that these applications also exercise other issues (reported in separate tickets).

Please have a look at the detailed test explanation in the linked repository's README.


Affects: 5.0 M3

Reference URL: https://github.com/danielfernandez/test-spring-boot-reactive-netty-output



Sébastien Deleuze commented

With stacktrace mode disabled, I got the following results, with Jetty slightly more performant than Tomcat:

curl http://localhost:8084/items/1000000 > out.tomcat
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  236M    0  236M    0     0  7047k      0 --:--:--  0:00:34 --:--:-- 7076k

curl http://localhost:8081/items/1000000 > out.jetty 
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  236M    0  236M    0     0  9829k      0 --:--:--  0:00:24 --:--:-- 9826k

Do you obtain the same results?


Daniel Fernández commented

Yes, my results are similar now :)

$ curl http://localhost:8084/items/1000000 > out.tomcat
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  236M    0  236M    0     0  6315k      0 --:--:--  0:00:38 --:--:-- 6323k

$ curl http://localhost:8081/items/1000000 > out.jetty
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  236M    0  236M    0     0  8516k      0 --:--:--  0:00:28 --:--:-- 8733k


Sébastien Deleuze commented

OK, then I'm resolving this issue as "cannot reproduce". In any case, thanks for this awesome and very complete benchmark project!
