
jetty reactive client bottleneck for spring reactive flatMap operator #228

@patpatpat123

Description


What I am trying to achieve:

Send as many HTTP requests as possible, in parallel, to a very reliable third-party service, fed from an aggressive Flux.

Background:

The third-party service is very reliable and can sustain a very high request rate. So far, I am its only client, and I have been invited to hit it as hard as possible. The service, which I have no control over, does not offer any bulk/list API; I can only send requests one by one. On their side, each request takes a constant one second to process.

This third-party API has keep-alive enabled, but it does not support gRPC, HTTP/2, or WebSocket. There is no rate limit at any point of the end-to-end flow.

What I tried:

Here is the configuration of my HTTP client, as well as the logic that sends the requests (hopefully as many as possible, as fast as possible).

the client:

    public WebClient webClient(final WebClient.Builder builder) {
        final var clientConnector = new ClientConnector();
        final var httpClient = new org.eclipse.jetty.client.HttpClient(new HttpClientTransportDynamic(clientConnector));
        // raise the per-destination limits well above the defaults so the client does not cap parallelism
        httpClient.setMaxRequestsQueuedPerDestination(9999);
        httpClient.setMaxConnectionsPerDestination(9999);
        return builder.baseUrl(hostAndPort)
                .defaultHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_JSON_VALUE)
                .clientConnector(new JettyClientHttpConnector(httpClient))
                .build();
    }

the flux:

    // some very fast flux; this line is just an example
    Flux<String> inputFlux = Flux.interval(Duration.ofMillis(1))
            .map(i -> "transform to request payload number " + i);
    // send as many requests as fast as possible; note the concurrency of 4096
    Flux<String> resultFlux = inputFlux.flatMap(
            oneInput -> webClient.post().bodyValue(oneInput).retrieve().bodyToMono(String.class),
            4096);
    // do something with the result
    return resultFlux.map(oneResult -> doSomething(oneResult));

Using this, I asked the third-party service what they observe, and they gave me N, my number of requests per second.

First observation: the flatMap concurrency here is 4096. Since the third party takes one second to process each request, 4096 requests in flight should translate into a steady-state rate N of roughly 4096 / 1 s ≈ 4096 requests per second.

However, I am nowhere close: the third-party service tells me I am at roughly 16 requests per second.
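
For what it is worth, here is a minimal sketch of how that figure can be cross-checked on the client side (the counter, the variable names, and the log line are mine, not from any library): counting completed exchanges per second on the resultFlux from above should roughly match the third party's number at steady state.

    // client-side cross-check: count completed exchanges per second
    AtomicLong completed = new AtomicLong();                // java.util.concurrent.atomic.AtomicLong
    Flux<String> measuredFlux = resultFlux.doOnNext(oneResult -> completed.incrementAndGet());
    // assuming measuredFlux is what gets subscribed downstream, print the rate once per second
    Flux.interval(Duration.ofSeconds(1))
            .map(tick -> completed.getAndSet(0))
            .subscribe(count -> System.out.println("responses in the last second: " + count));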

Issue:

I believe the underlying limitation comes from the Jetty HTTP client.
When I swap the WebClient call (which uses Jetty) for a dummy operation, the same pipeline shows much higher throughput.
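
To illustrate what I mean by a dummy operation (this is the shape of the test, not necessarily the exact code I ran), the HTTP call is replaced by a non-blocking one-second delay while the rest of the pipeline stays identical:

    // same flatMap, same concurrency, but the WebClient call replaced by a pure one-second delay
    Flux<String> dummyResultFlux = inputFlux.flatMap(
            oneInput -> Mono.delay(Duration.ofSeconds(1)).thenReturn("dummy response for " + oneInput),
            4096);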

I believe the scheduling policy of the jetty-reactive-httpclient library is limiting the throughput.
What parameters (number of concurrent connections, I/O threads, keep-alive, and so on) should I use in order to "unleash" the Jetty reactive HTTP client?
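
To make the question concrete, this is a sketch of the client-side knobs I am aware of. The setters (setSelectors, setExecutor, setIdleTimeout) are existing Jetty ClientConnector / HttpClient methods, but the values are placeholders, and I do not know which of them, if any, are the right levers:

    final var clientConnector = new ClientConnector();
    clientConnector.setSelectors(4);                                  // NIO selector threads
    final var httpClient = new org.eclipse.jetty.client.HttpClient(new HttpClientTransportDynamic(clientConnector));
    httpClient.setExecutor(new QueuedThreadPool(256));                // org.eclipse.jetty.util.thread.QueuedThreadPool
    httpClient.setMaxConnectionsPerDestination(9999);
    httpClient.setMaxRequestsQueuedPerDestination(9999);
    httpClient.setIdleTimeout(60_000);                                // keep pooled connections alive between requests (ms)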
