
Fix connection leak in rest proxy when return type is void or Mono<Void> #30072

Merged
merged 2 commits into Azure:main from close-response-on-void on Jul 21, 2022

Conversation

kasobol-msft (Contributor)

Caught in perf pipelines.

Rest Proxy would leak connections if an interface method returned void or Mono<Void>, like this:

@Put("BinaryData/{id}")
Mono<Void> setBinaryData(@HostParam("$host") String endpoint,
    @PathParam("id") String id,
    @BodyParam("application/octet-stream") BinaryData body,
    @HeaderParam("Content-Length") long length);

The symptom was the OkHttp client grossly underperforming in the core perf pipeline. An attempt to reproduce revealed OkHttp complaining about undisposed responses.
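The leak mechanism can be sketched without Reactor or OkHttp: the pipeline drops the decoded response without anyone closing the underlying connection, and attaching a close hook before the value is discarded releases it. A minimal stdlib sketch; `FakeResponse`, `then`, and `doOnNextThen` are hypothetical stand-ins, not azure-core APIs:

```java
import java.util.function.Consumer;

public class CloseOnDropDemo {
    // Stand-in for a decoded HTTP response: wraps a "connection" that must be closed.
    static final class FakeResponse implements AutoCloseable {
        boolean closed;
        @Override
        public void close() { closed = true; }
    }

    // Mimics .then(): the emitted value is simply discarded.
    static void then(FakeResponse response) {
        // nothing touches the response, so the connection is never released
    }

    // Mimics .doOnNext(...::close).then(): run the close hook before discarding.
    static void doOnNextThen(FakeResponse response, Consumer<FakeResponse> onNext) {
        onNext.accept(response);
        then(response);
    }

    public static void main(String[] args) {
        FakeResponse leaked = new FakeResponse();
        then(leaked);
        System.out.println("old path closed: " + leaked.closed);     // false: leaked

        FakeResponse released = new FakeResponse();
        doOnNextThen(released, FakeResponse::close);
        System.out.println("fixed path closed: " + released.closed); // true
    }
}
```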

@azure-sdk (Collaborator)

API change check

API changes are not detected in this pull request.

- return service.setBinaryData(endpoint, id, binaryDataSupplier.get(), length)
-     .then();
+ return Mono.fromSupplier(binaryDataSupplier)
+     .flatMap(data -> service.setBinaryData(endpoint, id, data, length));
Member

Just curious, why is the new code better?

kasobol-msft (Contributor, Author)

The binaryDataSupplier is supposed to supply fresh BinaryData for each runAsync.
The old code was effectively doing that, but only because the caller happens to use flatMap, which is luck rather than a proper reactive stream.
Wrapping the supplier in a Mono makes it properly reactive, i.e. the subscriber triggers binary data creation regardless of what the caller is doing.
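The eager-versus-deferred difference can be shown without Reactor: `Mono.fromSupplier` hands the pipeline a factory, so the supplier runs once per subscription rather than once while the call is assembled. A minimal sketch using plain `java.util.function.Supplier` in place of Reactor types; `eagerBuild` and `deferredBuild` are made-up names for illustration:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

public class DeferredDemo {
    // Mimics the old shape: the supplier runs eagerly, once, while the
    // pipeline is being assembled.
    static String eagerBuild(Supplier<String> binaryDataSupplier) {
        return binaryDataSupplier.get();
    }

    // Mimics Mono.fromSupplier: returns a factory, so the supplier runs
    // only when someone actually "subscribes" (calls get()).
    static Supplier<String> deferredBuild(Supplier<String> binaryDataSupplier) {
        return binaryDataSupplier::get;
    }

    public static void main(String[] args) {
        AtomicInteger calls = new AtomicInteger();
        Supplier<String> supplier = () -> "payload-" + calls.incrementAndGet();

        Supplier<String> deferred = deferredBuild(supplier);
        System.out.println("calls after assembling: " + calls.get());       // 0

        deferred.get(); // each "subscription" produces fresh data
        deferred.get();
        System.out.println("calls after two subscriptions: " + calls.get()); // 2
    }
}
```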

@@ -206,7 +206,7 @@ private Object handleRestReturnType(final Mono<HttpResponseDecoder.HttpDecodedRe
     final Type monoTypeParam = TypeUtil.getTypeArgument(returnType);
     if (TypeUtil.isTypeOrSubTypeOf(monoTypeParam, Void.class)) {
         // ProxyMethod ReturnType: Mono<Void>
-        result = asyncExpectedResponse.then();
+        result = asyncExpectedResponse.doOnNext(HttpResponseDecoder.HttpDecodedResponse::close).then();
Member

Should this be doOnNext or doOnSuccess? Does Mono<Void> emit a next or does it only emit completion?

This applies to the void scenario below as well.

kasobol-msft (Contributor, Author)

asyncExpectedResponse is a Mono<HttpResponseDecoder.HttpDecodedResponse> when we add that operator.
doOnSuccess would be much the same here, but we'd need a null check in the consumer.
See:
https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Mono.html#doOnNext-java.util.function.Consumer-
https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Mono.html#doOnSuccess-java.util.function.Consumer-
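For reference, `doOnNext` runs its consumer only when a value is actually emitted, while `doOnSuccess` also runs on empty completion, passing `null`, which is why a null check would be needed there. A rough stdlib analogy using `Optional` in place of `Mono`; `CallbackDemo` and its methods are made-up illustrations, not Reactor APIs:

```java
import java.util.Optional;
import java.util.function.Consumer;

public class CallbackDemo {
    // doOnNext analogue: consumer fires only if a value is present.
    static <T> void doOnNext(Optional<T> mono, Consumer<T> consumer) {
        mono.ifPresent(consumer);
    }

    // doOnSuccess analogue: consumer fires on completion either way,
    // receiving null for an empty source, so it must null-check.
    static <T> void doOnSuccess(Optional<T> mono, Consumer<T> consumer) {
        consumer.accept(mono.orElse(null));
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        doOnNext(Optional.empty(), v -> log.append("next:").append(v));
        doOnSuccess(Optional.empty(), v -> log.append("success:").append(v));
        System.out.println(log); // only the doOnSuccess consumer ran, with null
    }
}
```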

@kasobol-msft kasobol-msft merged commit 92b7424 into Azure:main Jul 21, 2022
@kasobol-msft kasobol-msft deleted the close-response-on-void branch July 21, 2022 13:33
@kasobol-msft (Contributor, Author)

| Test | Arguments | 1.30.0 | source | %Change |
|---|---|---|---|---|
| jsonsend | --size 10 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 10809.14 | 13017.89 | 20.43% |
| jsonsend | --size 1000 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 13067.39 | 12907.40 | -1.22% |
| jsonreceive | --size 10 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 25888.90 | 25900.12 | 0.04% |
| jsonreceive | --size 1000 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 24803.62 | 24613.67 | -0.77% |
| xmlsend | --size 10 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 12979.08 | 13008.76 | 0.23% |
| xmlsend | --size 1000 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 12987.90 | 12897.35 | -0.70% |
| xmlreceive | --size 10 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 25610.82 | 25652.50 | 0.16% |
| xmlreceive | --size 1000 --parallel 64 --backend-type blobs --warmup 15 --duration 15 | 22679.80 | 22560.85 | -0.52% |
| binarydatasend | --size 10240 --binary-data-source bytes --parallel 16 --backend-type blobs --warmup 15 --duration 15 | 3511.92 | 3483.51 | -0.81% |
| binarydatasend | --size 10240 --binary-data-source file --parallel 16 --backend-type blobs --warmup 15 --duration 15 | 3475.10 | 3467.56 | -0.22% |
| binarydatasend | --size 10240 --binary-data-source flux --parallel 16 --backend-type blobs --warmup 15 --duration 15 | 3535.32 | 3491.02 | -1.25% |
| binarydatasend | --size 10240 --binary-data-source stream --parallel 16 --backend-type blobs --warmup 15 --duration 15 | 3459.40 | 3457.61 | -0.05% |
| binarydatasend | --size 10485760 --binary-data-source bytes --parallel 8 --backend-type blobs --warmup 15 --duration 15 | 81.08 | 80.11 | -1.20% |
| binarydatasend | --size 10485760 --binary-data-source file --parallel 8 --backend-type blobs --warmup 15 --duration 15 | 105.30 | 104.76 | -0.51% |
| binarydatasend | --size 10485760 --binary-data-source flux --parallel 8 --backend-type blobs --warmup 15 --duration 15 | 85.41 | 84.63 | -0.91% |
| binarydatasend | --size 10485760 --binary-data-source stream --parallel 8 --backend-type blobs --warmup 15 --duration 15 | 107.54 | 107.99 | 0.42% |
| binarydatasend | --size 10240 --binary-data-source bytes --parallel 16 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 323.99 | 3497.93 | 979.63% |
| binarydatasend | --size 10240 --binary-data-source file --parallel 16 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 317.47 | 3476.57 | 995.09% |
| binarydatasend | --size 10240 --binary-data-source flux --parallel 16 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 317.97 | 3457.69 | 987.43% |
| binarydatasend | --size 10240 --binary-data-source stream --parallel 16 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 320.68 | 3484.87 | 986.70% |
| binarydatasend | --size 10485760 --binary-data-source bytes --parallel 8 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 37.11 | 65.96 | 77.74% |
| binarydatasend | --size 10485760 --binary-data-source file --parallel 8 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 40.04 | 64.60 | 61.34% |
| binarydatasend | --size 10485760 --binary-data-source flux --parallel 8 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 39.07 | 65.14 | 66.75% |
| binarydatasend | --size 10485760 --binary-data-source stream --parallel 8 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 39.14 | 69.91 | 78.64% |
| binarydatareceive | --size 10240 --parallel 16 --backend-type blobs --warmup 15 --duration 15 | 6637.52 | 6596.31 | -0.62% |
| binarydatareceive | --size 10485760 --parallel 8 --backend-type blobs --warmup 15 --duration 15 | 105.87 | 107.03 | 1.10% |
| binarydatareceive | --size 10240 --parallel 16 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 6462.88 | 6479.88 | 0.26% |
| binarydatareceive | --size 10485760 --parallel 8 --backend-type blobs --http-client okhttp --warmup 15 --duration 15 | 39.48 | 38.94 | -1.35% |

Labels: Azure.Core, azure-core
4 participants