
Fix cryptic 'The supplier returned null' message if OIDC server connection fails #26661

Merged
merged 1 commit into from Jul 12, 2022

Conversation

famod
Member

@famod famod commented Jul 11, 2022

I was getting:

java.lang.NullPointerException: The supplier returned `null`
	at io.smallrye.mutiny.operators.uni.UniOnFailureTransform$UniOnFailureTransformProcessor.onFailure(UniOnFailureTransform.java:62)
	at io.smallrye.mutiny.operators.uni.UniRetryAtMost$UniRetryAtMostProcessor.onFailure(UniRetryAtMost.java:67)
	at io.smallrye.mutiny.vertx.AsyncResultUni.lambda$subscribe$1(AsyncResultUni.java:37)
	at io.vertx.mutiny.ext.web.client.HttpRequest$4.handle(HttpRequest.java:448)
	at io.vertx.mutiny.ext.web.client.HttpRequest$4.handle(HttpRequest.java:443)
	at io.vertx.ext.web.client.impl.HttpContext.handleFailure(HttpContext.java:396)
	at io.vertx.ext.web.client.impl.HttpContext.execute(HttpContext.java:390)
	at io.vertx.ext.web.client.impl.HttpContext.next(HttpContext.java:365)
	at io.vertx.ext.web.client.impl.HttpContext.fire(HttpContext.java:332)
	at io.vertx.ext.web.client.impl.HttpContext.fail(HttpContext.java:313)
	at io.vertx.ext.web.client.impl.HttpContext.lambda$handleCreateRequest$7(HttpContext.java:536)
	at io.vertx.core.impl.future.FutureImpl$3.onFailure(FutureImpl.java:153)
	at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)
	at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)
	at io.vertx.core.impl.future.PromiseImpl.tryFail(PromiseImpl.java:23)
	at io.vertx.core.http.impl.HttpClientImpl.lambda$doRequest$8(HttpClientImpl.java:659)
	at io.vertx.core.net.impl.pool.Endpoint.lambda$getConnection$0(Endpoint.java:52)
	at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:162)
	at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:123)
	at io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:50)
	at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:274)
	at io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:22)
	at io.vertx.core.net.impl.pool.SimpleConnectionPool$ConnectFailed$1.run(SimpleConnectionPool.java:382)
	at io.vertx.core.net.impl.pool.CombinerExecutor.submit(CombinerExecutor.java:50)
	at io.vertx.core.net.impl.pool.SimpleConnectionPool.execute(SimpleConnectionPool.java:245)
	at io.vertx.core.net.impl.pool.SimpleConnectionPool.lambda$connect$2(SimpleConnectionPool.java:259)
	at io.vertx.core.http.impl.SharedClientHttpStreamEndpoint.lambda$connect$2(SharedClientHttpStreamEndpoint.java:104)
	at io.vertx.core.impl.future.FutureImpl$3.onFailure(FutureImpl.java:153)
	at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)
	at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)
	at io.vertx.core.impl.future.Composition$1.onFailure(Composition.java:66)
	at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)
	at io.vertx.core.impl.future.FailedFuture.addListener(FailedFuture.java:98)
	at io.vertx.core.impl.future.Composition.onFailure(Composition.java:55)
	at io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)
	at io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)
	at io.vertx.core.impl.future.PromiseImpl.tryFail(PromiseImpl.java:23)
	at io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:50)
	at io.vertx.core.impl.ContextImpl.emit(ContextImpl.java:274)
	at io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:22)
	at io.vertx.core.net.impl.NetClientImpl.failed(NetClientImpl.java:294)
	at io.vertx.core.net.impl.NetClientImpl.lambda$connectInternal$4(NetClientImpl.java:262)
	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:578)
	at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:552)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:491)
	at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:616)
	at io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:609)
	at io.netty.util.concurrent.DefaultPromise.setFailure(DefaultPromise.java:109)
	at io.vertx.core.net.impl.ChannelProvider.lambda$handleConnect$0(ChannelProvider.java:159)
	at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:578)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners0(DefaultPromise.java:571)
	at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:550)
	at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:491)
	at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:616)
	at io.netty.util.concurrent.DefaultPromise.setFailure0(DefaultPromise.java:609)
	at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:117)
	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe$1.run(AbstractNioChannel.java:262)
	at io.netty.util.concurrent.PromiseTask.runTask(PromiseTask.java:98)
	at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:170)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:469)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:833)

Only in the debugger was I able to see that the actual cause was:

io.netty.channel.ConnectTimeoutException: connection timed out: auth-ref.vzd.ti-dienste.de/84.17.168.202:9443
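For context on where the cryptic message comes from: Mutiny raises it when a failure-mapping function returns null. A minimal stand-alone sketch of that mechanism (no Mutiny involved; all class and method names here are illustrative, not the actual Mutiny internals):

```java
import java.util.Objects;
import java.util.function.Function;

public class NullSupplierSketch {

    // Mimics how a reactive operator validates a mapped failure; Mutiny raises
    // "The supplier returned `null`" in the analogous situation.
    static Throwable mapFailure(Function<Throwable, Throwable> mapper, Throwable failure) {
        return Objects.requireNonNull(mapper.apply(failure), "The supplier returned `null`");
    }

    static String observedMessage() {
        // A mapper that has no mapping for this failure type and returns null...
        Function<Throwable, Throwable> buggyMapper = t -> null;
        try {
            mapFailure(buggyMapper, new RuntimeException("connection timed out"));
            return "no error";
        } catch (NullPointerException npe) {
            // ...hides the original cause behind a generic NPE.
            return npe.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(observedMessage());
    }
}
```

The original ConnectTimeoutException never makes it into the NPE, which is why only a debugger revealed it.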

@famod famod requested a review from sberyozkin July 11, 2022 14:52
@famod famod changed the title Fix cryptic 'The supplier returned ' message if OIDC server connection fails Fix cryptic 'The supplier returned null' message if OIDC server connection fails Jul 11, 2022
Member

@sberyozkin sberyozkin left a comment


Thanks @famod

@sberyozkin sberyozkin added the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jul 11, 2022
@sberyozkin
Member

Looks like the exception is structured differently in the connection-timeout case. Just recently we dealt with an issue where an OidcException thrown when the OIDC server was unavailable reported too much info.

@famod
Member Author

famod commented Jul 11, 2022

Well, with this change you will see the requested endpoint in the trace. Is that too much?
I'm not sure who the "receiver" is in that other case (meaning: why is that info critical in that other case? Is it forwarded to some UI?).

@sberyozkin
Member

@famod Hmm, I hadn't thought about it. While the current message is clearly not good enough, it will also be a problem in that other case, the internal endpoint address leaking to the client, see #26528 and the linked issue.
So I guess if it is null then we should only log it but throw the same generic OIDCException("OIDC Server is not available")?

@sberyozkin sberyozkin removed the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jul 11, 2022
@famod
Member Author

famod commented Jul 11, 2022

Hum, after reading that issue I can't help thinking that crippling exception messages feels like fixing the problem on the wrong end.
OTOH, I do understand the challenge of not having some general kind of fault barrier for oidc-client.

TBH, I don't like hiding such info behind DEBUG and would rather see a WARN message. This shouldn't be an issue in terms of info leakage (somebody who can read your logs can probably also read your environment vars...).
If it has to be DEBUG then we should at least log something like "... Enable DEBUG logging for XYZ to see more details.".
Just my two cents.

@sberyozkin
Member

@famod No, the problem is about leaking it to the external client. With this PR, will the calling client get the internal address of Keycloak? Maybe I am missing something?

@sberyozkin
Member

@famod When OidcException is thrown, a 500 is reported with "OIDC Server is unavailable", but that exception happens at OIDC server connection bootstrap time, either at application start or during the first client request.
However, Connection Timed Out is different; it looks like it happens after the connection has initially been established successfully, so in this case you may be seeing a 401.
Can you please clarify a bit more what is reported to the external client in this case?

As I said, I agree we need to fix this cryptic message; my only concern is whether it will leak the internal KC address to the external client. This is why I suggested reporting OIDCException("OIDC Server is not available") and logging the Netty exception.
We can do a warning in this case as it is not caused by the actual request; trace/debug is used for most of the request-related exceptions to avoid attackers causing a DoS via excessive logging.

@famod
Member Author

famod commented Jul 12, 2022

@sberyozkin

Can you clarify please a bit more what is reported in this case to the external client;

Well, my scenario is not like the one in #26314 (where the info leak seems to occur when an OIDC-secured resource is involved).
Instead I have a "backend connector" that calls another system via REST and needs a token for that.
So we are using:

@RegisterRestClient(configKey = "my-connector")
@RegisterProvider(MyOidcClientRequestFilter.class)

(MyOidcClientRequestFilter just extends OidcClientRequestFilter, overriding clientId())
During that OIDC communication, because I had not set a proxy, I was getting that ConnectTimeoutException (shadowed by the cryptic Mutiny exception).
That connector is called from a REST resource (via a service) and so our ExceptionMapper kicks in and no leak occurs.

So, long story short: I cannot tell you how, or whether at all, this can happen if a REST resource is OIDC secured and cannot reach the respective OIDC server.
Does that make sense?

I guess it's safest to proceed as you suggested: throw a more general exception but also log all the details as a warning.

.onFailure().transform(t -> {
    LOG.warn("OIDC Server is not available:", t.getCause() != null ? t.getCause() : t);
    // don't wrap t to avoid an information leak
    return new OidcClientException("OIDC Server is not available");
});
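The log-the-cause-but-throw-generic pattern in the snippet above can be sketched without Mutiny, using plain CompletableFuture; the OidcClientException stand-in and method names below are illustrative, not the actual Quarkus classes:

```java
import java.net.ConnectException;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;

public class FailureTransformSketch {

    // Stand-in for the real OidcClientException from the oidc-client extension.
    static class OidcClientException extends RuntimeException {
        OidcClientException(String message) { super(message); }
    }

    // Simulates the token request failing because the OIDC server is unreachable.
    static CompletableFuture<String> requestToken() {
        return CompletableFuture.failedFuture(new ConnectException("connection timed out"));
    }

    // Logs the real cause for operators, but surfaces only a generic message to callers.
    static String callerVisibleMessage() throws InterruptedException {
        try {
            requestToken()
                    .exceptionally(t -> {
                        Throwable cause = t.getCause() != null ? t.getCause() : t;
                        System.err.println("OIDC Server is not available: " + cause);
                        // don't wrap the cause to avoid an information leak
                        throw new OidcClientException("OIDC Server is not available");
                    })
                    .get();
            return "no failure";
        } catch (ExecutionException e) {
            return e.getCause().getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(callerVisibleMessage());
    }
}
```

The internal endpoint address only ever reaches the log on stderr; the exception that propagates to the caller carries nothing but the generic message.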
Member Author

@famod famod Jul 12, 2022


I chose OidcClientException instead of OIDCException because this class already throws OidcClientException in multiple places, but not OIDCException.
Please let me know in case you prefer OIDCException.

Member


@famod Ha-ha, I've only now realized you are fixing OidcClientImpl and not OidcRecorder :-), but your update is still important IMHO. I think it looks fine now, thanks!

@famod
Member Author

famod commented Jul 12, 2022

With the change now pushed, the original ConnectTimeoutException will automatically show up as a suppressed exception, but that shouldn't be anything to worry about, should it?
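The suppressed-exception mechanics can be checked in isolation with plain JDK classes (this illustrates the general mechanism, not the exact Mutiny internals):

```java
import java.net.ConnectException;

public class SuppressedSketch {

    // Builds the kind of exception the PR now throws: a generic message, with the
    // original network failure attached as a suppressed exception rather than a cause.
    static RuntimeException genericWithSuppressed() {
        RuntimeException generic = new RuntimeException("OIDC Server is not available");
        generic.addSuppressed(new ConnectException("connection timed out"));
        return generic;
    }

    public static void main(String[] args) {
        RuntimeException e = genericWithSuppressed();
        // The cause chain stays empty, so nothing extra leaks via getCause()...
        System.out.println(e.getCause());
        // ...but a full stack trace still shows the original failure under "Suppressed:".
        System.out.println(e.getSuppressed()[0].getMessage());
    }
}
```

Since printStackTrace() prints suppressed exceptions, whether this is a worry comes down to the same question as before: whether that trace can reach an external client.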

@famod famod requested a review from sberyozkin July 12, 2022 21:43
@sberyozkin sberyozkin added the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jul 12, 2022
@famod famod merged commit f547d57 into quarkusio:main Jul 12, 2022
@quarkus-bot quarkus-bot bot added this to the 2.11 - main milestone Jul 12, 2022
@quarkus-bot quarkus-bot bot removed the triage/waiting-for-ci Ready to merge when CI successfully finishes label Jul 12, 2022
@famod famod deleted the oidc-supplier-null branch July 12, 2022 23:11
@gsmet gsmet modified the milestones: 2.11.0.CR1, 2.10.3.Final Jul 18, 2022
@sag1

sag1 commented Sep 2, 2022

Is it really fixed?
I am using oidc-client and followed this guide for the config. I have been hitting my head for many hours without knowing where the error is and why there is an NPE. I don't know why there is no explicit message.

Quarkus version: 2.11.1.Final
My code (lightly cleaned up; redirectUriParam and uri are fields elided here):

@Path("/auth")
public class LoginResource {

    @Inject
    OidcClient oidcClient;

    public Response ssoAuthentication(@QueryParam("code") String code) {
        Map<String, String> grantParams = new HashMap<>();
        grantParams.put(OidcConstants.CODE_FLOW_CODE, code);
        grantParams.put(OidcConstants.CODE_FLOW_REDIRECT_URI, redirectUriParam);
        Tokens tokens = oidcClient.getTokens(grantParams).await().indefinitely();
        Cookie cookieAccessToken = Cookie.cookie("accesstoken", tokens.getAccessToken())
                .setSameSite(CookieSameSite.STRICT)
                .setMaxAge(tokens.getAccessTokenExpiresAt());
        return Response.status(RestResponse.Status.FOUND).location(uri)
                .header(HttpHeaders.SET_COOKIE, cookieAccessToken.encode())
                .build();
    }
}

My application.properties:
quarkus.oidc-client.auth-server-url=https://auth.tech.orange
quarkus.oidc-client.client-id=aaaaa-bbbbbb-cccc-dddd-fffff
quarkus.oidc-client.credentials.client-secret.value=zsdsfdazerty
quarkus.oidc-client.credentials.client-secret.method=post
quarkus.oidc-client.grant.type=code

Error:
{"timestamp":"2022-09-02T15:22:14.35Z","sequence":99,"loggerClassName":"org.jboss.logging.Logger","loggerName":"io.quarkus.runtime.Application","level":"ERROR","message":"Failed to start application (with profile prod)","threadName":"main","threadId":1,"mdc":{},"ndc":"","hostName":"powo-backend-staging-7c8f5f7787-vf7xb","processName":"quarkus-run.jar","processId":1,"stackTrace":"java.lang.NullPointerException: The supplier returned null\n\tat io.smallrye.mutiny.operators.uni.UniOnFailureTransform$UniOnFailureTransformProcessor.onFailure(UniOnFailureTransform.java:62)\n\tat io.smallrye.mutiny.operators.uni.builders.UniCreateFromPublisher$PublisherSubscriber.onError(UniCreateFromPublisher.java:81)\n\tat io.smallrye.mutiny.helpers.HalfSerializer.onError(HalfSerializer.java:56)\n\tat io.smallrye.mutiny.helpers.StrictMultiSubscriber.onFailure(StrictMultiSubscriber.java:91)\n\tat io.smallrye.mutiny.subscription.MultiSubscriber.onError(MultiSubscriber.java:73)\n\tat io.smallrye.mutiny.subscription.SerializedSubscriber.onFailure(SerializedSubscriber.java:102)\n\tat io.smallrye.mutiny.operators.multi.MultiRetryWhenOp$RetryWhenOperator.testOnFailurePredicate(MultiRetryWhenOp.java:136)\n\tat io.smallrye.mutiny.operators.multi.MultiRetryWhenOp$RetryWhenOperator.onFailure(MultiRetryWhenOp.java:119)\n\tat io.smallrye.mutiny.subscription.MultiSubscriber.onError(MultiSubscriber.java:73)\n\tat io.smallrye.mutiny.converters.uni.UniToMultiPublisher$UniToMultiSubscription.onFailure(UniToMultiPublisher.java:103)\n\tat io.smallrye.mutiny.operators.uni.UniOperatorProcessor.onFailure(UniOperatorProcessor.java:54)\n\tat io.smallrye.mutiny.vertx.AsyncResultUni.lambda$subscribe$1(AsyncResultUni.java:37)\n\tat io.smallrye.mutiny.vertx.DelegatingHandler.handle(DelegatingHandler.java:25)\n\tat io.vertx.ext.web.client.impl.HttpContext.handleFailure(HttpContext.java:386)\n\tat io.vertx.ext.web.client.impl.HttpContext.execute(HttpContext.java:380)\n\tat 
io.vertx.ext.web.client.impl.HttpContext.next(HttpContext.java:355)\n\tat io.vertx.ext.web.client.impl.HttpContext.fire(HttpContext.java:322)\n\tat io.vertx.ext.web.client.impl.HttpContext.fail(HttpContext.java:303)\n\tat io.vertx.ext.web.client.impl.HttpContext.lambda$handleCreateRequest$6(HttpContext.java:486)\n\tat io.vertx.core.impl.future.FutureImpl$3.onFailure(FutureImpl.java:153)\n\tat io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)\n\tat io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)\n\tat io.vertx.core.impl.future.PromiseImpl.tryFail(PromiseImpl.java:23)\n\tat io.vertx.core.http.impl.HttpClientImpl.lambda$doRequest$8(HttpClientImpl.java:645)\n\tat io.vertx.core.net.impl.pool.Endpoint.lambda$getConnection$0(Endpoint.java:52)\n\tat io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:162)\n\tat io.vertx.core.http.impl.SharedClientHttpStreamEndpoint$Request.handle(SharedClientHttpStreamEndpoint.java:123)\n\tat io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:55)\n\tat io.vertx.core.impl.ContextBase.emit(ContextBase.java:239)\n\tat io.vertx.core.net.impl.pool.SimpleConnectionPool$ConnectFailed$1.run(SimpleConnectionPool.java:384)\n\tat io.vertx.core.net.impl.pool.CombinerExecutor.submit(CombinerExecutor.java:50)\n\tat io.vertx.core.net.impl.pool.SimpleConnectionPool.execute(SimpleConnectionPool.java:245)\n\tat io.vertx.core.net.impl.pool.SimpleConnectionPool.lambda$connect$2(SimpleConnectionPool.java:259)\n\tat io.vertx.core.http.impl.SharedClientHttpStreamEndpoint.lambda$connect$2(SharedClientHttpStreamEndpoint.java:104)\n\tat io.vertx.core.impl.future.FutureImpl$3.onFailure(FutureImpl.java:153)\n\tat io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)\n\tat io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)\n\tat io.vertx.core.impl.future.Composition$1.onFailure(Composition.java:66)\n\tat 
io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)\n\tat io.vertx.core.impl.future.FailedFuture.addListener(FailedFuture.java:98)\n\tat io.vertx.core.impl.future.Composition.onFailure(Composition.java:55)\n\tat io.vertx.core.impl.future.FutureBase.emitFailure(FutureBase.java:75)\n\tat io.vertx.core.impl.future.FutureImpl.tryFail(FutureImpl.java:230)\n\tat io.vertx.core.impl.future.PromiseImpl.tryFail(PromiseImpl.java:23)\n\tat io.vertx.core.impl.EventLoopContext.emit(EventLoopContext.java:55)\n\tat io.vertx.core.impl.ContextBase.emit(ContextBase.java:239)\n\tat io.vertx.core.net.impl.NetClientImpl.failed(NetClientImpl.java:302)\n\tat io.vertx.core.net.impl.NetClientImpl.lambda$connectInternal$4(NetClientImpl.java:270)\n\tat io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:578)\n\tat io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:552)\n\tat io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:491)\n\tat io.netty.util.concurrent.DefaultPromise.addListener(DefaultPromise.java:184)\n\tat io.netty.util.concurrent.DefaultPromise.addListener(DefaultPromise.java:35)\n\tat io.vertx.core.net.impl.NetClientImpl.connectInternal(NetClientImpl.java:256)\n\tat io.vertx.core.net.impl.NetClientImpl.lambda$connectInternal$5(NetClientImpl.java:275)\n\tat io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)\n\tat io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)\n\tat io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:503)\n\tat io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)\n\tat io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)\n\tat io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)\n\tat 
java.base/java.lang.Thread.run(Thread.java:833)\n"}
{"timestamp":"2022-09-02T15:22:14.351Z","sequence":100,"loggerClassName":"org.jboss.logging.Logger","loggerName":"io.quarkus.runtime.Application","level":"DEBUG","message":"Stopping application","threadName":"main","threadId":1,"mdc":{},"ndc":"","hostName":"powo-backend-staging-7c8f5f7787-vf7xb","processName":"quarkus-run.jar","processId":1}
{"timestamp":"2022-09-02T15:22:14.352Z","sequence":101,"loggerClassName":"org.jboss.logging.Logger","loggerName":"io.quarkus.runtime.Application","level":"DEBUG","message":"Shutting down with exit code 1","threadName":"main","threadId":1,"mdc":{},"ndc":"","hostName":"powo-backend-staging-7c8f5f7787-vf7xb","processName":"quarkus-run.jar","processId":1}
