Epoll client's connect future with ALLOW_HALF_CLOSURE enabled reports success on failed connection #3785

Closed
rkapsi opened this issue May 15, 2015 · 9 comments

@rkapsi
Member

rkapsi commented May 15, 2015

When ALLOW_HALF_CLOSURE is enabled, the connect ChannelFuture of an epoll client reports success instead of failing with an exception when the connection is refused. The Channel reports itself as active, and ChannelHandler callbacks such as channelActive() are even invoked.

There is an exception under the hood (see below), but it is not exposed to the user. A unit test is attached (see below).
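
For illustration, a minimal standalone sketch of the reported behavior (hypothetical code, not the attached test; the class name is made up and it assumes nothing is listening on localhost:9999):

import io.netty.bootstrap.Bootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.epoll.EpollEventLoopGroup;
import io.netty.channel.epoll.EpollSocketChannel;

public class HalfClosureConnectSketch {
  public static void main(String[] args) {
    EventLoopGroup group = new EpollEventLoopGroup();
    try {
      Bootstrap bootstrap = new Bootstrap()
          .group(group)
          .channel(EpollSocketChannel.class)
          .option(ChannelOption.ALLOW_HALF_CLOSURE, true)
          .handler(new ChannelInboundHandlerAdapter());

      // Nothing listens on this port, so the connect attempt is refused.
      ChannelFuture future = bootstrap.connect("localhost", 9999).awaitUninterruptibly();

      // Expected: isSuccess() == false and cause() holds the connect error, as with NIO.
      // Observed with the epoll transport: isSuccess() == true and channelActive() is fired.
      System.out.println("success=" + future.isSuccess() + ", cause=" + future.cause());
    } finally {
      group.shutdownGracefully();
    }
  }
}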

Netty 4.1.0-Beta6-SNAPSHOT

$ java -version
java version "1.8.0_40"
Java(TM) SE Runtime Environment (build 1.8.0_40-b25)
Java HotSpot(TM) 64-Bit Server VM (build 25.40-b25, mixed mode)

$ uname -a
Linux ardverk 3.19.3-1-ARCH #1 SMP PREEMPT Thu Mar 26 14:56:16 CET 2015 x86_64 GNU/Linux

Exception

[epollEventLoopGroup-2-1] WARN io.netty.channel.DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
java.io.IOException: readAddress() failed: Connection refused
    at io.netty.channel.epoll.Native.newIOException(Native.java:127)
    at io.netty.channel.epoll.Native.ioResult(Native.java:143)
    at io.netty.channel.epoll.Native.readAddress(Native.java:284)
    at io.netty.channel.epoll.AbstractEpollChannel.doReadBytes(AbstractEpollChannel.java:235)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:823)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:322)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:254)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:703)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)

Unit Test

// Package and assertion class taken from the stack trace in the log output below;
// the remaining imports follow from the types used in the test.
package io.netty.channel;

import static org.testng.AssertJUnit.assertFalse;
import static org.testng.AssertJUnit.assertTrue;
import static org.testng.AssertJUnit.fail;

import io.netty.bootstrap.Bootstrap;
import io.netty.channel.epoll.EpollEventLoopGroup;
import io.netty.channel.epoll.EpollSocketChannel;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;

import org.junit.Test;

import java.net.ConnectException;
import java.nio.channels.ClosedChannelException;

public class EpollHalfClosedConnectTest {

  @Test(expected=ConnectException.class)
  public void useNioHalfClosedConnect() throws Exception {
    connect(false);
  }

  @Test(expected=ClosedChannelException.class)
  public void useEpollHalfClosedConnect() throws Exception {
    connect(true);
  }

  private void connect(boolean useEpoll) throws Exception {
    Class<? extends SocketChannel> channelType
        = useEpoll ? EpollSocketChannel.class : NioSocketChannel.class;

    EventLoopGroup group = useEpoll 
        ? new EpollEventLoopGroup() 
        : new NioEventLoopGroup();

    ChannelHandler handler = new ChannelInboundHandlerAdapter() {
      @Override
      public void channelActive(ChannelHandlerContext ctx) throws Exception {
        Channel channel = ctx.channel();
        fail("Should have never been called: " + channel);
      }
    };

    try {
      Bootstrap bootstrap = new Bootstrap();

      bootstrap.group(group)
        .channel(channelType)
        .option(ChannelOption.ALLOW_HALF_CLOSURE, true)
        .handler(handler);

      // Nothing is listening on this port, so the connect attempt is expected to fail.
      ChannelFuture connectFuture = bootstrap.connect("localhost", 9999)
          .syncUninterruptibly();

      assertTrue(connectFuture.isDone());
      assertFalse(connectFuture.isSuccess());

    } finally {
      group.shutdownGracefully().awaitUninterruptibly();
    }
  }
}

Log Output

[epollEventLoopGroup-2-1] WARN io.netty.channel.DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
java.io.IOException: readAddress() failed: Connection refused
    at io.netty.channel.epoll.Native.newIOException(Native.java:127)
    at io.netty.channel.epoll.Native.ioResult(Native.java:143)
    at io.netty.channel.epoll.Native.readAddress(Native.java:284)
    at io.netty.channel.epoll.AbstractEpollChannel.doReadBytes(AbstractEpollChannel.java:235)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:823)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:322)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:254)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:703)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
[epollEventLoopGroup-2-1] WARN io.netty.channel.DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
java.lang.AssertionError: Should have never been called: [id: 0x88d3ca97, /127.0.0.1:45718]
    at org.testng.AssertJUnit.fail(AssertJUnit.java:59)
    at io.netty.channel.EpollHalfClosedConnectTest$1.channelActive(EpollHalfClosedConnectTest.java:43)
    at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelActiveNow(ChannelHandlerInvokerUtil.java:48)
    at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelActive(DefaultChannelHandlerInvoker.java:78)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelActive(AbstractChannelHandlerContext.java:128)
    at io.netty.channel.DefaultChannelPipeline.fireChannelActive(DefaultChannelPipeline.java:917)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.fulfillConnectPromise(AbstractEpollStreamChannel.java:695)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.finishConnect(AbstractEpollStreamChannel.java:728)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollOutReady(AbstractEpollStreamChannel.java:747)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:326)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:254)
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:703)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
    at java.lang.Thread.run(Thread.java:745)
@Scottmitch
Member

@normanmaurer - I'll give this to you :)

@normanmaurer
Member

@Scottmitch you are so good to me ;)

normanmaurer added this to the 4.0.29.Final milestone May 15, 2015
@normanmaurer
Member

@rkapsi do you grant us permission to include your test-case in netty?

@rkapsi
Member Author

rkapsi commented May 26, 2015

@normanmaurer - yes of course.

@normanmaurer
Member

Thanks!

Working on a fix


normanmaurer added a commit that referenced this issue May 27, 2015
Motivation:

Due to a bug we did not handle connection-refused errors correctly and so failed the connect promise with the wrong exception.
Besides this we sometimes even triggered fireChannelActive(), which is not correct.

Modifications:

- Add testcase
- correctly detect connect errors

Result:

Correct and consistent handling.
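
For reference, a minimal sketch of how a refused non-blocking connect surfaces in plain Java NIO. This is the ConnectException behavior the NIO-based test above relies on and that the epoll transport should match after the fix (hypothetical class name; assumes nothing is listening on localhost:9999):

import java.net.ConnectException;
import java.net.InetSocketAddress;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.SocketChannel;

public class NioConnectRefusedSketch {
  public static void main(String[] args) throws Exception {
    Selector selector = Selector.open();
    SocketChannel channel = SocketChannel.open();
    channel.configureBlocking(false);
    channel.connect(new InetSocketAddress("localhost", 9999));
    channel.register(selector, SelectionKey.OP_CONNECT);

    selector.select(); // wakes up once the connect attempt has completed or failed
    try {
      channel.finishConnect(); // throws ConnectException when the connection was refused
    } catch (ConnectException expected) {
      System.out.println("connect failed as expected: " + expected.getMessage());
    } finally {
      channel.close();
      selector.close();
    }
  }
}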
@normanmaurer
Member

@rkapsi please check the fix #3832

normanmaurer added a commit that referenced this issue May 27, 2015
normanmaurer added a commit that referenced this issue May 27, 2015
normanmaurer added a commit that referenced this issue May 27, 2015
normanmaurer added a commit that referenced this issue May 27, 2015
@normanmaurer
Member

@rkapsi should be fixed now... Please reopen if you still think it is an issue. Thanks again for reporting and the test-case!

@rkapsi
Member Author

rkapsi commented May 27, 2015

@normanmaurer: confirm. My canary test started passing (or rather failing) as soon as the new SNAPSHOT went live. Thanks!

@normanmaurer
Member

Awesome. Thanks!


pulllock pushed a commit to pulllock/netty that referenced this issue Oct 19, 2023