
java.lang.IllegalStateException: await*() #34

Closed
gabrielhof opened this issue May 24, 2013 · 4 comments

@gabrielhof

When I try to bind to an SMSC using DefaultSmppClient, this error occurs:

java.lang.IllegalStateException: await*() in I/O thread causes a dead lock or sudden performance drop. Use addListener() instead or call await*() from a different thread.
at org.jboss.netty.channel.DefaultChannelFuture.checkDeadLock(DefaultChannelFuture.java:343)
at org.jboss.netty.channel.DefaultChannelFuture.await0(DefaultChannelFuture.java:307)
at org.jboss.netty.channel.DefaultChannelFuture.await(DefaultChannelFuture.java:248)
at com.cloudhopper.smpp.impl.DefaultSmppClient.createConnectedChannel(DefaultSmppClient.java:266)
at com.cloudhopper.smpp.impl.DefaultSmppClient.doOpen(DefaultSmppClient.java:216)
at com.cloudhopper.smpp.impl.DefaultSmppClient.bind(DefaultSmppClient.java:185)

It seems to be a problem with the Netty library. I've already googled this issue, and the only solution that came up was to disable the deadlock checker:

 DefaultChannelFuture.setUseDeadLockChecker(false);

But after doing this, whenever I try to submit a message, this error happens:

com.cloudhopper.smpp.type.SmppChannelException
at com.cloudhopper.smpp.impl.DefaultSmppSession.sendRequestPdu(DefaultSmppSession.java:526)
at com.cloudhopper.smpp.impl.DefaultSmppSession.sendRequestAndGetResponse(DefaultSmppSession.java:463)
at com.cloudhopper.smpp.impl.DefaultSmppSession.submit(DefaultSmppSession.java:446)

Caused by: java.nio.channels.ClosedChannelException
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.cleanUpWriteBuffer(AbstractNioWorker.java:784)
at org.jboss.netty.channel.socket.nio.AbstractNioWorker.writeFromUserCode(AbstractNioWorker.java:507)
at org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.eventSunk(NioClientSocketPipelineSink.java:125)
at com.cloudhopper.smpp.channel.SmppSessionLogger.handleDownstream(SmppSessionLogger.java:110)
at org.jboss.netty.channel.Channels.write(Channels.java:712)
at org.jboss.netty.channel.Channels.write(Channels.java:679)
at org.jboss.netty.channel.AbstractChannel.write(AbstractChannel.java:248)
at com.cloudhopper.smpp.impl.DefaultSmppSession.sendRequestPdu(DefaultSmppSession.java:521)
at com.cloudhopper.smpp.impl.DefaultSmppSession.sendRequestAndGetResponse(DefaultSmppSession.java:463)
at com.cloudhopper.smpp.impl.DefaultSmppSession.submit(DefaultSmppSession.java:446)

But when I check if the channel is really closed with session.isClosed(), it returns false.

@gabrielhof

gabrielhof@bc6b7fa

@jjlauer

jjlauer commented Jun 7, 2013

Interesting issue. Can you connect to the SMSC outside of this Java library? I ask because there may be a firewall in between that is immediately closing the socket. Can you successfully connect to the socket on the other side via telnet?

@idpromnut

I have noticed this when the library is used in the context of a proxy (i.e. simultaneously providing a server and a client interface). Specifically, I ran into this problem when an underlying thread receives a packet, say from a server session, and attempts to send the PDU out via a "client" SmppSession. This behaviour stopped when I inserted a thread pool + queue between the two (i.e. PDU path: Remote Client -> Server Class -> Queue+ThreadPool -> Client Class -> Remote Server). I needed the queue decoupling anyway, so I didn't look too deeply into why this was occurring, but my feeling is that there is a resource-sharing issue in either cloudhopper-smpp or Netty.
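
Roughly, the hand-off looks like the sketch below. This is only an illustration of the decoupling, not code from the library: ProxySessionHandler, forwardSession, and forwardPool are made-up names, and a real proxy would build a fresh SubmitSm rather than resending the received instance.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    import com.cloudhopper.smpp.SmppSession;
    import com.cloudhopper.smpp.impl.DefaultSmppSessionHandler;
    import com.cloudhopper.smpp.pdu.PduRequest;
    import com.cloudhopper.smpp.pdu.PduResponse;
    import com.cloudhopper.smpp.pdu.SubmitSm;

    // Handler for the "server" side of the proxy (illustrative, not from the library).
    public class ProxySessionHandler extends DefaultSmppSessionHandler {

        // A single worker thread and its internal queue decouple the Netty I/O thread
        // (which delivers the PDU) from the blocking submit on the client session.
        private final ExecutorService forwardPool = Executors.newSingleThreadExecutor();
        private final SmppSession forwardSession; // "client" session toward the remote server

        public ProxySessionHandler(SmppSession forwardSession) {
            this.forwardSession = forwardSession;
        }

        @Override
        public PduResponse firePduRequestReceived(PduRequest pduRequest) {
            if (pduRequest instanceof SubmitSm) {
                final SubmitSm submitSm = (SubmitSm) pduRequest;
                // Do NOT call forwardSession.submit(...) here: this method runs on a
                // Netty I/O thread, and blocking on await*() from it is exactly what
                // triggers the IllegalStateException above.
                forwardPool.execute(new Runnable() {
                    public void run() {
                        try {
                            forwardSession.submit(submitSm, 10000);
                        } catch (Exception e) {
                            // log / retry as appropriate
                        }
                    }
                });
            }
            // Acknowledge the originator right away so the I/O thread is released.
            return pduRequest.createResponse();
        }
    }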

@jjlauer

jjlauer commented Jun 7, 2013

Yes, your proxy example is correct. NIO/Netty is always difficult to engineer with, since you need to be aware of which thread is calling a method. The methods called upon reception of a PDU (either in a server or client session) are invoked with a thread from the thread pool used by NIO/Netty. Netty throws an IllegalStateException if that "receiving" thread then also tries to "send" a PDU. Your solution of adding an intermediate queue + thread will get rid of the exception.

This isn't documented very well in the library -- but one must be pretty careful to handle PDUs as quickly as possible in any method that "receives" a PDU. Sending a PDU from that same thread context will trigger an exception from Netty, and any long blocking operation will significantly slow down performance as well. Those "receiving" threads from Netty are used to process incoming data from the socket pool -- and in order to maximize overall performance, it's very important that they finish their work as fast as possible. I'd probably recommend always immediately throwing those PDUs onto a queue and processing them with another thread, along the lines of the sketch below.
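
A minimal sketch of that pattern (the class and thread names are just for illustration, and it assumes the inbound traffic of interest is DeliverSm):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    import com.cloudhopper.smpp.impl.DefaultSmppSessionHandler;
    import com.cloudhopper.smpp.pdu.DeliverSm;
    import com.cloudhopper.smpp.pdu.PduRequest;
    import com.cloudhopper.smpp.pdu.PduResponse;

    // Hands received PDUs to a dedicated consumer thread so the Netty I/O thread
    // that called firePduRequestReceived() returns immediately.
    public class QueueingSessionHandler extends DefaultSmppSessionHandler {

        private final BlockingQueue<DeliverSm> inbound = new LinkedBlockingQueue<DeliverSm>();

        public QueueingSessionHandler() {
            Thread consumer = new Thread(new Runnable() {
                public void run() {
                    try {
                        while (true) {
                            DeliverSm deliverSm = inbound.take();
                            // Parse, persist, route, or resend on another session here.
                            // Blocking in this thread does not stall Netty's I/O workers.
                            System.out.println("processing deliver_sm seq=" + deliverSm.getSequenceNumber());
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }, "smpp-pdu-consumer");
            consumer.setDaemon(true);
            consumer.start();
        }

        @Override
        public PduResponse firePduRequestReceived(PduRequest pduRequest) {
            if (pduRequest instanceof DeliverSm) {
                // Only enqueue; do no real work on the Netty thread.
                inbound.offer((DeliverSm) pduRequest);
            }
            // Respond right away so the receiving thread stays free.
            return pduRequest.createResponse();
        }
    }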

@xgp closed this as completed Jan 5, 2015