
    JMS Connection failure due to NIO buffers being full

    atulkc

      Hi All,

       

      In our project we are experiencing JMS connection failures due to NIO buffers getting full.

       

      Here is the stack trace:

       

      javax.jms.JMSException: HornetQException[errorCode=0 message=Netty exception]

      at org.hornetq.jms.client.HornetQConnection$JMSFailureListener.connectionFailed(HornetQConnection.java:655)

      at org.hornetq.core.client.impl.ClientSessionFactoryImpl.callFailureListeners(ClientSessionFactoryImpl.java:906)

      at org.hornetq.core.client.impl.ClientSessionFactoryImpl.failoverOrReconnect(ClientSessionFactoryImpl.java:698)

      at org.hornetq.core.client.impl.ClientSessionFactoryImpl.handleConnectionFailure(ClientSessionFactoryImpl.java:557)

      at org.hornetq.core.client.impl.ClientSessionFactoryImpl.connectionException(ClientSessionFactoryImpl.java:395)

      at org.hornetq.core.remoting.impl.netty.NettyConnector$Listener$2.run(NettyConnector.java:728)

      at org.hornetq.utils.OrderedExecutorFactory$OrderedExecutor$1.run(OrderedExecutorFactory.java:100)

      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)

      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)

      at java.lang.Thread.run(Thread.java:722)

      Caused by: HornetQException[errorCode=0 message=Netty exception]

      at org.hornetq.core.remoting.impl.netty.HornetQChannelHandler.exceptionCaught(HornetQChannelHandler.java:108)

      at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:142)

      at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

      at org.jboss.netty.channel.StaticChannelPipeline$StaticChannelHandlerContext.sendUpstream(StaticChannelPipeline.java:534)

      at org.jboss.netty.channel.SimpleChannelUpstreamHandler.exceptionCaught(SimpleChannelUpstreamHandler.java:148)

      at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:122)

      at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

      at org.jboss.netty.channel.StaticChannelPipeline$StaticChannelHandlerContext.sendUpstream(StaticChannelPipeline.java:534)

      at org.jboss.netty.handler.ssl.SslHandler.exceptionCaught(SslHandler.java:510)

      at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:122)

      at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

      at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:367)

      at org.jboss.netty.channel.Channels.fireExceptionCaught(Channels.java:432)

      at org.jboss.netty.channel.socket.nio.NioWorker.write0(NioWorker.java:510)

      at org.jboss.netty.channel.socket.nio.NioWorker.writeFromTaskLoop(NioWorker.java:393)

      at org.jboss.netty.channel.socket.nio.NioSocketChannel$WriteTask.run(NioSocketChannel.java:269)

      at org.jboss.netty.channel.socket.nio.NioWorker.processWriteTaskQueue(NioWorker.java:268)

      at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:199)

      at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)

      at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:44)

      at org.jboss.netty.util.VirtualExecutorService$ChildExecutorRunnable.run(VirtualExecutorService.java:181)

      ... 3 more

      Caused by: java.io.IOException: An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full

      at sun.nio.ch.SocketDispatcher.write0(Native Method)

      at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:51)

      at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:94)

      at sun.nio.ch.IOUtil.write(IOUtil.java:51)

      at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:450)

      at org.jboss.netty.channel.socket.nio.SocketSendBufferPool$PooledSendBuffer.transferTo(SocketSendBufferPool.java:239)

      at org.jboss.netty.channel.socket.nio.NioWorker.write0(NioWorker.java:470)

      at org.jboss.netty.channel.socket.nio.NioWorker.writeFromTaskLoop(NioWorker.java:393)

      at org.jboss.netty.channel.socket.nio.NioSocketChannel$WriteTask.run(NioSocketChannel.java:269)

      at org.jboss.netty.channel.socket.nio.NioWorker.processWriteTaskQueue(NioWorker.java:268)

      at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:199)

      ... 4 more

       

      Our environment is as follows:

      JBoss AS: 5.1.0 GA

      HornetQ: 2.2.14

      OS: Windows Server 2008 R2 Enterprise (64 bit)

       

      We have about 33 topics and 10 queues. A couple of the topics carry a heavy load (about 12K messages per minute), and those heavily loaded topics mostly carry large messages (greater than 102400 bytes, which is our min-large-message-size setting). All publishers to these 33 topics share the same JMS connection on the server. We have registered an ExceptionListener on that connection which restarts the server, because all of our JMS publishers are singletons built on this one connection; simply re-establishing the JMS connection would mean reinitializing all the publishers, which would require some redesign (not feasible at this point in the release). As a result, the connection being closed forces our application to restart, and although this does not happen very often, it reduces the reliability of our application and has become a major problem for us.
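      For reference, this is roughly how the shared connection and the listener are wired (a simplified sketch, not our actual code; the class and method names are placeholders):

      import javax.jms.Connection;
      import javax.jms.ConnectionFactory;
      import javax.jms.ExceptionListener;
      import javax.jms.JMSException;

      // Simplified sketch of the shared connection that all singleton publishers use.
      public class SharedJmsConnection {

          private final Connection connection;

          public SharedJmsConnection(ConnectionFactory factory) throws JMSException {
              connection = factory.createConnection();
              // Any failure reported on this connection triggers a full application restart,
              // because every singleton publisher holds sessions created from it.
              connection.setExceptionListener(new ExceptionListener() {
                  public void onException(JMSException e) {
                      // e is the HornetQException[errorCode=0 message=Netty exception] shown above
                      restartApplication(e);
                  }
              });
              connection.start();
          }

          private void restartApplication(JMSException cause) {
              // application-specific restart logic goes here
          }
      }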

       

      Just before this exception is thrown, I see the following messages in our log files:

      2013-01-26 10:50:38,553 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system

      2013-01-26 10:50:38,554 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system

      2013-01-26 10:50:38,555 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system

      2013-01-26 10:50:38,561 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system

      2013-01-26 10:50:38,564 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system

      2013-01-26 10:50:38,585 WARN  [org.hornetq.core.protocol.core.impl.HornetQPacketHandler] (New I/O server worker #1-5) Reattach request from /95.90.96.103:56894 failed as there is no confirmationWindowSize configured, which may be ok for your system
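      As the warning says, the reattach it refers to only works when a confirmation window is configured on the connection factory (in hornetq-jms.xml that would be the <confirmation-window-size> element), and we have not configured one. If it matters, I assume the programmatic equivalent would look roughly like this (a sketch only, not something we currently do; the 1 MiB value is arbitrary):

      import org.hornetq.api.core.TransportConfiguration;
      import org.hornetq.api.jms.HornetQJMSClient;
      import org.hornetq.api.jms.JMSFactoryType;
      import org.hornetq.core.remoting.impl.netty.NettyConnectorFactory;
      import org.hornetq.jms.client.HornetQConnectionFactory;

      // Sketch only: enabling a confirmation window so the client can reattach its
      // sessions after a transient connection failure instead of failing them.
      public class ConnectionFactorySetup {

          public static HornetQConnectionFactory createFactory() {
              TransportConfiguration transport =
                      new TransportConfiguration(NettyConnectorFactory.class.getName());
              HornetQConnectionFactory cf = (HornetQConnectionFactory)
                      HornetQJMSClient.createConnectionFactoryWithoutHA(JMSFactoryType.CF, transport);
              cf.setConfirmationWindowSize(1024 * 1024); // in bytes; -1 (the default) disables reattach
              return cf;
          }
      }

      I am not sure whether the missing confirmation window is related to the buffer error itself or is just a consequence of the failed reconnect.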

       

      I am also seeing an exception indicating that the session was in an illegal state:

      [org.hornetq.core.protocol.core.ServerSessionPacketHandler] (New I/O server worker #1-5) Caught exception

      HornetQException[errorCode=104 message=large-message not initialized on server]

                at org.hornetq.core.server.impl.ServerSessionImpl.sendContinuations(ServerSessionImpl.java:1175)

                at org.hornetq.core.protocol.core.ServerSessionPacketHandler.handlePacket(ServerSessionPacketHandler.java:458)

                at org.hornetq.core.protocol.core.impl.ChannelImpl.handlePacket(ChannelImpl.java:508)

                at org.hornetq.core.protocol.core.impl.RemotingConnectionImpl.doBufferReceived(RemotingConnectionImpl.java:559)

                at org.hornetq.core.protocol.core.impl.RemotingConnectionImpl.bufferReceived(RemotingConnectionImpl.java:517)

                at org.hornetq.core.remoting.server.impl.RemotingServiceImpl$DelegatingBufferHandler.bufferReceived(RemotingServiceImpl.java:533)

                at org.hornetq.core.remoting.impl.netty.HornetQChannelHandler.messageReceived(HornetQChannelHandler.java:73)

                at org.jboss.netty.channel.SimpleChannelHandler.handleUpstream(SimpleChannelHandler.java:100)

                at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

                at org.jboss.netty.channel.StaticChannelPipeline$StaticChannelHandlerContext.sendUpstream(StaticChannelPipeline.java:534)

                at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:287)

                at org.hornetq.core.remoting.impl.netty.HornetQFrameDecoder2.messageReceived(HornetQFrameDecoder2.java:122)

                at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:80)

                at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

                at org.jboss.netty.channel.StaticChannelPipeline$StaticChannelHandlerContext.sendUpstream(StaticChannelPipeline.java:534)

                at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:302)

                at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:317)

                at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:299)

                at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:214)

                at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:80)

                at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:372)

                at org.jboss.netty.channel.StaticChannelPipeline.sendUpstream(StaticChannelPipeline.java:367)

                at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:274)

                at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:261)

                at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:349)

                at org.jboss.netty.channel.socket.nio.NioWorker.processSelectedKeys(NioWorker.java:280)

                at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:200)

                at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)

                at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:44)

                at org.jboss.netty.util.VirtualExecutorService$ChildExecutorRunnable.run(VirtualExecutorService.java:181)

                at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)

                at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)

                at java.lang.Thread.run(Thread.java:722)

       

       

      I am attaching the hornetq-configuration.xml and hornetq-jms.xml (I removed the topics and queues from hornetq-jms.xml for posting purposes).

       

      Has anyone seen this error before, and does anyone have an idea of what might be going on here? Any help or pointers on how to proceed or where to look would be highly appreciated.

       

      Thanks,

      Atul