HornetQ hangs during benchmark tests

Version 1

    Hello, I was trying to reproduce production behavior with HornetQ on my machine.

    I sent 1,000,000 messages to a consumer. Sending worked fine, but after the messages

    had been processed for about 12 hours, HornetQ stopped responding and is now in a very strange state.

    It seems it is not dead, but not alive either. I'm using HornetQ standalone.

    I'm also using a getMessageByCorrelationId mechanism; the queue it reads from is named IN_PROGRESS. All of these messages are stored in a very deep queue whose address-full policy is BLOCK.

    The client becomes slow once this queue contains more than 200,000 messages.
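    To clarify what I mean by the getMessageByCorrelationId mechanism: it is essentially a JMS consumer created with a message selector on JMSCorrelationID. A minimal sketch (class and method names here are illustrative, not our actual code):

    ```java
    // Hypothetical sketch of a correlation-ID lookup against the IN_PROGRESS queue.
    // Only the selector-string construction runs standalone; the receive itself
    // needs a live JMS session and is shown in comments.
    public class CorrelationIdLookup {

        // Builds the JMS message selector used to fetch one reply by correlation ID.
        // JMS selector syntax quotes string literals with single quotes.
        static String selectorFor(String correlationId) {
            return "JMSCorrelationID = '" + correlationId + "'";
        }

        public static void main(String[] args) {
            String selector = selectorFor("9d12f370-d3d0-4dec-acfd-60b66803cac1");
            System.out.println(selector);
            // With a live session the lookup would then be roughly:
            //   MessageConsumer c = session.createConsumer(inProgressQueue, selector);
            //   Message reply = c.receive(timeoutMillis);
            //   c.close();
        }
    }
    ```

    Since the selector has to be evaluated against messages sitting in the queue, lookup cost grows with queue depth, which may be related to the slowdown we see past 200,000 messages.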

    Thank you in advance for your advice.

     

    Here is the HornetQ configuration:

          <address-setting match="jms.queue.*IN_PROGRESS"> <!-- the in-progress queue -->

             <dead-letter-address>jms.queue.DLQ</dead-letter-address>

             <expiry-address>jms.queue.ExpiryQueue</expiry-address>

             <redelivery-delay>5000</redelivery-delay>

             <max-size-bytes>5048576000</max-size-bytes>

      <!--<page-size-bytes>10485760</page-size-bytes> -->    

             <message-counter-history-day-limit>10</message-counter-history-day-limit>

      

             <address-full-policy>BLOCK</address-full-policy>

             <max-delivery-attempts>-1</max-delivery-attempts>

          </address-setting>

     

          <!--default for catch all-->

          <address-setting match="#">

             <dead-letter-address>jms.queue.DLQ</dead-letter-address>

             <expiry-address>jms.queue.ExpiryQueue</expiry-address>

             <redelivery-delay>5000</redelivery-delay>

             <max-size-bytes>50485760</max-size-bytes>

      <page-size-bytes>10485760</page-size-bytes>     

             <message-counter-history-day-limit>10</message-counter-history-day-limit>

      

             <address-full-policy>PAGE</address-full-policy>

             <max-delivery-attempts>-1</max-delivery-attempts>

          </address-setting>

     

     

     

    Here is the last HornetQ log record:

    10:19:40,060 WARN  [org.hornetq.core.client] HQ212037: Connection failure has been detected: HQ119014: Did not receive data from null. It is likely the client has exited or crashed without closing its connection, or the network between the server and client has failed. You also might have configured connection-ttl and client-failure-check-period incorrectly. Please check user manual for more information. The connection will now be closed. [code=CONNECTION_TIMEDOUT]
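    Since the warning mentions connection-ttl and client-failure-check-period, for reference these are set on the connection factory in hornetq-jms.xml (standalone HornetQ). A sketch with illustrative values only, not our actual settings:

    ```xml
    <!-- hornetq-jms.xml: values below are illustrative -->
    <connection-factory name="NettyConnectionFactory">
       <connectors>
          <connector-ref connector-name="netty"/>
       </connectors>
       <entries>
          <entry name="/ConnectionFactory"/>
       </entries>
       <!-- how long the server keeps a connection without receiving data (-1 = never time out) -->
       <connection-ttl>60000</connection-ttl>
       <!-- how often the client pings/checks the connection for failure -->
       <client-failure-check-period>30000</client-failure-check-period>
    </connection-factory>
    ```

    We have not changed these from the defaults, so I am not sure whether the timeout is a cause or just a symptom of the hang.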

     

    And this is from the client's Tomcat log:

    [47298652] [Thread-60 (HornetQ-client-global-threads-122852646)] 07:42:17,142 ERROR CoreProcessor.traceMessage - CorrID=9d12f370-d3d0-4dec-acfd-60b66803cac1 , com.ipr.crystal.service.processors.PreProcessor error, message [[Payload String content=<IPR1><IPR>829181</IPR></IPR1>][Headers={JMSCorrelationID=9d12f370-d3d0-4dec-acfd-60b66803cac1, CRY_DELIVERY=PERSISTENT, CRY_SYSTEM_NAME=TEST_SYSTEM, CRY_MSG_TYPE=MESSAGE, CRY_IGNORE_INP=false, _HQ_DUPL_ID=8c067ae5-9d94-1ae1-e613-751d84b21d00, CRY_QUERY_NAME=QUERY, JMSXDeliveryCount=1, CRY_ROUTING_ID=37, CRY_REPLY_TO=Crystal!TEST_SERVICE_REPLY_TO, CRY_QUERY_VERSION=1.0, id=9ae3815a-7b95-a961-668c-fb42fbb817e6, timestamp=1423114937142}]]

    org.springframework.messaging.MessageDeliveryException: failed to send Message to channel 'null'

      at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:263)

      at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:223)

      at com.ipr.crystal.service.processors.PreProcessor.processMessage(PreProcessor.java:74)

      at com.ipr.crystal.config.JMSMessageDelegator.onMessage(JMSMessageDelegator.java:35)

      at org.springframework.jms.listener.AbstractMessageListenerContainer.doInvokeListener(AbstractMessageListenerContainer.java:562)

      at org.springframework.jms.listener.AbstractMessageListenerContainer.invokeListener(AbstractMessageListenerContainer.java:500)

      at org.springframework.jms.listener.AbstractMessageListenerContainer.doExecuteListener(AbstractMessageListenerContainer.java:468)

      at org.springframework.jms.listener.AbstractMessageListenerContainer.executeListener(AbstractMessageListenerContainer.java:440)

      at org.springframework.jms.listener.SimpleMessageListenerContainer.processMessage(SimpleMessageListenerContainer.java:344)

      at org.springframework.jms.listener.SimpleMessageListenerContainer$2.onMessage(SimpleMessageListenerContainer.java:320)

      at org.hornetq.jms.client.JMSMessageListenerWrapper.onMessage(JMSMessageListenerWrapper.java:103)

      at org.hornetq.core.client.impl.ClientConsumerImpl.callOnMessage(ClientConsumerImpl.java:1116)

      at org.hornetq.core.client.impl.ClientConsumerImpl.access$500(ClientConsumerImpl.java:56)

      at org.hornetq.core.client.impl.ClientConsumerImpl$Runner.run(ClientConsumerImpl.java:1251)

      at org.hornetq.utils.OrderedExecutorFactory$OrderedExecutor$1.run(OrderedExecutorFactory.java:104)

      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)

      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)

      at java.lang.Thread.run(Thread.java:745)

    Caused by: org.springframework.jms.UncategorizedJmsException: Uncategorized exception occured during JMS processing; nested exception is javax.jms.JMSException: HQ119014: Timed out waiting for response when sending packet 45

      at org.springframework.jms.support.JmsUtils.convertJmsAccessException(JmsUtils.java:316)

      at org.springframework.jms.support.JmsAccessor.convertJmsAccessException(JmsAccessor.java:169)

      at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:494)

      at org.springframework.jms.core.JmsTemplate.send(JmsTemplate.java:577)

      at com.ipr.crystal.config.JMSMessageSender.handleMessage(JMSMessageSender.java:45)

      at org.springframework.integration.dispatcher.AbstractDispatcher.tryOptimizedDispatch(AbstractDispatcher.java:116)

      at org.springframework.integration.dispatcher.UnicastingDispatcher.doDispatch(UnicastingDispatcher.java:101)

      at org.springframework.integration.dispatcher.UnicastingDispatcher.dispatch(UnicastingDispatcher.java:97)

      at org.springframework.integration.channel.AbstractSubscribableChannel.doSend(AbstractSubscribableChannel.java:77)

      at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:255)

      ... 17 more

    Caused by: javax.jms.JMSException: HQ119014: Timed out waiting for response when sending packet 45

      at org.hornetq.core.protocol.core.impl.ChannelImpl.sendBlocking(ChannelImpl.java:390)

      at org.hornetq.core.client.impl.ClientSessionImpl.queueQuery(ClientSessionImpl.java:427)

      at org.hornetq.core.client.impl.DelegatingSession.queueQuery(DelegatingSession.java:464)

      at org.hornetq.jms.client.HornetQSession.lookupQueue(HornetQSession.java:1238)

      at org.hornetq.jms.client.HornetQSession.createQueue(HornetQSession.java:388)

      at org.springframework.jms.support.destination.DynamicDestinationResolver.resolveQueue(DynamicDestinationResolver.java:102)

      at org.springframework.jms.support.destination.DynamicDestinationResolver.resolveDestinationName(DynamicDestinationResolver.java:67)

      at org.springframework.jms.support.destination.JmsDestinationAccessor.resolveDestinationName(JmsDestinationAccessor.java:100)

      at org.springframework.jms.core.JmsTemplate.access$200(JmsTemplate.java:88)

      at org.springframework.jms.core.JmsTemplate$4.doInJms(JmsTemplate.java:580)

      at org.springframework.jms.core.JmsTemplate.execute(JmsTemplate.java:491)

      ... 24 more

    Caused by: HornetQConnectionTimedOutException[errorType=CONNECTION_TIMEDOUT message=HQ119014: Timed out waiting for response when sending packet 45]

      ... 35 more

    [47298699] [Thread-60 (HornetQ-client-global-threads-122852646)] 07:42:17,189 TRACE PreProcessor.processMessage - CorrID=9d12f370-d3d0-4dec-acfd-60b66803cac1, Pre CoreProcessor execution elapsed time : [56513]