4 Replies Latest reply on Jan 28, 2013 6:46 PM by zekela104

    Unable to achieve message concurrency with hornetq server

    zekela104 Newbie

      Hi all:

       

      We have ported an existing production application from JBoss 4 to JBoss 7. Our application uses org.springframework.jms.listener.DefaultMessageListenerContainer to consume JMS queue messages.

       

      We have noticed in JBoss 7 that our JMS messages are being processed sequentially, rather than concurrently, even though we have set maxConcurrentConsumers to 8. We have also set sessionTransacted to both true and false and witnessed the same behavior. In JBoss 4, our messages are processed concurrently, as expected.
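      For reference, our listener container configuration looks roughly like this (bean ids, the JNDI references, and the listener class name are simplified placeholders, not our exact production config):

      ```xml
      <!-- Simplified sketch of our DefaultMessageListenerContainer wiring;
           bean ids and referenced beans are illustrative placeholders. -->
      <bean id="listenerContainer"
            class="org.springframework.jms.listener.DefaultMessageListenerContainer">
          <property name="connectionFactory" ref="connectionFactory"/>
          <property name="destination" ref="testQueue"/>
          <property name="messageListener" ref="exampleMessageListener"/>
          <property name="concurrentConsumers" value="8"/>
          <property name="maxConcurrentConsumers" value="8"/>
          <property name="sessionTransacted" value="true"/>
      </bean>
      ```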

       

      We have not changed our application. The only difference is that we are now using JBoss 7 and the hornetq server for JMS.

       

      We are using the default out-of-the-box standalone-full.xml HornetQ server configuration. We believe that the HornetQ server must support concurrent processing, though we don't know why we're only seeing sequential behavior.

       

      We have debugged the Spring JMS framework and have verified that messages are only received when the previous one has been processed.

       

      At this point we are out of ideas.

       

      Any help is greatly appreciated.

       

      Sincerely,

      Steve

        • 1. Re: Unable to achieve message concurrency with hornetq server
          Justin Bertram Master

          The default HornetQ configuration in JBoss AS7 should allow concurrent message processing.  Whether or not your session is transacted shouldn't really matter.  I process messages concurrently all the time using MDBs.  I can't vouch for Spring components since I don't have much experience with them.  Could you provide me with a test-case I could use to reproduce the problem simply and efficiently?

          • 2. Re: Unable to achieve message concurrency with hornetq server
            zekela104 Newbie

            Hi Justin,

             

            I have a simple test harness, which I ran on both JBoss 4/JBoss MQ and JBoss 7/HornetQ. The results are not the same. On JBoss 4, I get the expected result. On JBoss 7, I get the same behavior as I am seeing in my production application. Specifically, the listener threads wait to dispatch messages, even though messages are available in the queue. I am using the Spring DefaultMessageListenerContainer similar to how it is used in the examples located in the hornetq-2.2.14.Final\examples\jms\spring-integration directory.

             

            I used my simple test harness to first post 60 messages to a JMS test queue without a configured listener. I then ran the harness with the listener bean configured, which is set up to use 8 threads. The listener thread implementation simply sleeps for 10 seconds when it is dispatched by the Spring framework. I notice that after the first 8 messages are read and processed by the listener threads, only four more messages are read and processed. Specifically, listener thread 1 reads at time 0, takes 10 seconds to process the message, finishes at time 10, and is not dispatched again until time 20, even though there are pending messages on the queue.

             

            Here's the log4j output snippet from my simple test harness run.

             

            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-5) received This is message 5
            11:37:20,593 INFO  [ExampleMessageListener] (MessageListenerContainer-4) received This is message 3
            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-8) received This is message 2
            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-6) received This is message 4
            11:37:20,593 INFO  [ExampleMessageListener] (MessageListenerContainer-1) received This is message 7
            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-2) received This is message 6
            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-3) received This is message 1
            11:37:20,592 INFO  [ExampleMessageListener] (MessageListenerContainer-7) received This is message 0
            11:37:30,594 INFO  [ExampleMessageListener] (MessageListenerContainer-5)           processed This is message 5
            11:37:30,596 INFO  [ExampleMessageListener] (MessageListenerContainer-4)           processed This is message 3
            11:37:30,597 INFO  [ExampleMessageListener] (MessageListenerContainer-8)           processed This is message 2
            11:37:30,598 INFO  [ExampleMessageListener] (MessageListenerContainer-6)           processed This is message 4
            11:37:30,599 INFO  [ExampleMessageListener] (MessageListenerContainer-1)           processed This is message 7
            11:37:30,601 INFO  [ExampleMessageListener] (MessageListenerContainer-3)           processed This is message 1
            11:37:30,600 INFO  [ExampleMessageListener] (MessageListenerContainer-2)           processed This is message 6
            11:37:30,602 INFO  [ExampleMessageListener] (MessageListenerContainer-7)           processed This is message 0
            11:37:30,626 INFO  [ExampleMessageListener] (MessageListenerContainer-3) received This is message 12
            11:37:30,626 INFO  [ExampleMessageListener] (MessageListenerContainer-4) received This is message 20
            11:37:30,627 INFO  [ExampleMessageListener] (MessageListenerContainer-5) received This is message 57
            11:37:30,628 INFO  [ExampleMessageListener] (MessageListenerContainer-2) received This is message 35
            11:37:40,628 INFO  [ExampleMessageListener] (MessageListenerContainer-3)           processed This is message 12
            11:37:40,629 INFO  [ExampleMessageListener] (MessageListenerContainer-4)           processed This is message 20
            11:37:40,631 INFO  [ExampleMessageListener] (MessageListenerContainer-2)           processed This is message 35
            11:37:40,630 INFO  [ExampleMessageListener] (MessageListenerContainer-5)           processed This is message 57
            11:37:40,634 INFO  [ExampleMessageListener] (MessageListenerContainer-8) received This is message 44
            11:37:40,634 INFO  [ExampleMessageListener] (MessageListenerContainer-7) received This is message 28
            11:37:40,635 INFO  [ExampleMessageListener] (MessageListenerContainer-1) received This is message 25
            11:37:40,635 INFO  [ExampleMessageListener] (MessageListenerContainer-6) received This is message 9
            11:37:40,640 INFO  [ExampleMessageListener] (MessageListenerContainer-3) received This is message 55

             

            The simple test harness consists of five Java classes (including a servlet to populate the test queue), the Spring XML configuration file, and a web.xml file. I can make these available to you. Please advise on the best way for me to do this.

             

            I have spent several hours poring over this issue. I am leaving open the possibility that I have mischaracterized and misdiagnosed this issue; however, the best understanding I have at this time leads me to conclude that the message receipt latency is introduced by HornetQ.

             

            Thank you and sincerely,

            Steve

            • 3. Re: Unable to achieve message concurrency with hornetq server
              Justin Bertram Master

              It's possible this is caused by buffering on the client.  Try setting the consumer-window-size to 0.  If that doesn't help then attach a reproducible test-case directly to the thread here (you need to click the "Use advanced editor" link).
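               In the messaging subsystem of standalone-full.xml that would look something like the fragment below (shown here on the default RemoteConnectionFactory as an example; set it on whichever connection factory your application actually looks up):

               ```xml
               <!-- consumer-window-size of 0 disables client-side message buffering,
                    so consumers only pull a message when they are ready to process one. -->
               <connection-factory name="RemoteConnectionFactory">
                   <connectors>
                       <connector-ref connector-name="netty"/>
                   </connectors>
                   <entries>
                       <entry name="java:jboss/exported/jms/RemoteConnectionFactory"/>
                   </entries>
                   <consumer-window-size>0</consumer-window-size>
               </connection-factory>
               ```

               Note this trades throughput for fairer distribution: with buffering disabled, every message requires a round trip to the server.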

              • 4. Re: Unable to achieve message concurrency with hornetq server
                zekela104 Newbie

                Justin,

                 

                That did the trick: setting the consumer window size to zero. Thanks for all of your help on this one. I was able to get the results I desired in both the simple test harness and in the application.

                 

                Sincerely,

                Steve