
    JBoss Tomcat high number of threads and slower application response

    rohit.macherla

      Hi all,


      I am using a JBoss 4.2.3 server on HP-UX (4-core CPU, 16 GB RAM). Our application contains JAX-WS WebServices (developed using NetBeans 6.1), which invoke other vendor-based WebServices to get most of the work done. The vendor-based WebServices are also JAX-WS and are deployed as EAR files, with the WSDL at http://clarityhost:8080/ClarityServiceManagement-war/ServiceManagementAPIService?wsdl

      The trouble is that whenever a large number of requests flows in from the external system in a short span of time (e.g., 50 requests in 3 seconds), our server starts performing badly and cannot take on extra load. All the later requests pile up, the thread count increases to a very large number, and the threads just hang there (without being killed). We reported this to our vendor assuming there might be a table lock or something of that kind, but when we leave the application as it is without adding extra load, the number of busy threads comes down slowly (after about 2 hrs), which rules out a deadlock scenario. In most cases we had to restart our server so that requests were processed again.

      • I have collected many statistics (attached in this ticket), and here are some things that I do not understand:
        Most of the threads are blocked on: java.net.SocketOutputStream.socketWrite0
             If the application at the other end is not responding, how come writes are blocked?
      • The "Tomcat Status" document shows that the threads are waiting at:
             "CLARITYHOST GET /ClarityServiceManagement-war/ServiceManagementAPIService?wsdl HTTP/1.1"
             If the requests are made to invoke a webservice, why are they waiting on a GET method?
             I am under the impression that the status would be more appropriate had it been something like:
             "CLARITYHOST POST /ClarityServiceManagement-war/ServiceManagementAPIService HTTP/1.1"
             Additionally, I am also under the impression that when a request is POSTed to a WebService, the Tomcat status of the thread should not show the status element as: /ClarityServiceManagement-war/ServiceManagementAPIService?wsdl HTTP/1.1
             but rather: /ClarityServiceManagement-war/ServiceManagementAPIService HTTP/1.1 (without the '?wsdl' part).
             I now suspect that the threads are waiting to read the WSDL file and that the vendor service may actually be innocent; a sketch of how I'd test this follows below.
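
      To test that suspicion, constructing the client from a local copy of the WSDL should avoid the runtime GET on ?wsdl entirely. This is only a sketch: it assumes the generated service class exposes the usual (URL, QName) constructor, and the file path and the target namespace "http://SOcreation/" are my guesses, not taken from the vendor's WSDL:

        import java.net.URL;
        import javax.xml.namespace.QName;

        public class LocalWsdlClient
        {
            public static SOcreation.ServiceManagementAPIService createService() throws Exception
            {
                // Local copy of the vendor WSDL (path is illustrative), so no HTTP GET on ...?wsdl at runtime
                URL localWsdl = new URL("file:///cla/wsdl/ServiceManagementAPIService.wsdl");
                // Service QName; the namespace here is an assumption, check the actual WSDL
                QName serviceName = new QName("http://SOcreation/", "ServiceManagementAPIService");
                return new SOcreation.ServiceManagementAPIService(localWsdl, serviceName);
            }
        }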

      Sometimes we also notice the following errors during high load (apart from the general server performance degradation):

      • java.net.SocketException: Too many open files (errno:24)
      • javax.xml.ws.soap.SOAPFaultException: Exception in checkForNIBDone:java.io.FileNotFoundException: /cla/jboss-4.2.3.GA/server/default/deploy/jbossws.sar/META-INF/standard-jaxws-client-config.xml (Too many open files (errno:24))

      [ClarityServiceManagement]] Servlet.service() for servlet ClarityServiceManagement threw exception
      java.io.FileNotFoundException: /WP03CLTA_cla/jboss-4.2.3.GA/server/default/data/wsdl/ClarityServiceManagement.ear/ClarityServiceManagement-war.war/ServiceManagementAPIService21983.wsdl (Too many open files (errno:24))


        Does each invocation of a webservice require fetching the WSDL document and then parsing it?
        Why are so many files (all the WSDL files) opened and not yet closed?
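
        To see whether descriptors are really leaking as the load builds up, a quick check like the following can log the open-FD count (a sketch only; it assumes a Sun-style JVM that exposes com.sun.management.UnixOperatingSystemMXBean, which the HP-UX JVM may not):

        import java.lang.management.ManagementFactory;
        import java.lang.management.OperatingSystemMXBean;

        public class FdWatch
        {
            public static void main(String[] args)
            {
                OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
                // The Unix-specific subinterface (where available) exposes FD counts
                if (os instanceof com.sun.management.UnixOperatingSystemMXBean)
                {
                    com.sun.management.UnixOperatingSystemMXBean unix =
                            (com.sun.management.UnixOperatingSystemMXBean) os;
                    System.out.println("open FDs: " + unix.getOpenFileDescriptorCount()
                            + " / max: " + unix.getMaxFileDescriptorCount());
                }
            }
        }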


        One more point: the threads are not killed even after they have processed the request and sent the response. The currentThreads value always stays at the maximum number of simultaneous requests seen so far, even though those threads may have finished processing. I think I read somewhere that this is fixed in JBoss 5.0, but I can't remember where.

         

        Some threads are blocked on:

        java.util.Collections$SynchronizedMap.get(Collections.java:1979), even though the application code we wrote does not use any Maps.
        java.util.zip.ZipFile.getEntry; I couldn't make out where this comes from.

         

        I'd be delighted if someone sheds some light on these points.

        I have googled for the socketWrite error and found a lot of material, but it hasn't been much help.

          • 1. Re: JBoss Tomcat high number of threads and slower application response
            rohit.macherla

            Quoting Thomas Davis from JavaWorld.com (excerpt from http://www.javaworld.com/javaworld/jw-11-1999/jw-11-ejb.html?page=1):

             

            In the EJB implementations with which I have worked, all of the client-to-EJB communication within a single virtual machine instance takes place through a single socket connection. This is a bottleneck. When one client is talking to a bean, other clients (within the same virtual machine, sharing the single socket) are waiting for their turn. Waiting is bad. You want to avoid remote calls whenever possible.

             

            Our application has a similar thing going on: a single EJB provided by our vendor (exposed as a WebService), with many of our custom webservices using it. I wonder if there's any way out other than asking the vendor to provide local methods on that EJB.

             

             

            Cheers,

            Rohit M

            • 2. Re: JBoss Tomcat high number of threads and slower application response
              rohit.macherla

              Just found out that socket writes may also block when the TCP buffer is full (i.e., when the other endpoint isn't reading the data).
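
              A minimal, self-contained demo of that behaviour (my own sketch, nothing to do with JBoss itself): the writer parks inside socketWrite0 as soon as the peer stops reading and the send/receive buffers fill up.

              import java.io.OutputStream;
              import java.net.ServerSocket;
              import java.net.Socket;

              public class BlockedWriteDemo
              {
                  public static void main(String[] args) throws Exception
                  {
                      ServerSocket server = new ServerSocket(0);             // ephemeral port
                      Socket client = new Socket("localhost", server.getLocalPort());
                      Socket peer = server.accept();                         // accepted, but we never read from it

                      OutputStream out = client.getOutputStream();
                      byte[] chunk = new byte[8192];
                      long written = 0;
                      while (true)
                      {
                          out.write(chunk);   // eventually blocks inside socketWrite0
                          written += chunk.length;
                          System.out.println("written so far: " + written);
                      }
                  }
              }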

              I am not sure if I am following the right path..

              • 3. Re: JBoss Tomcat high number of threads and slower application response
                rohit.macherla

                I was analysing the thread dump and found this:

                Most of the BLOCKED threads are hung in stack traces originating from:

                 

                .
                .
                javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:211)
                javax.xml.bind.ContextFinder.find(ContextFinder.java:372)
                javax.xml.bind.JAXBContext.newInstance(JAXBContext.java:574)
                org.jboss.ws.core.jaxws.CustomizableJAXBContextFactory.createContext(CustomizableJAXBContextFactory.java:92)
                .
                .
                


                These threads are either executing the actual web-service method or are trying to perform a getPort() operation.

                My Java code for invoking the webservice is of this form:

                 

                public SomeResponse invoke(Payload payload)
                {
                  // Constructing the Service is where the WSDL gets fetched and parsed
                  SOcreation.ServiceManagementAPIService SO_service = new SOcreation.ServiceManagementAPIService();
                  // getPort() builds a fresh proxy (and, it seems, a fresh JAXBContext)
                  SOcreation.ServiceManagementAPI SO_port = SO_service.getServiceManagementAPIPort();
                  return SO_port.callWebService(payload); // Sample invocation; the method must return a value
                }
                

                Is this good practice? Or should the service and port objects be created at class level (making them global and/or static)?
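
                What I have in mind is something like the following (only a sketch, and an assumption on my part: I haven't verified whether the generated port proxies are safe to share across threads, so I keep the Service static but create a port per call):

                public class ServiceManagementClient
                {
                    // Built once: the WSDL is fetched and parsed a single time, at class load
                    private static final SOcreation.ServiceManagementAPIService SERVICE =
                            new SOcreation.ServiceManagementAPIService();

                    public SomeResponse invoke(Payload payload)
                    {
                        // One proxy per call, since proxy thread-safety is not guaranteed
                        SOcreation.ServiceManagementAPI port = SERVICE.getServiceManagementAPIPort();
                        return port.callWebService(payload);
                    }
                }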

                Is there any way to escape the getPort() cost? Why is the WSDL read out each and every time the WebService is called? Doesn't JBoss cache it? In "debug" log mode, I can see the entire WSDL being printed out (i.e., read from the file) and then parsed (XML parsing is costly).

                 

                (The synchronized Map.get() method that I was discussing earlier also arises out of the creation of the JAXB context.)
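
                If a JAXBContext really is being rebuilt on every call, caching it is the usual remedy, since JAXBContext instances are thread-safe once created. A sketch of that pattern (my own helper, not part of JBossWS, so it only applies where we create contexts in our own code; JBossWS builds its contexts internally in CustomizableJAXBContextFactory):

                import java.util.concurrent.ConcurrentHashMap;
                import java.util.concurrent.ConcurrentMap;
                import javax.xml.bind.JAXBContext;
                import javax.xml.bind.JAXBException;

                public final class JaxbContextCache
                {
                    // JAXBContext is expensive to build but thread-safe, so build once per type
                    private static final ConcurrentMap<Class<?>, JAXBContext> CACHE =
                            new ConcurrentHashMap<Class<?>, JAXBContext>();

                    public static JAXBContext forClass(Class<?> type) throws JAXBException
                    {
                        JAXBContext ctx = CACHE.get(type);
                        if (ctx == null)
                        {
                            ctx = JAXBContext.newInstance(type);
                            JAXBContext previous = CACHE.putIfAbsent(type, ctx);
                            if (previous != null)
                            {
                                ctx = previous; // another thread won the race; reuse its context
                            }
                        }
                        return ctx;
                    }
                }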

                 

                Any answers?

                 

                Cheers

                Rohit M