I am using a JBoss 4.2.3 server on HP-UX (4-core CPU, 16 GB RAM). Our application contains JAX-WS web services (developed using NetBeans 6.1) which invoke vendor-provided web services to get most of the work done. The vendor services are also JAX-WS, packaged as EAR files, with the WSDL at http://clarityhost:8080/ClarityServiceManagement-war/ServiceManagementAPIService?wsdl
The trouble is that whenever a large number of requests flows in from the external system in a short span of time (e.g. 50 requests in 3 seconds), our server starts performing badly and cannot take on any extra load. Later requests pile up, the thread count climbs very high, and the threads just hang there (without being released). We reported this to our vendor, assuming there might be a table lock or something of that kind, but when we leave the application alone without adding load, the number of busy threads slowly comes down (after about 2 hours), which rules out a deadlock scenario. In most cases we had to restart our server before requests were processed again.
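As a stopgap, I am planning to put client-side timeouts on the vendor calls so a stalled call fails quickly instead of holding a worker thread indefinitely. A minimal sketch; the org.jboss.ws.timeout property is what I believe the JBossWS Native stack in 4.2.x honours, and the com.sun.xml.ws.* properties are the Metro equivalents, included in case that stack is in use:

    import java.util.Map;
    import javax.xml.ws.BindingProvider;

    public class VendorCallTimeouts {

        // Apply client-side timeouts (milliseconds) to any generated JAX-WS
        // port proxy so a stalled vendor call fails the request instead of
        // blocking the worker thread indefinitely.
        public static void apply(Object port) {
            Map<String, Object> ctx = ((BindingProvider) port).getRequestContext();

            // Property I believe the JBossWS Native stack in JBoss 4.2.x reads:
            ctx.put("org.jboss.ws.timeout", Integer.valueOf(30000));

            // Metro / JAX-WS RI equivalents, if that stack is on the classpath:
            ctx.put("com.sun.xml.ws.connect.timeout", Integer.valueOf(10000));
            ctx.put("com.sun.xml.ws.request.timeout", Integer.valueOf(30000));
        }
    }

If anyone knows the authoritative timeout property names for this JBossWS version, please correct me.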
- I have collected many statistics (attached to this ticket), and here are some things that I do not understand:
Most of the threads are blocked in java.net.SocketOutputStream.socketWrite0.
If the application at the other end is not responding, how come writes are blocked?
- The "Tomcat Status" document shows that the threads are waiting at :
"CLARITYHOST GET /ClarityServiceManagement-war/ServiceManagementAPIService?wsdl HTTP/1.1"
If the requests are made to invoke a web service, why are they waiting on a GET?
I would have expected the status to be something more like:
"CLARITYHOST POST /ClarityServiceManagement-war/ServiceManagementAPIService HTTP/1.1"
Additionally, when a request is POSTed to a web service, I would expect the Tomcat status for the thread to show
/ClarityServiceManagement-war/ServiceManagementAPIService HTTP/1.1
rather than
/ClarityServiceManagement-war/ServiceManagementAPIService?wsdl HTTP/1.1
(i.e. without the '?wsdl' part).
I now suspect that the threads are waiting to read the WSDL file, and that the vendor service may actually be innocent.
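For illustration, this is the kind of per-request client construction that would explain what we see; the class name, namespace, and structure here are made up for the example, not our actual NetBeans-generated code:

    import java.net.URL;
    import javax.xml.namespace.QName;
    import javax.xml.ws.Service;

    public class PerRequestClient {

        // Constructing the Service inside the request path makes every
        // invocation begin with an HTTP GET of the ?wsdl document -- exactly
        // what the Tomcat status page shows the threads doing.
        public static Service newService() throws Exception {
            URL wsdl = new URL("http://clarityhost:8080/"
                    + "ClarityServiceManagement-war/ServiceManagementAPIService?wsdl");
            // The namespace below is illustrative; the real values live in the
            // generated service class.
            QName serviceName = new QName("http://example.com/clarity",
                    "ServiceManagementAPIService");
            return Service.create(wsdl, serviceName); // WSDL fetched and parsed here
        }
    }

If our generated code does something equivalent under the hood on every call, that would also explain the WSDL-related errors below.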
Sometimes we also notice the following errors during high load (on top of the general performance degradation):
- java.net.SocketException: Too many open files (errno:24)
- javax.xml.ws.soap.SOAPFaultException: Exception in checkForNIBDone:java.io.FileNotFoundException: /cla/jboss-4.2.3.GA/server/default/deploy/jbossws.sar/META-INF/standard-jaxws-client-config.xml (Too many open files (errno:24))
- [ClarityServiceManagement]] Servlet.service() for servlet ClarityServiceManagement threw exception:
  java.io.FileNotFoundException: /WP03CLTA_cla/jboss-4.2.3.GA/server/default/data/wsdl/ClarityServiceManagement.ear/ClarityServiceManagement-war.war/ServiceManagementAPIService21983.wsdl (Too many open files (errno:24))
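To confirm that descriptors really do accumulate during a burst, I plan to log the JVM's open-descriptor count alongside the request rate. A sketch, assuming a Sun-derived JVM that exposes com.sun.management.UnixOperatingSystemMXBean (I have not yet verified that the HP-UX JVM does):

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class FdMonitor {

        // Logs how many file descriptors the JVM currently holds open, so the
        // count can be correlated with bursts of incoming requests.
        public static void logOpenFds() {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            if (os instanceof UnixOperatingSystemMXBean) {
                UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
                System.out.println("open fds: " + unix.getOpenFileDescriptorCount()
                        + " / max: " + unix.getMaxFileDescriptorCount());
            }
        }
    }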
Does each invocation of a web service require fetching the WSDL document and then parsing it?
Why are so many files (all the WSDL files) opened and never closed?
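If the answer is yes, the mitigation I have in mind is to build the Service once per JVM and reuse it, so the WSDL is fetched and parsed a single time. A sketch under that assumption; the QName values are illustrative, and I still need to confirm how safely the resulting ports can be shared across threads:

    import java.net.URL;
    import javax.xml.namespace.QName;
    import javax.xml.ws.Service;

    public class CachedClarityService {

        private static Service service;

        // Build the Service once so the ?wsdl GET and the parse happen a
        // single time per JVM instead of once per incoming request.
        public static synchronized Service get() throws Exception {
            if (service == null) {
                URL wsdl = new URL("http://clarityhost:8080/"
                        + "ClarityServiceManagement-war/ServiceManagementAPIService?wsdl");
                service = Service.create(wsdl, new QName(
                        "http://example.com/clarity", "ServiceManagementAPIService"));
            }
            return service;
        }
    }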
One more point: the threads are not killed even after they have processed the request and sent the response. The currentThreads value always stays at the maximum number of simultaneous requests seen, even after those threads have finished processing. I think I read somewhere that this is fixed in JBoss 5.0, but I can't remember where.
Some threads are blocked in:
- java.util.Collections$SynchronizedMap.get(Collections.java:1979), even though the application code we wrote does not use any Maps.
- java.util.zip.ZipFile.getEntry; I couldn't make out where this comes from.
I'd be delighted if someone could shed some light on these points. I have googled the socketWrite error and found a lot of material, but little of it is helpful.