-
1. Re: Monitoring Thread CPU usage
bill.burke Mar 5, 2004 1:31 PM (in response to atait)Run it within a profiler like OptimizeIt. OptimizeIt shows CPU usage per thread.
-
2. Re: Monitoring Thread CPU usage
atait Mar 5, 2004 1:55 PM (in response to atait)Thanks Bill. This is a production system, mission critical to the business. Also, this occurs infrequently, and we have not been able to duplicate it in a test environment. Will a profiler affect performance? That's my concern with this, though I have no experience with profilers. If not, I'll give it a try. I also just saw a tool called jmemprof in another thread. Is this a reasonable option, again given the performance concerns?
Thanks -
3. Re: Monitoring Thread CPU usage
bill.burke Mar 5, 2004 2:19 PM (in response to atait)Yes, a profiler will affect performance. OptimizeIt is pretty decent, though. It hobbled my benchmark by about 25-35%, but I was able to find bottlenecks.
Ya know, it could be the garbage collector running. -
4. Re: Monitoring Thread CPU usage
roshan Mar 5, 2004 2:24 PM (in response to atait)You may have done this already - Take a thread dump of the parent process. This may help identify rogue threads.
Thanks,
RS -
5. Re: Monitoring Thread CPU usage
atait Mar 5, 2004 3:26 PM (in response to atait)Thanks, a good suggestion.
I have a thread dump taken using kill -QUIT. Is there any information on what I should be seeing for JBoss? I have looked through this but don't have experience interpreting the data. Any suggestions for reading etc. to educate myself? Other tips? -
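For context, a common way to tie a hot thread to a dump entry on Linux or Solaris is to find the busy lightweight process (LWP) id and convert it to hex, since the nid= values in a thread dump are hexadecimal. A rough sketch, with placeholder PID and LWP values (the exact ps/prstat flags vary by platform):

```shell
# Placeholder values: substitute your JBoss java PID and the hot thread id.
JBOSS_PID=1234
HOT_LWP=119

# Find the busiest lightweight process (thread) first, e.g.:
#   Linux:   ps -eLo pid,lwp,pcpu | sort -k3 -rn | head
#   Solaris: prstat -L -p "$JBOSS_PID"

# Trigger a thread dump (it goes to the JVM's console/stdout log):
#   kill -QUIT "$JBOSS_PID"

# The nid= values in the dump are hex, so convert the LWP id to match:
printf 'nid=0x%x\n' "$HOT_LWP"
# e.g. LWP 119 prints nid=0x77
```

Matching the printed nid against the dump identifies which Java thread owns the busy LWP.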
6. Re: Monitoring Thread CPU usage
atait Mar 8, 2004 10:08 AM (in response to atait)Need some additional help. I have combed through thread dumps from a period when JBoss was using 80% of the CPU, and some taken after I restarted, both right after and a few days later. I found one peculiar thread. Before describing it further: there's an issue one user is having that seems to happen around the same time as this CPU issue, and I'm suspicious that it may be causing this somehow. On occasion, when this user executes a step that uploads a file via HTTP, the server-side upload process seems to hang on the last chunk of the file. This is not a chunked interface; the entire file is uploaded at one time, but on the server the data (file) is read in chunks. The process seems to hang on the last chunk of bytes to be read. I suspect the HTTP content is corrupted somehow, either in transit or by processing on the client. First, I hope to determine whether this could be our problem.

Next, the thread dump: the following output was found in all three thread dumps that I grabbed while the system was slow, a few hours apart. It does not appear in any of the dumps I have grabbed in the days since restarting JBoss.
Any feedback is appreciated
Allen
"TP-Processor25" daemon prio=5 tid=0xc19998 nid=0x77 runnable [dc77f000..dc781994]
at org.apache.log4j.Category.isDebugEnabled(Category.java:742)
at org.apache.commons.logging.impl.Log4JLogger.isDebugEnabled(Log4JLogger.java:224)
at org.apache.jk.server.JkCoyoteHandler.doRead(JkCoyoteHandler.java:273)
at org.apache.coyote.Request.doRead(Request.java:457)
at org.apache.coyote.tomcat4.CoyoteInputStream.readBytes(CoyoteInputStream.java:197)
at org.apache.coyote.tomcat4.CoyoteInputStream.read(CoyoteInputStream.java:154)
at com.idexx.servlet.IdexxHttpUploadServlet.readBytesIntoBuffer(IdexxHttpUploadServlet.java:85)
at com.idexx.econsult.servlet.ECGUploadServlet.doPost(ECGUploadServlet.java:80)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:247)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:193)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:256)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
at org.jboss.web.tomcat.security.JBossSecurityMgrRealm.invoke(JBossSecurityMgrRealm.java:220)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.apache.catalina.valves.CertificatesValve.invoke(CertificatesValve.java:246)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.jboss.web.tomcat.tc4.statistics.ContainerStatsValve.invoke(ContainerStatsValve.java:76)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
at org.apache.catalina.core.StandardContext.invoke(StandardContext.java:2417)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:171)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:172)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.jboss.web.tomcat.security.SecurityAssociationValve.invoke(SecurityAssociationValve.java:65)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:577)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:641)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
at org.apache.catalina.core.StandardPipeline$StandardPipelineValveContext.invokeNext(StandardPipeline.java:643)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:480)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:995)
at org.apache.coyote.tomcat4.CoyoteAdapter.service(CoyoteAdapter.java:197)
at org.apache.jk.server.JkCoyoteHandler.invoke(JkCoyoteHandler.java:309)
at org.apache.jk.common.HandlerRequest.invoke(HandlerRequest.java:387)
at org.apache.jk.common.ChannelSocket.invoke(ChannelSocket.java:673)
at org.apache.jk.common.ChannelSocket.processConnection(ChannelSocket.java:615)
at org.apache.jk.common.SocketConnection.runIt(ChannelSocket.java:786)
at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:677)
at java.lang.Thread.run(Thread.java:536) -
7. Re: Monitoring Thread CPU usage
camel Mar 8, 2004 7:27 PM (in response to atait)Just a guess, but -- it looks like the Coyote connector may be doing a lot of logging. I've seen it log pretty verbosely to the console before (like a full hex dump of every byte POSTed!). Perhaps it's logging every byte of your uploaded file? Check for big log files and try setting your logging categories to INFO or higher.
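One way to raise those categories in JBoss's conf/log4j.xml might look like the following; the category names here are the usual Coyote/JK loggers and may differ in your install, so adjust them to whatever appears in your server.log:

```xml
<!-- Sketch: raise the noisy connector categories to INFO so
     per-byte DEBUG logging of uploads is skipped. -->
<category name="org.apache.jk">
  <priority value="INFO"/>
</category>
<category name="org.apache.coyote">
  <priority value="INFO"/>
</category>
```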
-
8. Re: Monitoring Thread CPU usage
atait Mar 10, 2004 10:06 AM (in response to atait)An update on this: first, thanks for the feedback to date. I have associated a specific incident with the increase in CPU usage by JBoss. Some of our HTTP requests contain a file that is read into memory and then saved on the server's file system. In some cases it seems this file data is corrupt, or its size doesn't match the content length, or something similar, and the InputStream.read() call hangs and never returns. The application timeout eventually kills the application thread, but some low-level processing continues to run, eating up resources. At this point, a restart is the only solution we have found to remove the orphaned processing. I am considering some sort of timeout mechanism around the read(): for example, if the timeout fires, I'll try InputStream.close().
Any light that can be shed on this is appreciated.
Allen -
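The timeout-then-close idea described above can be sketched as follows. Class and method names are illustrative, not from the actual servlet; note also that closing a stream from another thread unblocks socket reads on most platforms but is not guaranteed to unblock every InputStream implementation:

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Timer;
import java.util.TimerTask;

// Sketch: if a read() blocks for too long, close the stream from a
// watchdog thread so the blocked read() fails with an IOException
// instead of hanging forever.
public class ReadWatchdog {
    public static int readWithTimeout(final InputStream in, byte[] buf, long timeoutMs)
            throws IOException {
        Timer timer = new Timer(true); // daemon watchdog thread
        TimerTask closer = new TimerTask() {
            public void run() {
                try {
                    in.close(); // unblocks most (not all) blocked read()s
                } catch (IOException ignored) {
                }
            }
        };
        timer.schedule(closer, timeoutMs);
        try {
            return in.read(buf); // may throw IOException if closed mid-read
        } finally {
            closer.cancel();
            timer.cancel();
        }
    }
}
```

The daemon Timer means the watchdog never keeps the JVM alive on its own, and cancelling the task in the finally block avoids closing a stream whose read already completed.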
9. Re: Monitoring Thread CPU usage
bill.burke Mar 10, 2004 10:16 AM (in response to atait)still sounds like an application problem....
You might want to try processing this file within a transaction. The JBoss transaction manager will interrupt the tx's thread on a tx timeout, which will interrupt any blocking I/O.
Or you could just implement this timeout/interruption code in a separate service. -
10. Re: Monitoring Thread CPU usage
bill.burke Mar 10, 2004 10:19 AM (in response to atait)This InputStream.read() is being done by Tomcat? Then strike my last suggestion....
FYI, a blocked InputStream.read() does not eat CPU. You really should run your application through OptimizeIt. Thread dumps just ain't gonna do it. -
11. Re: Monitoring Thread CPU usage
atait Mar 22, 2004 1:39 PM (in response to atait)These topics often don't have an outcome, so I wanted to take the time to briefly describe my findings. Until recently we were on Tomcat 3.2.1. Apparently there was a bug in the read() method regarding its return value, and there was code in our application to deal specifically with that bug. The problem was, when we upgraded to JBoss 3.2.3, the Tomcat bug was fixed. That legacy workaround then resulted in an infinite loop under certain circumstances; hence the CPU climbing over time as this path of execution was hit.
Thanks again Bill for your feedback. -
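The failure mode described above, where a read() workaround turns into an infinite loop once the container's semantics change, comes down to loop-termination logic. A minimal sketch of a read loop that treats only -1 as end-of-stream (class and method names are hypothetical, not the actual application code):

```java
import java.io.IOException;
import java.io.InputStream;

// A correct read loop must treat read() == -1 as end-of-stream and
// nothing else as a termination signal; special-casing any other
// return value can spin forever once the stream's behavior changes.
public class ReadLoop {
    public static int readFully(InputStream in, byte[] buf) throws IOException {
        int total = 0;
        while (total < buf.length) {
            int n = in.read(buf, total, buf.length - total);
            if (n == -1) {   // end of stream: stop, never loop again
                break;
            }
            total += n;      // n == 0 is legal and must not be treated as EOF
        }
        return total;
    }
}
```

Because the loop makes progress only via the returned byte count and exits solely on -1, it stays correct across containers whose read() implementations differ in when they return short or zero-length reads.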
12. Re: Monitoring Thread CPU usage
brucebgordon Apr 8, 2004 9:23 AM (in response to atait)Would a test case have revealed the problem during development?