We are developing an application using Sun j2sdk1.4.2_06,
JBoss 3.2.6 and Struts 1.2.4.
While deploying the application we get the following exception.
00:17:13,945 ERROR [MainDeployer] could not start deployment: file:/opt/java/apps-server/jboss-3.2.6/server/default/deploy/DietWeb.ear
org.jboss.deployment.DeploymentException: Error during deploy; - nested throwable: (ReflectionException: Cannot find setter method setDocBase null
Cause: java.lang.NoSuchMethodException: org.apache.commons.modeler.BaseModelMBean.setDocBase(java.lang.String))
at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
at $Proxy18.start(Unknown Source)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at $Proxy35.start(Unknown Source)
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at $Proxy8.deploy(Unknown Source)
Caused by: ReflectionException: Cannot find setter method setDocBase null
Cause: java.lang.NoSuchMethodException: org.apache.commons.modeler.BaseModelMBean.setDocBase(java.lang.String)
... 48 more
Caused by: java.lang.NoSuchMethodException: org.apache.commons.modeler.BaseModelMBean.setDocBase(java.lang.String)
... 53 more
No endorsed directory is present in our environment.
Could someone help us?
If you are using the embedded Tomcat 4.1.x with JBoss, edit the server/instance/deploy/jbossweb-tomcat.sar/META-INF/jboss-service.xml instead (where instance is the JBoss instance you are running - usually default). The standard listener looks something like this:
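(Reconstructed from memory of the Tomcat 4.1 CoyoteConnector configuration - the class name, attribute names, and values here are assumptions, so check them against your own jboss-service.xml rather than pasting this in.)

```xml
<!-- Sketch of the HTTP connector element inside the embedded Tomcat
     service configuration. maxProcessors is the worker-thread ceiling;
     acceptCount is the backlog queue for connections waiting for a
     free processor; enableLookups="false" skips the DNS reverse
     lookup on each request. -->
<Connector className="org.apache.coyote.tomcat4.CoyoteConnector"
           address="${jboss.bind.address}" port="8080"
           minProcessors="5" maxProcessors="500"
           enableLookups="false" acceptCount="100"
           connectionTimeout="60000"/>
```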
Read more about listeners in the Tomcat documentation at http://jakarta.apache.org/tomcat. If you are truly going to have 500 people simultaneously accessing web pages, you would set maxProcessors to "500" at least. Remember to also increase the heap size for your JVM.
e.g. JAVA_OPTS="-Xmx128m $JAVA_OPTS" on Unix (or set JAVA_OPTS=-Xmx128m %JAVA_OPTS% on Windows), or whatever is appropriate.
Thanks for your reply. I've just applied your suggestion, but I'm not seeing any improvement.
I developed a Java program to start up 600 threads and do a simple HTTP connect to an HTML page.
Only 266 of the 600 threads connect successfully, and if I run it again, only 6 are accepted, for example. How can I tell how many connections JBoss is handling at any one time? The third time I ran it, the following Java exception was raised:
java.net.ConnectException: Connection refused
What's happening here?
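For reference, a hypothetical sketch of the kind of load tester described above - the target URL and thread count are assumptions, and each worker simply records whether it got an HTTP 200:

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical load tester: N threads each open one HTTP connection
// and count how many got a 200 response. Written 1.4-style (no
// java.util.concurrent) to match the J2SDK 1.4.2 environment.
public class ConnectTest {
    static int ok = 0; // successful responses, guarded by the class lock

    public static int run(String target, int threads) throws Exception {
        ok = 0;
        Thread[] workers = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            final String u = target;
            workers[i] = new Thread() {
                public void run() {
                    try {
                        HttpURLConnection c =
                            (HttpURLConnection) new URL(u).openConnection();
                        if (c.getResponseCode() == 200) {
                            synchronized (ConnectTest.class) { ok++; }
                        }
                        c.disconnect();
                    } catch (Exception e) {
                        // "Connection refused" lands here once the server's
                        // accept queue / thread pool is exhausted
                    }
                }
            };
            workers[i].start();
        }
        for (int i = 0; i < threads; i++) workers[i].join();
        return ok;
    }

    public static void main(String[] args) throws Exception {
        // URL and count are placeholders for your own setup
        int n = run("http://localhost:8080/index.html", 600);
        System.out.println(n + " of 600 threads got a 200 response");
    }
}
```

Comparing the returned count against the thread count shows how many connections the server actually served under that burst.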
---------- jboss-service.xml under the jbossweb-tomcat.sar/META-INF directory -------------
I tried increasing the connection timeout to 10 minutes and still no luck. I tried increasing the heap size to 500 MB but saw no improvement. How do I determine the optimal heap size to assign? Is there any documentation on this?
Thanks for your support; I look forward to a reply.
I've noticed that I'm getting the following error
2003-06-06 11:48:06,225 ERROR [org.apache.tomcat.util.threads.ThreadPool] Caught exception executing org.apache.tomcat.util.net.TcpWorkerThread@9a99eb, terminating thread
I get this error after just 159 threads are opened. I increased the heap size to 1024 MB by setting the Unix environment variable JAVA_OPTS=-Xmx1024m.
Is this correct? After that I start up JBoss. How can I check that this heap size is actually being applied?
Your reply would be appreciated!
OK. It looks like you are most likely still suffering from a memory problem.
On the other settings, I would leave the timeout as standard. If you want speed on your connections and are willing to forgo reverse lookups, set enableLookups to false. I'd tune back your maxProcessors to 350 for the time being and tune upwards as you test under load. Also dial back your max heap size to 500m so that if things go crazy we don't thrash the system. Again, we can tune upwards once we can see where we are going.
When you execute the JBoss run.sh script, redirect the output to a file (or, if it is started from an init script, capture the output some other way).
You should be able to see this sort of thing:
JBoss Bootstrap Environment
JAVA_OPTS: -Dprogram.name=run.bat -Xms50m -Xmx120m
I usually set the JBoss-specific JAVA_OPTS in run.sh. So in this example, the last line is modified to include my special Jetty temp directory. The preceding part of the script already exists in run.sh, so it gives you an idea of where to add a JAVA_OPTS modification in your script.
# If JAVA_OPTS is not set try check for Hotspot
if [ "x$JAVA_OPTS" = "x" ]; then
    # Check for SUN(tm) JVM w/ HotSpot support
    if [ "x$HAS_HOTSPOT" = "x" ]; then
        HAS_HOTSPOT=`$JAVA -version 2>&1 | $GREP -i HotSpot`
    fi
    # Enable -server if we have Hotspot, unless we can't
    if [ "x$HAS_HOTSPOT" != "x" ]; then
        # MacOS does not support -server flag
        if [ "$darwin" != "true" ]; then
            JAVA_OPTS="-server"
        fi
    fi
    # Setup JBoss specific properties
    JAVA_OPTS="$JAVA_OPTS -Djava.io.tmpdir=/jetty -Dprogram.name=$PROGNAME"
fi
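One way to confirm the options actually reached the JVM is to capture the console output and grep for the JAVA_OPTS banner line. A sketch (the banner line is simulated with echo here so the pattern is visible; in practice it comes from redirecting run.sh itself, e.g. ./run.sh > console.log 2>&1):

```shell
# Simulate the bootstrap banner that run.sh prints at startup;
# in practice this line comes from: ./run.sh > console.log 2>&1
JAVA_OPTS="-Dprogram.name=run.sh -Xms100m -Xmx500m"
echo "  JAVA_OPTS: $JAVA_OPTS" > console.log

# Confirm the heap settings made it into the JVM invocation
grep "Xmx500m" console.log && echo "max heap setting applied"
```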
Hope this helps.
OK, thanks for a very interesting answer!
One more question:
JAVA_OPTS=-Dprogram.name=run.bat -Xms50m -Xmx120m
I think I know what I was doing wrong: I was not setting program.name.
Therefore, in my case, I will include a line at the start of run.sh as follows:
JAVA_OPTS="-Dprogram.name=run.sh -Xms100m -Xmx500m"
Is this correct?
You don't really need to set your JAVA_OPTS customisation at the start of the shell script. The shell script already has a line that has the program name definition - so just modify that line.
e.g. the existing line, modified to add the heap settings:
# Setup JBoss specific properties
JAVA_OPTS="$JAVA_OPTS -Xms100m -Xmx500m -Dprogram.name=$PROGNAME"
I should also explain about the reverse lookups. Every time a client browser connects to Tomcat with reverse lookups enabled, the system does a DNS reverse lookup on the client IP address to determine its domain name. This takes time, which holds up delivery of content. Turning off reverse lookups is a speed-tuning tip.
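In the connector configuration this is a single attribute; a sketch (the attribute name is from the Tomcat connector documentation, the rest is placeholder):

```xml
<!-- Skip the per-connection DNS reverse lookup -->
<Connector port="8080" enableLookups="false"/>
```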
Hope that is clear.
Yes very, thks so much!
Results of suggestions applied:
1. 100 'clients' connected successfully with HTTP response code 200.
2. With 200 'clients', the connections are made, but I get the OutOfMemory error before any of them return a response code.
3. I increased the heap size to 2048m and still get the OutOfMemory error if I try to connect more than 160 clients. Why? There must be another variable that I'm missing!
How can I do the same configuration with JBoss 3.0.7?
I had another thought for testing. To see whether the phenomenon you are witnessing is purely connection-based rather than related to servlet/EJB memory usage, try pointing your test harness at static content at the same rate - for example, if you are using JMeter, try requesting a static index.html page from Tomcat using the same load characteristics.
Just a thought. On my small Linux development server, running Jetty, IBM SDK 1.4 with 512Mb of RAM, I am able to sustain 200 simultaneous connections with access to pages retrieving back-end data. That is without the channel listener. I'm using JMeter to simulate load.
Thks for yr feedback.
I am running the test on a Red Hat Linux 8.0 installation with 1 GB of RAM. I am using a small Java program that I developed myself, and I'm just trying to load a simple page which does access the back-end database. I will try to access a simple index.html instead, and if that doesn't work, I'll try JMeter.
What does this MBean do?
I ended up scheduling my own GC task to keep memory at bay.
I run it every 60 seconds and it worked for my app.
However, you should read the HotSpot documentation to see the pros and cons of doing so.
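A sketch of that kind of scheduled GC task (the class name is hypothetical; it uses java.util.Timer, which exists in J2SDK 1.4, and note that System.gc() is only a hint to the JVM, which is exactly why the HotSpot docs caution against relying on it):

```java
import java.util.Timer;
import java.util.TimerTask;

// Hypothetical periodic GC task: requests a collection every N ms.
public class GcScheduler {
    public static Timer start(long periodMillis) {
        Timer timer = new Timer(true); // daemon thread: won't block JVM exit
        timer.schedule(new TimerTask() {
            public void run() {
                System.gc(); // a request, not a guarantee, of a collection
            }
        }, periodMillis, periodMillis);
        return timer;
    }

    public static void main(String[] args) {
        start(60 * 1000L); // every 60 seconds, as in the post above
    }
}
```

Because the Timer runs as a daemon thread, it dies with the JVM rather than keeping JBoss alive on shutdown.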
You should be able to test load more easily with the JMeter system.
The pooled invoker manages JBoss connections and threads (as opposed to connections and threads related to Tomcat/Jetty).
"This invoker pools Threads and client connections to one server socket. The purpose is to avoid a bunch of failings of RMI. 1. Avoid making a client socket connection with every invocation call. This is very expensive. Also on windows if too many clients try to connect at the same time, you get connection refused exceptions. This invoker/proxy combo alleviates this. 2. Avoid creating a thread per invocation. The client/server connection is preserved and attached to the same thread. So we have connection pooling on the server and client side, and thread pooling on the server side. Pool, is an LRU pool, so resources should be cleaned up."
I wouldn't play around with it too much.
Some things to know about Java: I wouldn't tend to play with -Xss, as you can break JBoss with too low a value.
You may also want to actually monitor the number of threads in your JBoss JVM. The best way is to "pstree" it and see what you get as a thread count - see if you can capture the number of "processes" just before it breaks. The problem may be related to the underlying limitations of the kernel with respect to threads.
Check your maximum threads in RH 8.0 with "cat /proc/sys/kernel/threads-max"
Before you run tests, run pstree and find out the number of threads in your JVM after JBoss has fully started. The number of threads when (or just before) it breaks should give you some idea of what is happening. There should be one thread per Tomcat connection; the rest are your JBoss processes.
As I said, it doesn't seem to be a memory issue here, as you have a lot of memory - although you may want to monitor the memory consumed by the JVM ("ps aux"). It may be an underlying issue or an interaction with the OS.
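A sketch of that check (the /proc paths assume Linux; $$ stands in here for your JBoss JVM's pid, which you can find with ps aux | grep java):

```shell
# Kernel-wide thread ceiling, to compare against your JVM's count
[ -r /proc/sys/kernel/threads-max ] && cat /proc/sys/kernel/threads-max

# Count the threads of a given process. On 2.6+ kernels each thread
# is listed under /proc/<pid>/task; on the LinuxThreads era (RH 8.0)
# threads appear as separate processes, so pstree/ps is the fallback.
jvm_thread_count() {
    if [ -d "/proc/$1/task" ]; then
        ls "/proc/$1/task" | wc -l
    else
        ps -o pid= -p "$1" | wc -l
    fi
}

jvm_thread_count $$
```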
Hope this helps.