The JBoss JMX implementation is starting to amass a good number of tests, especially in the area of spec compliance. However, we desperately need more test coverage.
This is an invitation to contribute tests, something we've made really easy to do. You don't need to know the JMX spec inside and out; you just need to write a bit of code that proves something works in the Sun JMX reference implementation and doesn't work in ours (or *does* work in ours, that's cool too).
This note explains how to do that, using our JUnit test suites.
The best way to ensure our JMX implementation works and keeps working is to test it to death. Hopefully I'm preaching to the choir about test-first coding, but in case I'm not: if we can't test it, we don't know we're done.
If you're contributing to the core JMX code, lots of tests in support of your contribution make everyone feel just swell.
If you've found a bug, prove it with a test (and it would be cool if you fixed it too). Personally, if I had to choose between the gift of a bugfix patch or a unit test I would rather have the test.
If you don't have CVS write access then you can upload a zip of your testcases to the sourceforge site as a patch.
I feel a bit like Lisa Simpson wailing "Grade me. GRADE ME!" but there you go.
We have three high-level test packages:
test.compliance.*: These tests are to ensure we comply with the spec, and where sensible, the reference implementation from Sun (or any other JMX implementation).
test.performance.*: These tests give us warm fuzzies that we're not pig-slow compared to other JMX implementations.
test.implementation.*: These tests can relate to classes specific to our JMX impl.
Of these three packages, only test.implementation.* can import/use classes and features specific to the JBoss JMX implementation. It should be possible to compile and run the other test packages using only the RI from Sun.
A note on test.compliance.*:
Tests in the compliance package are a trade-off. We're not going to be 100% compatible with Sun's RI because that would mean replicating bugs. However, comparing our results against the RI is a fantastic acceptance test.
As such, we already have a number of tests that fail when run against the RI, but pass against our impl.
In those cases all tests that are known to fail against the RI should be clearly marked in comments in the test code. If possible the error message of the assertion or failure should be prefixed with something like "FAILS IN RI".
All tests are expressed as JUnit TestCases, organised into a hierarchical set of TestSuites.
All TestCase names end in TEST, all TestSuite names end in SUITE. Each package has to contain a TestSuite which is added to the TestSuite in the parent package. TestCases are added to the TestSuite in the current package. There are plenty of examples in the code of how this is done.
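As a hypothetical sketch of those conventions (the class and package names below are invented for illustration, not taken from the actual tree), a per-package suite might look like:

```java
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

public class ExampleSUITE
{
   // A TestCase in this package; real ones live in their own files.
   public static class ExampleTEST extends TestCase
   {
      public ExampleTEST(String name) { super(name); }
      public void testSomething() { assertEquals(2, 1 + 1); }
   }

   public static Test suite()
   {
      TestSuite suite = new TestSuite("Example tests");
      // TestCases from this package
      suite.addTest(new TestSuite(ExampleTEST.class));
      // child-package suites would be added the same way, e.g.:
      // suite.addTest(test.example.child.ChildSUITE.suite());
      return suite;
   }
}
```

The SUITE in the parent package would add ExampleSUITE.suite() in exactly the same fashion, which is how the whole tree rolls up into one run.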
Classes which are not TestCases or TestSuites should go into a subordinate package called "support" if possible.
If your particular tests don't conceptually fit into an existing package then don't be tempted to re-use support classes from other packages; copy them instead. That way you're insulated from changes in unrelated TestCases.
However, don't get too bogged down with conventions. The most important thing is a test that proves something.
Table Driven TestSuites:
Obviously tests should exercise all execution paths but this is *extremely* important for the compliance tests. In some cases the permutations of tests can rapidly get out of hand if you're coding each one manually.
If that happens then consider a table-driven TestSuite, which is simply a TestSuite that dynamically instantiates a load of arbitrary TestCases.
test.compliance.objectname.MalformedSUITE is a good example. This suite runs through about 60 permutations of malformed values for ObjectName constructors. Changing it to a table-driven suite doubled the test count and caught the (IMHO) last bug.
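A hedged sketch of the idea (the table values below are just generic malformed ObjectNames for illustration, not the actual MalformedSUITE data):

```java
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;

public class TableDrivenSUITE
{
   // Each bad value from the table becomes its own TestCase instance.
   public static class MalformedTEST extends TestCase
   {
      private final String badName;

      public MalformedTEST(String badName)
      {
         super("testMalformed");
         this.badName = badName;
      }

      public void testMalformed() throws Exception
      {
         try
         {
            new ObjectName(badName);
            fail("expected MalformedObjectNameException for: " + badName);
         }
         catch (MalformedObjectNameException expected) { }
      }
   }

   public static Test suite()
   {
      // The table: values ObjectName must reject. Adding a row adds a test.
      String[] badNames = { "nodomainseparator", "domain:", "domain:key" };
      TestSuite suite = new TestSuite("Malformed ObjectName tests");
      for (int i = 0; i < badNames.length; i++)
         suite.addTest(new MalformedTEST(badNames[i]));
      return suite;
   }
}
```

The payoff is that each table row shows up as a separate test in the JUnit report, so one bad permutation doesn't mask the others.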
Gotchas - Throwable and Error:
The JUnit assertXXX() methods and the fail() method will throw an AssertionFailedError. If you organise your code as below then you'll lose your assertion/failure context.
try
{
   // do something
   // do more
   if (something) fail();
}
catch (Throwable t)
{
   fail("unexpected throwable: " + t.getMessage());
}
If you really want a catchall then the best thing is to catch and rethrow AssertionFailedErrors first.
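A minimal sketch of that catchall pattern (doSomething() here is a hypothetical stand-in for whatever the test really exercises):

```java
import junit.framework.AssertionFailedError;
import junit.framework.TestCase;

public class CatchallTEST extends TestCase
{
   public CatchallTEST(String name) { super(name); }

   public void testWithCatchall()
   {
      try
      {
         assertEquals("expected", doSomething());
      }
      catch (AssertionFailedError e)
      {
         // rethrow first, so the real assertion context survives the catchall
         throw e;
      }
      catch (Throwable t)
      {
         fail("unexpected throwable: " + t);
      }
   }

   // stand-in for the code under test
   private Object doSomething()
   {
      return "expected";
   }
}
```

With the AssertionFailedError branch in place, a failing assertEquals() propagates with its original message instead of being reported as an "unexpected throwable".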
Also, if you are assert()ing or fail()ing in a method called by a JMX agent then you'll need to catch RuntimeErrorException, unpack the Error and rethrow any AssertionFailedErrors.
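A sketch of that unpacking, using RuntimeErrorException's getTargetError() (the helper name is invented; put the equivalent wherever your test calls back out of the agent):

```java
import javax.management.RuntimeErrorException;
import junit.framework.AssertionFailedError;

public class ErrorUnwrapExample
{
   // If the JMX agent wrapped our AssertionFailedError in a
   // RuntimeErrorException, unpack and rethrow the original so
   // JUnit reports the real failure rather than the wrapper.
   public static void rethrowUnwrapped(RuntimeErrorException e)
   {
      Error target = e.getTargetError();
      if (target instanceof AssertionFailedError)
         throw (AssertionFailedError) target;
      throw e;
   }
}
```

Without this, JUnit counts the wrapped failure as an error with the agent's message, and the assertion context is buried inside the RuntimeErrorException.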
That's pretty much it.
Looks like I broke about 50% of the rules.
Grade F :-)
> Gotchas - Throwable and Error:
> The JUnit assertXXX() methods and the fail() method
> will throw an AssertionFailedError. If you organise
> your code as below then you'll lose your
> assertion/failure context.
> // do something
> // do more
> if (something) fail();
> catch (Throwable t)
> fail("unexpected throwable: " + t.getMessage());
> If you really want a catchall then the best thing is
> to catch and rethrow AssertionFailedErrors first.
Does this solve something that's been annoying me: when I get a stack trace of a test that has failed it is missing the line number of the source file (which I find very valuable)...?
Or is that a result of something else (JUnit classes compiled with -O flag? the use of reflection?)... anyone know how I can get the source line instead of "Unknown Source" when an exception stack trace is printed?
> Does this solve something that's been annoying me:
> when I get a stack trace of a test that has failed it
> is missing the line number of the source file (which
> I find very valuable)...?
Nope. Lost line numbers come from an optimized compile. Either check the value of javac.optimize in the ANT build or your particular IDE's flavour of compiler settings.
Following David Jencks suggestion I started moving
the JBossMX tests over to the testsuite module.
In fact it is mostly done, it works pretty well.
1) I'm compiling and running the whole testsuite over
JBossMX, this avoids complicating the classpath
2) The JMXRI compliance test still has JBossMX underneath,
again this is to avoid complicated classpaths and also
there are still references to org.jboss.mx in there.
3) I'm not sure of the correct way to do the Standard
tests that do the test generation.
Also I tried booting JBoss using JBossMX. There are only
two problems I could see.
SingleJBoss doesn't register (I'm using Sunday's version)
One of the system classes tries to parseTrace from
If you can't wait until this weekend when I commit it,
just ask me and I'll send you what I got.
Wow! I didn't expect that...
Um... obviously there are no tests that check the jboss MBeanProxy is working properly... Remember what I said about calling server.invoke() on getters and setters being illegal and no longer supported? I'll bet the proxy doesn't call server.set/getAttribute() does it?
As for class packaging into jars I was going to make it so that the jbossmx build generated 4 jars: core, compliancesuite, performancesuite, implsuite. That way we can be 100% confident that the tests are running against the right classes.
Then I was going to have the test target depend on the jars rather than just the compile.
Can the main testsuite cope with that?
I've already got the separation in the testsuite :-)
So core will just involve removing the existing tests
from the jboss-jmx.jar
The classpath problem is that everything is compiled in
one shot, and the same classpath is used to run the
tests. I can fix it, but it involves a bit of messing
Also, I'm using Sunday's version before your capability
> Nope. Lost line numbers are an optimized compile.
> Either check the value for javac.optimize in the ANT
> build or your particular IDE's flavour of compiler
ok, need to recompile ant then. will try it. Thanks.
> I said about calling server.invoke() on getters and
> setters is illegal and no longer supported? I'll bet
> the proxy doesn't call server.set/getAttribute() does
No it doesn't.
There's currently a better behaving version in org.jboss.mx.util.MBeanProxy.
> > Nope. Lost line numbers are an optimized compile.
> > Either check the value for javac.optimize in the
> > build or your particular IDE's flavour of compiler
> > settings.
> ok, need to recompile ant then. will try it. Thanks.
Huh? recompile ANT? I meant set the property javac.optimize to "no" or "false" (can't remember which) in the build.xml for jbossmx.
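For reference, that property would sit in the jbossmx build.xml; a sketch, assuming "false" is the right value (the note itself is unsure whether it's "no" or "false"):

```xml
<!-- turn off optimization so compiled test classes keep their line numbers -->
<property name="javac.optimize" value="false"/>
```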
I've finished migrating the JBossMX tests to the
I've "fixed" the classpath problem. For the first time
the JMXRI compliance tests run without JBossMX in the
I'll commit it once tonight's automated testsuite has
started. I'm not sure when it takes the snapshot, and I
don't want to commit during that.
The testsuite is very strange at the moment.
On my Windows machine I get 27 failures and 5 errors;
Linux gives 24 failures and 127 errors, which agrees
with the automated tests at lubega. Has somebody
added some Windows-specific code?
1) Explain how I did it
2) Fix the monitor/timer tests using Trevor's suggestion
3) Remove tests from the jmx tree
Ok, Here's what I've done to get the JBossMX tests
into the Automated testsuite.
1) Added the jmx build to the main build. This included
the javadocs. There is no install, yet...
I had to make sure jboss-jmx.jar exists after a normal
2) Ported the test library to
this mostly involves changing package names.
Except for: I've created a TestCase class for each of our
tests. The main point of this is to override
the JBossTestCase to not check the JBoss server is
running. We don't need it for our tests.
This is also the place where constants are configured;
there are no *SUITE.java files.
3) This is the worst bit. I've separated the compile
classpath from the run classpath. We need to compile
over jboss-jmx.jar but run over jmxri.jar until I can
prove running over jboss-jmx.jar doesn't break anything.
Of course the jbossmx tests run over jboss-jmx.jar
4) Built our tests
This builds 3 jars, jbossmx-compliance.jar,
jbossmx-implementation.jar and jbossmx-performance.jar
NOTE: The testsuite does not run over these, it runs
over the output/classes directory.
5) Added our tests as targets.
6) Main targets
The compliance and implementation tests are run in the
The performance test is run in the
7) Helper targets
tests-jbossmx-all runs all 3 tests over jboss-jmx.jar
tests-jmx-compliance runs the compliance test over
jboss-jmx.jar and jmxri.jar
Now we have a nightly test of JBossMX :-)
Point 3 will be simpler once JBoss is running over