Invitation and HOWTO - contributing/writing JMX tests
squirest Jan 29, 2002 8:32 AM

Hi all.
The JBoss JMX implementation is starting to amass a good number of tests, especially in the area of spec compliance. However, we desperately need more test coverage.
This is an invitation to contribute tests, something we've made really easy to do. You don't need to know the JMX spec inside and out; you just need to write a bit of code that proves something works in the Sun JMX reference implementation and doesn't work in ours (or *does* work in ours, that's cool too).
This note explains how to do that, using our JUnit test suites.
The best way to ensure our JMX implementation works and keeps working is to test it to death. Hopefully I'm preaching to the choir about test-first coding, but in case I'm not: if we can't test it, we don't know we're done.
If you're contributing to the core JMX code, lots of tests in support of your contribution make everyone feel just swell.
If you've found a bug, prove it with a test (and it would be cool if you fixed it too). Personally, if I had to choose between the gift of a bugfix patch or a unit test I would rather have the test.
If you don't have CVS write access then you can upload a zip of your testcases to the sourceforge site as a patch.
I feel a bit like Lisa Simpson wailing "Grade me. GRADE ME!" but there you go.
We have three high-level test packages:
test.compliance.*: These tests are to ensure we comply with the spec, and where sensible, the reference implementation from Sun (or any other JMX implementation).
test.performance.*: These tests give us warm fuzzies that we're not pig-slow compared to other JMX implementations.
test.implementation.*: These tests can relate to classes specific to our JMX impl.
Of these three packages, only test.implementation.* can import/use classes and features specific to the JBoss JMX implementation. It should be possible to compile and run the other test packages using only the RI from Sun.
A note on test.compliance.*:
Tests in the compliance package are a trade-off. We're not going to be 100% compatible with Sun's RI because that would mean replicating bugs. However, comparing our results against the RI is a fantastic acceptance test.
As such, we already have a number of tests that fail when run against the RI, but pass against our impl.
In those cases all tests that are known to fail against the RI should be clearly marked in comments in the test code. If possible the error message of the assertion or failure should be prefixed with something like "FAILS IN RI".
Test Conventions:
All tests are expressed as JUnit TestCases, organised into a hierarchical set of TestSuites.
All TestCase names end in TEST and all TestSuite names end in SUITE. Each package has to contain a TestSuite which is added to the TestSuite in the parent package; TestCases are added to the TestSuite in the current package. There are plenty of examples in the code of how this is done.
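The convention might look something like this (a minimal sketch in JUnit 3.x style; the class and suite names here are hypothetical, not actual classes from the tree):

```java
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

// Hypothetical package suite. Real TestCases live in their own files;
// a trivial one is nested here only to keep the sketch self-contained.
public class ExampleSUITE
{
   // A TestCase whose name ends in TEST, per the convention.
   public static class BasicTEST extends TestCase
   {
      public BasicTEST(String name) { super(name); }
      public void testSomething() { assertEquals(2, 1 + 1); }
   }

   // Each package's SUITE gathers the package's TestCases and is in
   // turn added to the parent package's SUITE the same way.
   public static Test suite()
   {
      TestSuite suite = new TestSuite("example package suite");
      suite.addTest(new TestSuite(BasicTEST.class));
      return suite;
   }
}
```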
Classes which are not TestCases or TestSuites should go into a subordinate package called "support" if possible.
If your particular tests don't conceptually fit into an existing package then don't be tempted to re-use support classes from other packages; copy them instead. That way you're insulated from changes in unrelated TestCases.
However, don't get too bogged down with conventions. The most important thing is a test that proves something.
Table Driven TestSuites:
Obviously tests should exercise all execution paths, but this is *extremely* important for the compliance tests. In some cases the permutations of tests can rapidly get out of hand if you're coding each one manually.
If that happens then consider a table-driven TestSuite, which is simply a TestSuite that dynamically instantiates a load of arbitrary TestCases.
test.compliance.objectname.MalformedSUITE is a good example. This suite runs through about 60 permutations of malformed values for ObjectName constructors. Changing it to a table-driven suite doubled the test count and caught the (IMHO) last bug.
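In outline, a table-driven suite can be as simple as a loop over a table of inputs, adding one TestCase instance per entry (a sketch, not the actual MalformedSUITE code; the bad-value table here is a small illustrative subset):

```java
import javax.management.MalformedObjectNameException;
import javax.management.ObjectName;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;

// Hypothetical table-driven suite: one TestCase per malformed value.
public class MalformedExampleSUITE
{
   // Illustrative table of malformed ObjectName strings.
   private static final String[] BAD_NAMES = {
      "",                   // no domain/properties separator at all
      "domain:",            // empty key property list
      "domain:key",         // key with no value
      "domain:key=value,"   // trailing comma
   };

   public static Test suite()
   {
      TestSuite suite = new TestSuite("malformed ObjectName values");
      for (int i = 0; i < BAD_NAMES.length; i++)
         suite.addTest(new MalformedTEST(BAD_NAMES[i]));
      return suite;
   }

   public static class MalformedTEST extends TestCase
   {
      private final String badName;

      public MalformedTEST(String badName)
      {
         super("testMalformed");  // the method JUnit runs for this instance
         this.badName = badName;
      }

      public void testMalformed() throws Exception
      {
         try
         {
            new ObjectName(badName);
            fail("expected MalformedObjectNameException for: " + badName);
         }
         catch (MalformedObjectNameException expected)
         {
            // the constructor rejected the value, as it should
         }
      }
   }
}
```

Adding a new permutation is then just a new row in the table, which is why the test count can double so cheaply.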
Gotchas - Throwable and Error:
The JUnit assertXXX() methods and the fail() method will throw an AssertionFailedError. If you organise your code as below then you'll lose your assertion/failure context.
try
{
   // do something
   assertXXX();
   // do more
   if (something) fail();
}
catch (Throwable t)
{
   fail("unexpected throwable: " + t.getMessage());
}
If you really want a catchall then the best thing is to catch and rethrow AssertionFailedErrors first.
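That pattern might look like this (a sketch; the TestCase and method names are hypothetical):

```java
import junit.framework.AssertionFailedError;
import junit.framework.TestCase;

// Hypothetical TestCase showing a catch-all that preserves the
// original assertion/failure context.
public class CatchAllTEST extends TestCase
{
   public CatchAllTEST(String name) { super(name); }

   public void testWithCatchAll()
   {
      try
      {
         // do something that may throw
         assertEquals(4, 2 + 2);
      }
      catch (AssertionFailedError e)
      {
         // rethrow first, so JUnit reports the real assertion message
         throw e;
      }
      catch (Throwable t)
      {
         // only genuinely unexpected throwables land here
         fail("unexpected throwable: " + t);
      }
   }
}
```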
Also, if you are assert()ing or fail()ing in a method called by a JMX agent then you'll need to catch RuntimeErrorException, unpack the Error and rethrow any AssertionFailedErrors.
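The unwrapping step could be sketched like this (the helper and its name are hypothetical; javax.management.RuntimeErrorException really does expose the wrapped Error via getTargetError()):

```java
import javax.management.RuntimeErrorException;
import junit.framework.AssertionFailedError;

public class UnwrapExample
{
   // Hypothetical helper: run code that goes through the MBeanServer
   // and surface any AssertionFailedError the agent wrapped in a
   // RuntimeErrorException, so JUnit sees the real failure.
   static void invokeAndUnwrap(Runnable agentCall)
   {
      try
      {
         agentCall.run();  // stands in for e.g. server.invoke(...)
      }
      catch (RuntimeErrorException e)
      {
         Error cause = e.getTargetError();
         if (cause instanceof AssertionFailedError)
            throw (AssertionFailedError) cause;  // the real failure
         throw e;  // some other Error; let it propagate wrapped
      }
   }
}
```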
That's pretty much it.
Trevor