JSFUnit is LGPL. I believe all the dependencies, including Cactus and HtmlUnit, are Apache 2.
Seam is LGPL, but it is a compile-time (but not runtime) dependency.
Are you shipping JSFUnit as part of a product?
Yes, we are shipping JSFUnit as part of an application. We abuse it for automatically processing uploaded files.
These files hold the information that the user would otherwise have had to enter through the normal interactive workflow.
It actually works really well, and is quite generic too.
When a piece of information is invalid, the messages object gets set. The user then sees these messages on the status screen at the end.
I'm totally blown away. :-)
Can you share more details? I've never thought of JSFUnit as a post-processing tool.
So what exactly happens?
I'll guess from what you said. It sounds like there is a file upload, and then JSFUnit is somehow kicked off (via ServletRunner?). The file is then processed with another request generated from JSFClientSession, and finally there is an assert to make sure everything processed correctly. Is that it?
So JSFUnit is used as a bot against your own application?
We created a 'JsfUnitNavigation' test class. This class reads an object from the context which holds exact information on "labelid" or "fieldid" and their values, as well as which page to start with and which page to end with. It also has information on which value bindings have to be provided as return values and which bindings have to be set automatically (as they'd be hard to set via the UI).
This test class is called from the normal UI workflow, after the file has been uploaded. Within the file we can have several "tasks" which will be executed one after another.
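For illustration, a holder for that kind of config object could look roughly like this - a minimal sketch, all names made up, not our actual class:

```java
import java.util.List;
import java.util.Map;

// Illustrative holder for one automated "task" (all names are hypothetical).
class NavigationTaskConfig {
    private final String startPage;                  // view id to begin on
    private final String endPage;                    // view id that marks success
    private final Map<String, String> inputs;        // label id / field id -> value to enter
    private final List<String> allowedButtons;       // button ids the bot may press
    private final Map<String, String> bindingsToSet; // EL binding -> value set directly

    NavigationTaskConfig(String startPage, String endPage,
                         Map<String, String> inputs,
                         List<String> allowedButtons,
                         Map<String, String> bindingsToSet) {
        this.startPage = startPage;
        this.endPage = endPage;
        this.inputs = inputs;
        this.allowedButtons = allowedButtons;
        this.bindingsToSet = bindingsToSet;
    }

    String getStartPage() { return startPage; }
    String getEndPage() { return endPage; }
    Map<String, String> getInputs() { return inputs; }
    List<String> getAllowedButtons() { return allowedButtons; }
    Map<String, String> getBindingsToSet() { return bindingsToSet; }
}
```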
If we stay on the same page and can't process anything for a few rounds, we break off. We also skip if a popup is permanently there and no longer gets dismissed, and likewise if an h:message is displayed in the automated flow. These are the kinds of messages that will then be displayed at the end, after all tasks have been executed in the automated JSF flow.
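The break-off rule could be sketched as a small guard like the following - the threshold and names are illustrative, not our actual code:

```java
// Sketch of the "did we stall?" guard: abort after a few rounds stuck on the
// same page, or as soon as an error message shows up (threshold is made up).
class StallGuard {
    private static final int MAX_ROUNDS_ON_SAME_PAGE = 3;
    private String lastPageId;
    private int roundsOnSamePage;

    /** Returns true if the automated flow should be aborted. */
    boolean shouldAbort(String currentPageId, boolean errorMessageShown) {
        if (errorMessageShown) {
            return true; // an h:message appeared: stop and report it at the end
        }
        if (currentPageId.equals(lastPageId)) {
            roundsOnSamePage++;
        } else {
            lastPageId = currentPageId;
            roundsOnSamePage = 0;
        }
        return roundsOnSamePage >= MAX_ROUNDS_ON_SAME_PAGE;
    }
}
```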
In order to execute the flow, we don't call the URL of the servlet runner directly; rather, we create the test object and run it from the Java code.
As mentioned above, our 'asserts' check certain aspects, like whether the workflow stalled or whether error messages were displayed.
In the 'config' file that's stored in the context, we don't have any workflow programmed directly.
Here's the logic on how we proceed:
Execute all value bindings that have to be saved in the new JSF session.
Run over all "input" information in the config file. The key of these can be the bundle key, the id of the input element, or... ah, I forgot about the last option :P. If there is a match on the current page, we set the value for that element. Then we check the available buttons on the page and see whether they match one of the allowed button ids we have in the config file.
We can also configure that, for certain page ids, a special button should be pressed.
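Stripped of the actual JSFUnit calls, the per-round logic might be sketched like this - all names and structures are illustrative, not the real implementation:

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Sketch of the per-round decision: fill every matching input, then pick a button.
class NavigationStep {

    /** Fill every configured input that actually exists on the current page. */
    static void fillInputs(Map<String, String> configuredInputs,
                           Map<String, String> pageFields /* field id -> current value */) {
        for (Map.Entry<String, String> input : configuredInputs.entrySet()) {
            // The key may be a bundle key or the id of the input element.
            if (pageFields.containsKey(input.getKey())) {
                pageFields.put(input.getKey(), input.getValue());
            }
        }
    }

    /** Pick the button to press: a page-specific override wins; otherwise take
     *  the first button on the page that is in the allowed list. */
    static Optional<String> chooseButton(String pageId,
                                         List<String> buttonsOnPage,
                                         List<String> allowedButtons,
                                         Map<String, String> specialButtonPerPage) {
        String special = specialButtonPerPage.get(pageId);
        if (special != null && buttonsOnPage.contains(special)) {
            return Optional.of(special);
        }
        return buttonsOnPage.stream().filter(allowedButtons::contains).findFirst();
    }
}
```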
That's how we can proceed within the workflow quite easily - JSFUnit behaves like a normal human user.
For the select-one components we set the value binding in the background. The good thing is that there are validators that still check whether the value is accepted or not... so if it's invalid, a message is displayed :)
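In sketch form, that safety net amounts to something like this - a hypothetical stand-in for the JSF validation, shown only to illustrate the idea:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: when a select-one value is set via its binding instead of the UI,
// a validator still rejects values that are not among the allowed items.
class SelectOneCheck {
    /** Returns the messages produced; an empty list means the value was accepted. */
    static List<String> validateSelectOne(String submitted, List<String> allowedItems) {
        List<String> messages = new ArrayList<>();
        if (!allowedItems.contains(submitted)) {
            messages.add("Value '" + submitted + "' is not a valid option.");
        }
        return messages;
    }
}
```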
So yes, you're totally right: JSFUnit acts as a bot, which automatically processes 'batch files'. This can speed up the processing of these tasks quite a bit.
Of course it's all secured too - the "batch" workflow is protected by a cipher and all that :-)
Without the help of JSFUnit we'd probably have had to implement the whole workflow again - just automated - and would have had to check manually whether the values are OK.
And the good thing is - as you can imagine from the explanations above - the automation test class is quite generic, so we may be able to re-use it for another application if required.
Sounds like a cool use case for JSFUnit.
The funny thing is that I was asked, about 4 weeks ago, whether one of our test tools could be used for data migration (export data from the old DB and enter it into the new system using the new UI and the test tools). Unfortunately, I first have to work out the nasty SSL stuff (all kinds of certificates and nasty stores like Entrust...), so I had to decline at the time...
Anyway: another success story for a good tool.
SSL can be nasty, true. We use SSL (we read the username/password and keystore info from system properties that are set there by WebSphere), as well as create an SSO cookie (using a provided web service) that is injected before the first request to the server.
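Picking up those settings might look roughly like this - the `javax.net.ssl.*` key is the standard JSSE property name, while the username/password property names are made up for the sketch:

```java
// Sketch: read credentials and keystore info from system properties
// (as set by the application server before the automated flow starts).
class SslSetup {
    static String require(String key) {
        String value = System.getProperty(key);
        if (value == null) {
            throw new IllegalStateException("Missing system property: " + key);
        }
        return value;
    }

    /** Returns { username, password, trust store path }. */
    static String[] readCredentials() {
        String user = require("app.username");                   // hypothetical key
        String password = require("app.password");               // hypothetical key
        String trustStore = require("javax.net.ssl.trustStore"); // standard JSSE key
        return new String[] { user, password, trustStore };
    }
}
```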
But it all works like a charm :-)
By the way, there are some ways to "autotrust" pretty much all certificates, but this would apply server-wide - and is therefore a security risk. I had that in my code at first, too, to avoid the SSL hassle.
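For reference, that "autotrust" approach usually means installing an X509TrustManager that accepts every certificate - shown here only as a sketch, and again: a security risk, not something for production:

```java
import java.security.SecureRandom;
import java.security.cert.X509Certificate;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

// "Autotrust everything": disables certificate checks for the whole
// SSLContext. Useful to get past SSL during experiments, dangerous otherwise.
class TrustAll {
    static SSLContext insecureContext() throws Exception {
        TrustManager[] trustEverything = {
            new X509TrustManager() {
                public void checkClientTrusted(X509Certificate[] chain, String authType) {}
                public void checkServerTrusted(X509Certificate[] chain, String authType) {}
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
            }
        };
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, trustEverything, new SecureRandom());
        return ctx;
    }
}
```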