
In this blog post I'd like to describe the approach I recently used to limit the number of concurrent requests allowed for a JAX-WS service.

 

The problem

 

The application that my team and I were developing exposed an interface with an operation that was expected to take quite a long time. Each client of this interface expected the response to arrive within a defined time; otherwise it was no longer interested in the response. Although there were many clients, no single client issued concurrent requests; it always sent one at a time. For the realization of the interface we chose a SOAP-based JAX-WS web service where the response was sent to the client synchronously, meaning that the client waited for the response or the request timed out. To avoid overwhelming the server with long-lasting requests, a maximum number of concurrently handled requests was defined. All requests over that number were to be rejected immediately, informing the client to try again later.

 

The environment

 

The environment in which the solution was deployed consisted of JBoss EAP 5 with the JBoss Native WS stack. The requests passed through JBoss Web Server. Besides this web service, other HTTP-based services took the same route. This prevented me from limiting request threads in JBoss Web Server itself, as that would effectively impose the limit on all web interfaces.

 

The solution design

 

The solution I decided on was to use the JAX-WS handler framework to intercept all requests and track the actual throughput. The message flow was designed as follows:

 

  1. A request arrives at the server.
  2. The request is passed to the Throughput Limitation Handler, which checks the actual number of requests being processed:
    1. if it is lower than the maximum, the handler increases the request counter and passes the request on to the endpoint, marked as non-blocked
    2. if it is equal to or higher than the maximum, the handler increases the request counter and passes the request on to the endpoint, marked as blocked
  3. The endpoint receives the request and checks the blocking indication:
    1. if the request is blocked, it constructs a failure message by throwing an appropriate exception (Fault)
    2. if the request is not blocked, it is processed and
      1. a response is returned, or
      2. a failure message is constructed when expiration occurs (Fault)
  4. For both success and failure, the response passes through the Throughput Limitation Handler on its way back, which decreases the request counter.

 

The Throughput Limitation Handler does not construct responses when the limit is hit. Its only responsibility is to track the requests in flight and indicate whether the limit was reached. It is the responsibility of the endpoint to construct the appropriate response based on all the factors that might influence it (there may be other handlers guarding different aspects).
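Stripped of JAX-WS specifics, the guard logic described above can be sketched in plain Java. The `ThroughputGuard` class below is hypothetical and not part of the companion code, but it keeps the same discipline: every request increments the counter (even a rejected one, because its response also passes back through the handler), and every response or fault decrements it.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the guard logic, not the actual handler code.
class ThroughputGuard {
    private final int max;
    private final AtomicInteger active = new AtomicInteger(0);

    ThroughputGuard(int max) {
        this.max = max;
    }

    /** Inbound path: returns true when the request may proceed. */
    boolean enter() {
        // incrementAndGet returns the new count, so the request is allowed
        // only when it does not push the count over the maximum
        return active.incrementAndGet() <= max;
    }

    /** Outbound path (normal response or fault): release one slot. */
    void exit() {
        active.decrementAndGet();
    }
}

public class ThroughputGuardDemo {
    public static void main(String[] args) {
        ThroughputGuard guard = new ThroughputGuard(2);
        System.out.println(guard.enter()); // first request accepted
        System.out.println(guard.enter()); // second request accepted
        System.out.println(guard.enter()); // third request rejected, limit is 2
        guard.exit(); // rejected request's response passes back through
        guard.exit(); // one accepted request finishes
        System.out.println(guard.enter()); // a slot is free again
    }
}
```

The real handler additionally wraps the check in a lock so that the decision and the message-context update happen atomically, and it publishes the decision to the endpoint via the message context rather than a return value.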

 

The solution implementation

 

The implementation of this solution is based on an existing WSDL file that forms the contract between clients and server, so the basic artifacts are generated by the wsconsume utility. The artifacts that I need to add are:

  • Throughput Limitation Handler - guards number of requests processed
  • WS Endpoint (Limited Service Session) - processes the request and constructs responses.

 

I am going to present the solution on an artificial service called Limited Service that does nothing but transform a string input to upper case.

 

Throughput Limitation Handler

 

As noted above, the handler is implemented using the JAX-WS handler framework. The (simplified) code is listed below. It does not need much explanation beyond the code comments. To summarize:

  • The handler implements LogicalHandler, used to install a logical interceptor on the message processing route. It is logical because it does not need to modify the message contents.
  • In the constructor, it initializes the request counter.
  • It implements two important methods:
    • handleMessage - handles both incoming and outgoing messages when no error occurs. First the direction of the message is determined, then the appropriate handling is done (counter increase/decrease, putting the indication into the message context for the application)
    • handleFault - handles only the cases when an error occurs; handleMessage is then skipped for the outgoing direction. The logic here only decreases the request counter.

 

Complete ThroughputLimitHandler source

 

public class ThroughputLimitHandler implements
        LogicalHandler<LogicalMessageContext> {

    /**
     * The name for parameter that indicates state of traffic limitation to web
     * service endpoint.
     */
    public static final String LIMIT_REACHED = "LIMIT_REACHED";

    ...

    /** The actual maximum parallel requests. */
    private int maxParallelRequests = DEFAULT_MAX_PARALLEL_REQUESTS;

    /** The actual parallel requests. */
    private AtomicInteger actualParallelRequests;

    /** The lock that guards changes to actual parallel requests. */
    private final Lock lock;

    /**
     * Instantiates a new throughput limit handler.
     */
    public ThroughputLimitHandler() {
        actualParallelRequests = new AtomicInteger(0);
        lock = new ReentrantLock();
        ...
    }

    @Override
    public boolean handleMessage(LogicalMessageContext context) {
        // Get the direction of the message
        Boolean outbound = (Boolean) context
                .get(LogicalMessageContext.MESSAGE_OUTBOUND_PROPERTY);

        if (outbound) {
            // Response, decrease counter only
            lock.lock(); // acquire before try, so finally never unlocks an unheld lock
            try {
                int actual = actualParallelRequests.decrementAndGet();
                ...
            } finally {
                lock.unlock();
            }
        } else {
            // Request, do checks
            lock.lock(); // block until condition holds
            try {
                if (actualParallelRequests.get() < maxParallelRequests) {
                    // We could pass the request further for processing
                    context.put(LIMIT_REACHED, Boolean.FALSE);
                    context.setScope(LIMIT_REACHED, Scope.APPLICATION);

                } else {
                    context.put(LIMIT_REACHED, Boolean.TRUE);
                    context.setScope(LIMIT_REACHED, Scope.APPLICATION);
                }

                // Increment counter, it must be known how many requests are
                // actually handled
                actualParallelRequests.incrementAndGet();

            } finally {
                lock.unlock();
            }
        }
        return true;
    }

    @Override
    public boolean handleFault(LogicalMessageContext context) {
        // Failure response, decrease counter only
        lock.lock(); // acquire before try, so finally never unlocks an unheld lock
        try {
            actualParallelRequests.decrementAndGet();
        } finally {
            lock.unlock();
        }
        return true;
    }
}

 

The complete code that you can download (see Takeaway) contains more than what is listed here: internal marking of requests to see which were accepted or rejected, and some logging to show what is happening. It also allows configuring the maximum throughput via a system property named request.limit; otherwise the default maximum is used.
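The system-property lookup itself is standard Java. A minimal sketch (the default constant is an assumption of mine, only the property name request.limit comes from the article):

```java
public class LimitConfig {
    /** Assumed default, used when request.limit is not set. */
    static final int DEFAULT_MAX_PARALLEL_REQUESTS = 5;

    static int readLimit() {
        // Integer.getInteger parses the named system property as an int,
        // returning the fallback when the property is absent or malformed
        return Integer.getInteger("request.limit", DEFAULT_MAX_PARALLEL_REQUESTS);
    }

    public static void main(String[] args) {
        System.out.println(readLimit()); // property not set: default applies
        System.setProperty("request.limit", "10");
        System.out.println(readLimit()); // configured value wins
    }
}
```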

 

Limited Service Session

 

The implementation of the endpoint is quite straightforward. Let's look at the code first.

 

Complete LimitedServiceSession source

 

@WebService(name = "LimitedService", ...)
@HandlerChain(file = "resource://META-INF/jaxws-handlers.xml")
@Stateless
public class LimitedServiceSession implements LimitedService {

    /** Web service context. */
    @Resource
    WebServiceContext wsCtx;
 
    public String limitedServiceRequest(String parameter) {
        Boolean limit = (Boolean) wsCtx.getMessageContext().get(
                ThroughputLimitHandler.LIMIT_REACHED);

        if (limit) {
            // Simplified: the real endpoint would construct an appropriate Fault here
            return null;
        }

        String upperCase = parameter.toUpperCase();
        return upperCase;
    }

}

 

So, here we have the web service endpoint. Note the important @HandlerChain annotation that defines which handlers shall intercept messages targeted at it; it references the configuration containing the Throughput Limitation Handler (I'll show the file later). The endpoint then contains the only method defined by the WSDL, which first uses the web service context to ask for the LIMIT_REACHED parameter and decides what to do next based on it. If it is true, processing ends. If it is false, the possibly long-lasting processing runs and its result is returned. Expiration of the processing is not illustrated here, as it is not important for the traffic limitation.

 

Handlers configuration

 

You saw above that the endpoint referenced a configuration file named jaxws-handlers.xml. Its contents are the following:

 

<?xml version="1.0" encoding="UTF-8"?>
<jws:handler-chains xmlns:jws="http://java.sun.com/xml/ns/javaee">

    <jws:handler-chain name="HandlerChain">
        <jws:protocol-bindings>##SOAP11_HTTP</jws:protocol-bindings>
        <jws:handler>
            <jws:handler-name>Throughput Limit Handler</jws:handler-name>
            <jws:handler-class>martinhynar.blog.ThroughputLimitHandler
            </jws:handler-class>
        </jws:handler>
    </jws:handler-chain>

</jws:handler-chains>

 

It defines a handler chain for the SOAP binding in which our Throughput Limitation Handler is registered. The web service provider processes this file and passes all requests through the chain when Limited Service is contacted.

 

Other artifacts in the companion code

 

In addition to the traffic limitation implementation, the attached source code also provides an implementation of a client. The client takes the form of a JBoss bean and is also published as an MBean that can be accessed using the JMX Console. It contains a single method that sends a given number of requests to the Limited Service endpoint. The messages are, of course, sent in parallel so the traffic limitation effect can be observed.

 

Takeaway

 

In this post, the complete code was not listed, but you can still get it: check out the complete code as a mavenized Eclipse project here. The built example project is verified to work on JBoss AS 6.1.0 with the CXF web service stack. (This differs slightly from the environment described earlier, but as you can see in the code, the implementation does not use any JBoss web service extensions except for the WebContext annotation that binds the endpoint to a given context URL.) After deploying the application, you can find the client MBean in the JMX Console under the name martinhynar.blog:service=LimitedServiceClient.

In this blog post I would like to highlight and solve a problem that I recently faced in my project. Together with my team, we are building a web application (using JSF 1.2, RichFaces 3.3 and Seam 2.2) that lists data from a database in various forms. So far, nothing extraordinary. One of the ways the application lists its data is an Extended Data Table (EDT) for data that we want to filter, sort and paginate. The data displayed in the EDT generally have these characteristics:


  • there are lots of records (thousands)
  • Hibernate ORM is used to map between the Java and DB worlds
  • the entities use eager fetching for some relations (not all of the properties are necessarily displayed in the table)
  • the common use case is that the user does not need all records but searches for the desired one using filtering

 

The problems we naturally faced when the EDT was used were:


  • the basic EDT model (a list of objects) loads all entities into memory and only then calculates pagination
  • as it loads all entities (with all their relations), the memory footprint is significant
  • filtering and sorting are done in memory
  • you load lots of data that you don't need

 

These facts forced me to investigate what customizations the EDT provides to make it function a bit more efficiently. I found a couple of articles focused on the basic Data Table, but when I tried to apply them, they did not work as I desired. So, what follows are the artifacts that form my solution, which has the following characteristics:


  • the application loads only the data that is immediately displayed in the EDT
  • filtering and sorting are done in the database
  • when a row is selected in the table, additional information necessary for processing or display is loaded
  • for pagination, the count of rows that fulfil the filtering criteria is retrieved from the DB rather than computed with resultList.size()

 

With these requirements I started to dig for a possible solution to my problem. Armed with a good tip from StackOverflow (link in references) I reused the code from there and added my own improvements to further fulfil my goals. The solution consists of 4 parts:

 

  • the page that contains EDT
  • action component that handles actions requested on the page
  • data model component that handles operations with EDT
  • data provider component that handles requests for data for EDT

 

In this post, I will demonstrate the solution on a table listing Person objects that contain some information about a person. This means that the components used carry the Person keyword in their names.

 

The page with EDT

 

The page itself is very simple and the EDT usage is no different from any other. The snippet with the most important parts, with a little explanation, follows.

 

[Complete page source | https://martin-hynar-blog.googlecode.com/svn/trunk/richfaces-extendeddatatable/extendeddatatable/src/main/webapp/index.xhtml]

 

<rich:extendedDataTable
  value="#{EDTAction.personsDataModel}"
  selection="#{EDTAction.personSelection}"
  binding="#{personsTable}"
  ...>

 

As you probably know, the value attribute references the model from which the table takes its data; in the simplest and most straightforward form this is a List of objects. This attribute is the most important with respect to the goals set above. The selection attribute references a property into which information about the selected row(s) is stored, and binding references a property that holds the state information of the table. I use the properties referenced by selection and binding to manipulate the selected data in the action bean, which is EDTAction in this case.

 

Action bean - EDTAction

 

The action bean is here just to glue all the components together, so, without a deeper description, I list the important code snippet from EDTAction.

 

[Complete EDTAction source | https://martin-hynar-blog.googlecode.com/svn/trunk/richfaces-extendeddatatable/extendeddatatable/src/main/java/org/jboss/edt/EDTAction.java]

 

@Name("EDTAction")
public class EDTAction {

    @In(required=false)
    @Out(required=false)
    private HtmlExtendedDataTable personsTable;

    private SimpleSelection personSelection;

    @In(value = "PersonDataModel", create = true)
    private PersonDataModel personsDataModel;
    // In simple scenarios, list can be used without affecting page
    // private List<Person> personsDataModel;

    private Person selectedPerson;

    ...
}

 

The properties here are the ones referenced from the EDT in the page. The crucial point is that the model for the EDT is not a simple List, but the PersonDataModel component. See below.

 

EDT data model - PersonDataModel

 

This component is one of the two that are essential for fulfilling the stated requirements. It handles the requests from the EDT and holds the data actually displayed in it. This is what it looks like:

 

 

[Complete PersonDataModel source | https://martin-hynar-blog.googlecode.com/svn/trunk/richfaces-extendeddatatable/extendeddatatable/src/main/java/org/jboss/edt/PersonDataModel.java]

 

@Name("PersonDataModel")
@Scope(ScopeType.SESSION)
public class PersonDataModel
    extends ExtendedDataModel
    implements Serializable, Modifiable {
    ...
}

 

It extends ExtendedDataModel from the RichFaces API, which is the default model used for the EDT. If a simple list is used as the data container in EDTAction, then RichFaces will use ExtendedDataModel behind the scenes and all the data will sit in the list. This is not desired, so here the default data model is extended and, as described below, uses its own store for the data. The Modifiable interface is also implemented, which allows the model to be modified based on the filtering and sorting conditions entered in the EDT (see later).

 

So, what is inside PersonDataModel ...

 

public class PersonDataModel
    extends ExtendedDataModel
    implements Serializable, Modifiable {

    @In(value = "PersonDataProvider", create = true)
    private DataProvider<Person> dataProvider;

    private Object rowKey;
    private List<Object> wrappedKeys;
    private Map<Object, Person> wrappedData;
    private Integer rowCount;
    private Integer rowIndex;
    private SequenceRange memoRange;

 

The first and very important property of the model component is dataProvider. The reason for it is separation of concerns, following the division of functionality in RichFaces: PersonDataModel is not responsible for retrieving data from the data source, only for keeping the data of the EDT. Then we have rowKey, which denotes the record key for which the EDT will ask for data; wrappedKeys is the collection of keys of the currently displayed records; wrappedData is the mapping between a key and the actual data; rowCount is the total number of available rows and drives pagination; rowIndex contains the index of the currently selected row. The property memoRange caches the range that was displayed last time.

 

These properties are used in the methods inherited from ExtendedDataModel, and some are passed to dataProvider to control what is retrieved from the DB. The important methods are:

 

  • walk - computes the range of records to be displayed and calls back EDT with the key of the record to be rendered (visitor pattern used here). Note also usage of loadData method in case when wrappedKeys is null. This method uses dataProvider to load data for actual view.

 

    public void walk(FacesContext context, DataVisitor visitor, Range range, Object argument) throws IOException {

        int rowC = getRowCount();
        int firstRow = ((SequenceRange) range).getFirstRow();
        int numberOfRows = ((SequenceRange) range).getRows();
        if (numberOfRows <= 0) {
            numberOfRows = rowC;
        }

        if (wrappedKeys != null && memoRange != null
                && memoRange.getFirstRow() == firstRow
                && memoRange.getRows() == numberOfRows) {
            // Same range as last time: revisit the cached keys
            for (Object key : wrappedKeys) {
                setRowKey(key);
                visitor.process(context, key, argument);
            }
        } else {
            // Range changed: reload the data for the new view
            reset();
            wrappedKeys = new ArrayList<Object>();
            int endRow = firstRow + numberOfRows;
            if (endRow > rowC) {
                endRow = rowC;
            }

            for (Person item : loadData(firstRow, endRow)) {
                Object key = getKey(item);
                wrappedKeys.add(key);
                wrappedData.put(key, item);
                visitor.process(context, key, argument);
            }
        }
        memoRange = (SequenceRange) range;
    }

 

  • getRowCount - provides the number of available records for calculating the pagination. If the number is already known (meaning nothing changed between re-renderings of the EDT), it is returned without asking the DB (assuming no data was added or removed).

 

    public int getRowCount() {
        if (rowCount == null) {
            rowCount = Integer.valueOf(dataProvider.getRowCount());
        }
        return rowCount.intValue();
    }

 

  • getObjectByKey - returns the actual record from wrappedData, or asks dataProvider to load the object from the database.

 

    public Person getObjectByKey(Object key) {
        Person t = wrappedData.get(key);
        if (t == null) {
            t = dataProvider.getItemByKey(key);
            wrappedData.put(key, t);
        }
        return t;
    }

 

  • modify - a method inherited from Modifiable that is called by the EDT when the user enters a filter or asks for sorting. It passes the conditions to dataProvider and resets PersonDataModel (meaning all state information is invalidated, because filtering or sorting will cause different records to be displayed). The complete method (not fully listed here) must contain logic that compares the current and the new filters to detect whether a change occurred. If the filters are unchanged, the data cached in this component can be preserved; otherwise, new data must be retrieved.

 

    public void modify(List<FilterField> filterFields, List<SortField2> sortFields) {
        if (dataProvider instanceof Sortable2) {
            ((Sortable2) dataProvider).setSortFields(sortFields);
        }

        if (dataProvider instanceof Filterable) {
            ((Filterable) dataProvider).setFilterFields(filterFields);
        }

        // Logic that compares current
        // and new filters if there is a change.
        // If the filters are intact, reset is not called
        reset();
    }

 

Feeding data to the model - PersonDataProvider

 

The last piece of the puzzle is the PersonDataProvider component, which is responsible for getting the relevant records from the DB. The class signature is the following:

 

[ Complete PersonDataProvider source | https://martin-hynar-blog.googlecode.com/svn/trunk/richfaces-extendeddatatable/extendeddatatable/src/main/java/org/jboss/edt/PersonDataProvider.java]

 

@Name("PersonDataProvider")
@Scope(ScopeType.SESSION)
public class PersonDataProvider
    implements DataProvider<Person>, Sortable2, Filterable {
    ...
}

 

The class implements 3 interfaces, of which DataProvider defines the basic contract used by PersonDataModel: asking for a record range, individual records by key, the key of a record, and the count of available records. The interfaces Sortable2 and Filterable are used to pass the sorting and filtering constraints.

 

The implementation of PersonDataProvider is quite self-descriptive, so I won't spend much space on it. It works with several inputs provided by PersonDataModel:

 

  • the range of records to be displayed, in cooperation with pagination - reflected in getItemsByRange(int firstRow, int numberOfRows)
  • the filters for particular entity properties - passed in setFilterFields(List<FilterField> filterFields)
  • the sort conditions - passed in setSortFields(List<SortField2> sortFields)
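The window arithmetic behind the range (clamping firstRow plus numberOfRows against the total row count, as the model's walk method does before asking for data) can be sketched in plain Java. The helper below is hypothetical, not taken from the companion code:

```java
public class PageWindow {

    /**
     * Returns {firstRow, endRow} with the end clamped so the window
     * never runs past the total number of rows.
     */
    static int[] clamp(int firstRow, int numberOfRows, int rowCount) {
        if (numberOfRows <= 0) {
            numberOfRows = rowCount; // non-positive page size means "all rows"
        }
        int endRow = Math.min(firstRow + numberOfRows, rowCount);
        return new int[] { firstRow, endRow };
    }

    public static void main(String[] args) {
        // last page: 47 records total, page size 20, starting at row 40
        int[] w = clamp(40, 20, 47);
        System.out.println(w[0] + ".." + w[1]);
    }
}
```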

 

As you can see in the code, I used the Criteria API for dynamic query building. Using this API and the conditions mentioned, you can end up with a generated query like this:

 

select
  top ? this_.id as id7_0_,
  this_.city as city7_0_,
  this_.email as email7_0_,
  this_.first as first7_0_,
  this_.last as last7_0_,
  this_.street as street7_0_,
  this_.zip as zip7_0_ 
from
  Person this_ 
order by
  this_.id asc

 

Few tips

 

  • In the data provider, you can control the laziness of the relations using the Criteria API and load only the properties necessary for rendering the EDT.

 

Criteria criteria = ((org.hibernate.Session) entityManager.getDelegate()).createCriteria(Person.class);
criteria.setFetchMode("kids", FetchMode.SELECT);

 

  • If you work with more complex data where building queries with the Criteria API becomes complicated (e.g. filtering/sorting on properties from related tables), it is easier to use a permutation of predefined HQL queries. This also allows constructs that are not possible in the Criteria API (e.g. a DB-based SELECT UNIQUE ...).

 

Takeaway

 

In this post, the complete code was not listed, but you can still get it via the links above the listings. If you are interested in trying a simple application that lists data in an EDT customized with this solution, you can anonymously check out the Mavenized Eclipse project from https://martin-hynar-blog.googlecode.com/svn/trunk/richfaces-extendeddatatable/extendeddatatable

 

I have verified the solution on JBoss AS 6.1.0.Final; you can reach the person listing at http://localhost:8080/edt

 

The persistence in the WAR is configured so that SQL commands are printed to the server log. There you can see that only a limited amount of data is retrieved from the DB, not all records. For instant use, example records are auto-inserted. Note also that the application uses DefaultDS and Hibernate is configured with the create-drop option.

 

References

Recently I came across JBPM and wanted to start with a small application. It was easy to download the latest JBPM package, install it and import the examples into Eclipse to check things out. However, I am more of a Maven person, so I wanted to create the application with that setup. The following paragraphs describe how to build a mavenized standalone application that makes use of JBPM.

 

Step 1: Make simple Maven project

First of all, create a simple Maven application:

 

mvn -DgroupId=com.example -DartifactId=jbpm-standalone -Dversion=1.0 archetype:generate

 

Select the archetype named maven-archetype-quickstart and confirm creation. This gives you a basic Maven project. Now import it into Eclipse as an existing Maven project and see what's inside. Pretty simple and boring so far.

 

 

Step 2: Update dependencies

To make use of JBPM in this Maven project, we need to add the dependencies necessary for the application to compile and run. I ended up with the POM listed below.

 

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>jbpm-standalone</artifactId>
    <version>1.0</version>
    <packaging>jar</packaging>

    <name>jbpm-standalone</name>
    <url>http://jboss.org</url>

    <properties>
        <repository.jboss.org.url>http://repository.jboss.org/maven2</repository.jboss.org.url>
        <repository.maven2.dev.java.net.url>http://download.java.net/maven/2/</repository.maven2.dev.java.net.url>
        <repository.ibiblio.url>http://www.ibiblio.org/maven2</repository.ibiblio.url>
        <repository.jboss.org.nexus.url>https://repository.jboss.org/nexus/content/groups/public-jboss</repository.jboss.org.nexus.url>

        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>

        <jdk.debug>true</jdk.debug>
        <jdk.optimize>false</jdk.optimize>
        <jdk.source>1.6</jdk.source>
        <jdk.target>1.6</jdk.target>

        <version.jbpm>4.4</version.jbpm>
        <version.xerces>2.9.1</version.xerces>
        <version.javax.mail>1.4.1</version.javax.mail>
        <version.slf4j>1.6.1</version.slf4j>
        <version.testng>5.14</version.testng>
        <version.javassist>3.4.GA</version.javassist>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.jbpm.jbpm4</groupId>
            <artifactId>jbpm-api</artifactId>
            <version>${version.jbpm}</version>
        </dependency>
        <dependency>
            <groupId>org.jbpm.jbpm4</groupId>
            <artifactId>jbpm-jpdl</artifactId>
            <version>${version.jbpm}</version>
        </dependency>
        <dependency>
            <groupId>org.jbpm.jbpm4</groupId>
            <artifactId>jbpm-pvm</artifactId>
            <version>${version.jbpm}</version>
        </dependency>
        <dependency>
            <groupId>org.jbpm.jbpm4</groupId>
            <artifactId>jbpm-log</artifactId>
            <version>${version.jbpm}</version>
        </dependency>
        <dependency>
            <groupId>org.jbpm.jbpm4</groupId>
            <artifactId>jbpm-db</artifactId>
            <version>${version.jbpm}</version>
        </dependency>
        <dependency>
            <groupId>xerces</groupId>
            <artifactId>xercesImpl</artifactId>
            <version>${version.xerces}</version>
        </dependency>
        <dependency>
            <groupId>javax.mail</groupId>
            <artifactId>mail</artifactId>
            <version>${version.javax.mail}</version>
        </dependency>
        <dependency>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
            <version>${version.testng}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${version.slf4j}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-jdk14</artifactId>
            <version>${version.slf4j}</version>
        </dependency>
        <dependency>
            <groupId>javassist</groupId>
            <artifactId>javassist</artifactId>
            <version>${version.javassist}</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>2.3.1</version>
                <configuration>
                    <source>${jdk.source}</source>
                    <target>${jdk.target}</target>
                    <encoding>utf-8</encoding>
                    <debug>${jdk.debug}</debug>
                    <optimize>${jdk.optimize}</optimize>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>java</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <mainClass>com.example.App</mainClass>
                </configuration>
            </plugin>
        </plugins>
    </build>
    <repositories>
        <repository>
            <id>repository.jboss.org</id>
            <name>JBoss Repository</name>
            <url>${repository.jboss.org.url}</url>
            <layout>default</layout>
        </repository>
        <repository>
            <id>repository.maven2.dev.java.net</id>
            <name>Java.net Repository for Maven</name>
            <url>${repository.maven2.dev.java.net.url}</url>
            <layout>default</layout>
        </repository>
        <repository>
            <id>repository.ibiblio</id>
            <name>IBiblio Maven Repository</name>
            <url>${repository.ibiblio.url}</url>
            <layout>default</layout>
        </repository>
    </repositories>
</project> 

 

Step 3: Define simple workflow

For JBPM to run, you must have some process to execute. It is not an extensive or complex process, but it is sufficient for a starter app. Below are the steps of the process:

  1. Starts ...
  2. Greets ...
  3. Waits in intermediate state ...
  4. Prints that it is ending ...
  5. Ends.

 

... and the process file itself ...

 

 

<?xml version="1.0" encoding="UTF-8"?> 
<process name="Activity" xmlns="http://jbpm.org/4.4/jpdl"> 
    <start g="27,51,80,40">
        <transition g="-18,4" name="greet" to="say hello" />
    </start>
 
    <java class="com.example.OutputActivity" g="141,49,87,50" method="say"
        name="say hello">
        <arg>
            <string value="Hello!" />
        </arg>
        <transition g="-28,4" name="wait for end" to="proceed to end" />
    </java>

    <state g="312,47,92,52" name="proceed to end">
        <transition g="-23,4" name="to ending" to="ending" />
    </state>

    <java class="com.example.OutputActivity" g="465,48,92,52" method="say"
        name="ending">
        <arg>
            <string value="Bye!" />
        </arg>
        <transition g="-19,4" name="to end" to="end" />
    </java>

    <end g="619,51,80,40" name="end" />

</process>

 

The class that is executed by the process is shipped in the archive attached to this post. It is very simple...
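The actual class ships only in the archive, but given how the process file invokes it (method="say" with a single string argument), a minimal sketch could look like this:

```java
// Sketch of com.example.OutputActivity as invoked by the process file above;
// the real class is shipped in the attached archive.
public class OutputActivity {

    // JBPM calls this method (method="say") with the <string> value
    // declared in the <arg> element of the <java> activity.
    public void say(String message) {
        System.out.println(message);
    }
}
```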

 

Step 4: Configuration files

 

Each JBPM-enabled application needs a specific configuration file on the classpath. By default it is named jbpm.cfg.xml, and we keep this name here as well. The purpose of this configuration file is to set up the various services that JBPM makes use of during execution. The details of this configuration are described in JBPM configuration; here we focus only on setting up the basic app. So, the configuration file, placed into src/main/resources, looks like this:

 

<?xml version="1.0" encoding="UTF-8"?>
<jbpm-configuration>
 
 <import resource="jbpm.default.cfg.xml" />
 <import resource="jbpm.tx.hibernate.cfg.xml" />
 <import resource="jbpm.jpdl.cfg.xml" />
 <import resource="jbpm.identity.cfg.xml" />
 
</jbpm-configuration>

 

In addition, as JBPM makes use of Hibernate to offload data about processes into a database, a Hibernate configuration is necessary as well. Here we stick to the simplest possible configuration - putting the data into an in-memory HSQLDB. For this we need the file jbpm.hibernate.cfg.xml in the same location mentioned above. The contents are the following:

 

<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
          "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
          "http://hibernate.sourceforge.net/hibernate-configuration-3.0.dtd">

<hibernate-configuration>
  <session-factory>
  
     <property name="hibernate.dialect">org.hibernate.dialect.HSQLDialect</property>
     <property name="hibernate.connection.driver_class">org.hsqldb.jdbcDriver</property>
     <property name="hibernate.connection.url">jdbc:hsqldb:mem:.</property>
     <property name="hibernate.connection.username">sa</property>
     <property name="hibernate.connection.password"></property>
     <property name="hibernate.hbm2ddl.auto">create-drop</property>
     <property name="hibernate.format_sql">true</property>
     
     <mapping resource="jbpm.repository.hbm.xml" />
     <mapping resource="jbpm.execution.hbm.xml" />
     <mapping resource="jbpm.history.hbm.xml" />
     <mapping resource="jbpm.task.hbm.xml" />
     <mapping resource="jbpm.identity.hbm.xml" />
     
  </session-factory>
</hibernate-configuration>

 

Step 5: The code

The last step is to set up the process engine, command it to execute and go through the states. This is done in a couple of simple steps that are listed and explained in the code snippet below:

 

package com.example;
 
import org.jbpm.api.Configuration;
import org.jbpm.api.Execution;
import org.jbpm.api.ExecutionService;
import org.jbpm.api.ProcessEngine;
import org.jbpm.api.ProcessInstance;
import org.jbpm.api.RepositoryService;
 
public class App {
 
    public static void main(String[] args) {
        /*
         * Create process engine from configuration. Here the configuration
         * object represents default jbpm.cfg.xml file.
         */
        ProcessEngine processEngine = Configuration.getProcessEngine();
 
        /*
         * Repository service provides access to resources that are known to
         * JBPM, including loaded processes.
         */
        RepositoryService repositoryService = processEngine.getRepositoryService();
 
        /*
         * Execution service is the handle to execute state transitions in
         * process engine.
         */
        ExecutionService executionService = processEngine.getExecutionService();
 
        /*
         * Now query repository service for the process that we have created and
         * deploy it.
         */
        repositoryService.createDeployment().addResourceFromClasspath(
                "process.jpdl.xml").deploy();
 
        /*
         * Start the process instance with the name of the process. By starting
         * process, new ProcessInstance object is created. In our example flow,
         * after the initial state, there is 'java' state that gets executed and
         * the transition continues to next 'state' where it waits to be
         * commanded.
         */
        ProcessInstance processInstance = executionService
                .startProcessInstanceByKey("Activity");
 
        /*
         * We are now in 'state' named 'proceed to end'. We must ask process
         * instance to find active execution in this particular state.
         */
        Execution execution = processInstance
                .findActiveExecutionIn("proceed to end");
 
        /*
         * Finally we ask execution service to signal that transition from the
         * state shall occur. By this, the rest of the process is executed and
         * the flow ends.
         */
        processInstance = executionService.signalExecutionById(execution
                .getId());
    }
} 

 

Step 6: Package and execute

At this point, it is possible to compile the whole application and execute it. You can do that by running the following command (it both builds and executes the app). Note that some of the files were not listed here, but they are included in the takeaway archive below this post.

 

mvn clean package exec:java

 

exec:java invokes the exec-maven-plugin, which runs the com.example.App class. If you change the name of the main class, you must also change the configuration of the plugin in pom.xml.

 

After executing it, you will see a long listing of various messages, with the texts Hello! and Bye! at the very end. If you see those, you've succeeded!

 

Takeaway

Attached to this post, there is an archive with the project just described. It is a ready-made Eclipse project that you just need to execute. In addition, there is a pair of launchers, one that builds the application and one that executes it. You will find them under "Run configurations".

Recently, I used the Composite pattern (http://en.wikipedia.org/wiki/Composite_pattern) in one of my projects. The pattern was applied to model objects that had to be persisted in a database. And, not surprisingly, JPA with Hibernate as the persistence provider was used. The problem causes no headache in its simple form, but might become complex ... (as everything in real life). Nevertheless, I decided to publish my solution to help others and give a sort of baseline ...

 


Background

To refresh your memory on the Composite pattern: we have three classes (I say classes because it is not yet possible to map interfaces). The first serves as a common base and declares the common business methods. The second is a node in the composite structure that can contain other nodes. The third is a leaf that only implements the business methods.
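Ignoring persistence for a moment, the pattern itself can be sketched in plain Java (the names here are illustrative only; the persistable classes follow below):

```java
import java.util.HashSet;
import java.util.Set;

// Common base: declares the business method shared by nodes and leaves.
abstract class Component {
    abstract String render();
}

// Leaf: only implements the business method.
class Leaf extends Component {
    private final String value;

    Leaf(String value) {
        this.value = value;
    }

    String render() {
        return value;
    }
}

// Node: contains other components and delegates the business method to them.
class Node extends Component {
    private final Set<Component> children = new HashSet<Component>();

    void add(Component child) {
        children.add(child);
    }

    String render() {
        StringBuilder sb = new StringBuilder();
        for (Component child : children) {
            sb.append(child.render()).append(' ');
        }
        return sb.toString().trim();
    }
}
```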

 

Persistable implementation

The solution consists of a couple of classes; let's start with AbstractComposite, which represents the common base. The code is listed below:

 

@Entity(name = AbstractComposite.TABLE_NAME)
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "compositeType", discriminatorType = DiscriminatorType.STRING)
@NamedQueries(value = { @NamedQuery(name = "GetCompositeById", query = "from "
        + AbstractComposite.TABLE_NAME + " where id = :id") })
public abstract class AbstractComposite implements Serializable {

    private static final long serialVersionUID = 7139007918101901734L;
    public static final String TABLE_NAME = "Composite_Table";

    @Id
    @GeneratedValue(strategy = GenerationType.TABLE, generator = "Id_Generator")
    @TableGenerator(name = "Id_Generator", table = "Sequence_Table", pkColumnName = "name", valueColumnName = "count", pkColumnValue = "Composite_Sequence")
    private Long id;

    public AbstractComposite() {
    }

    public void setId(Long id) {
        this.id = id;
    }

    public Long getId() {
        return id;
    }

    // Whatever business operations come here
}

 

As the Composite structure indicates, we must cope with an inheritance hierarchy. In this case, I have chosen SINGLE_TABLE because I have only a small number of leaf types. In fact, each leaf type adds at least one column to the table. If you are going to have many extensive leaf types, then the JOINED strategy will avoid a very sparse single table.
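For comparison, switching to joined tables would only change the inheritance strategy on the base class; a sketch (assuming the same AbstractComposite as above):

```java
@Entity(name = AbstractComposite.TABLE_NAME)
@Inheritance(strategy = InheritanceType.JOINED)
public abstract class AbstractComposite implements Serializable {
    // ... unchanged; each subclass now gets its own table,
    // joined to the base table via the primary key.
}
```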

 

Then, there is also a DiscriminatorColumn defined, together with its type. Each leaf is then supposed to define a unique DiscriminatorValue.

 

So far, so good. Let us check the leaf implementation. I have created a leaf called CompositeString that holds a String value. The code is listed below:

 

@Entity
@DiscriminatorValue("string")
public class CompositeString extends AbstractComposite {
    private static final long serialVersionUID = -7787631697361429875L;

    @Column(name = "stringValue")
    private String value;

    public CompositeString() {
    }

    public CompositeString(String value) {
        this.value = value;
    }

    public String getValue() {
        return value;
    }

    public void setValue(String value) {
        this.value = value;
    }

    @Override
    public String toString() {
        return String.format("CompositeString [value = %s]", value);
    }
}

 

As I mentioned above, DiscriminatorValue defines the value that will distinguish this composite type from the others. In addition, I have defined the attribute value, whose database column name is set to stringValue. By redefining the column name using the Column annotation, you can have the same attribute in all leaf types (check in the attached archive).

 

The last, and probably the most complex, is the intermediate node, which must hold references to the contained nodes or leaves. Let's check the code:

 

@Entity
@DiscriminatorValue("multi")
public class CompositeMultiValue extends AbstractComposite {

    private static final long serialVersionUID = -4324727276077386614L;

    @OneToMany(cascade = { CascadeType.ALL }, fetch = FetchType.EAGER)
    @JoinTable(name = "Composite_Multi_Values_Table", joinColumns = @JoinColumn(name = "compositeId"), inverseJoinColumns = @JoinColumn(name = "multiValueId"))
    private Set<AbstractComposite> values;

    public CompositeMultiValue() {
    }

    public CompositeMultiValue(Set<AbstractComposite> values) {
        this.values = values;
    }

    public void setValues(Set<AbstractComposite> values) {
        this.values = values;
    }

    public Set<AbstractComposite> getValues() {
        return values;
    }

    @Override
    public String toString() {
        return String.format("CompositeMultiValue [values=%s]", values);
    }
}

 

This entity again defines a value that discriminates it from the others. Then, it contains the set of contained nodes and leaves. These are linked to the actual "multivalue" using a join table with two columns, both holding an identifier from Composite_Table. The first identifier is the parent (compositeId) and the second is the child (multiValueId). In this particular case, cascading is set to ALL and fetching to EAGER, but in specific cases this might be changed appropriately.

 

Note on storage of enums

By default (at least on Oracle), enum values are stored as RAW values. You can observe this if you change the database connection parameters to Oracle in src/test/resources/META-INF/persistence.xml; the resulting type for the column enumValue is RAW. If you want to store your enums as String values, you need to follow the guidelines here: http://community.jboss.org/wiki/Java5EnumUserType.
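As a side note, if you can rely on JPA annotations rather than a custom Hibernate UserType, the standard @Enumerated annotation stores the constant name as text; a minimal sketch (the entity and enum here are illustrative, not part of the archive):

```java
enum SampleColor {
    RED, GREEN
}

@Entity
public class SampleEntity {
    // Persisted as the literal text "RED" or "GREEN"
    // instead of a RAW/ordinal representation.
    @Enumerated(EnumType.STRING)
    @Column(name = "enumValue")
    private SampleColor color;
}
```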

Takeaway

The code used in this post is shipped in the attached archive as a mavenized Eclipse project. There is also a simple test that demonstrates usage of the composite structure. Of interest after running the test might also be the file target/hibernate3/sql/schema.ddl, which contains the DDL scripts for creating the database structures. Enjoy!

This post describes a way to authenticate and authorize users in an EAR application. This is nothing special, but there can be some problems depending on the requirements in place. The tasks to be solved are the following:

 

  • Users can access the system via the WUI by logging in.
  • Other applications can access the system via a WebService. The account used for access must be in a specific role.
  • User credentials are stored in a database.
  • Seam (version 2) is used.
  • A custom security domain is defined.
  • Authentication and authorization are covered by JAAS.

 


Custom security domain definition

To define a new security domain, there is a dedicated configuration file in JBoss, located at $JBOSS_HOME/server/$SERVER_PROFILE/conf/login-config.xml. It is possible to define the new security domain there or in a separate configuration file that is shipped together with the EAR. We use the second alternative here. The configuration file, named e.g. custom-login-config-beans.xml, is then packed into the EAR's META-INF folder.

 

<?xml version="1.0" encoding="UTF-8"?>
<deployment xmlns="urn:jboss:bean-deployer:2.0">
    <application-policy xmlns="urn:jboss:security-beans:1.0" name="CustomSecurityDomain">
         <authentication>
            <login-module code="com.example.CustomAuthModule"
                flag="required">
                <module-option name="entityManagerFactoryJndiName">java:/EntityManagerFactory</module-option>
                <module-option name="unauthenticatedIdentity">anonymous</module-option>
                <module-option name="hashAlgorithm">md5</module-option>
            </login-module>
        </authentication>
    </application-policy>
</deployment>

 

This configuration file defines a new security domain - CustomSecurityDomain. It declares the class that provides the login module for this security domain - com.example.CustomAuthModule. It declares the JNDI name of the EntityManagerFactory that provides the persistence facility used to fetch credentials. It declares the identity of an unauthenticated user (some resources might be accessible to unauthenticated users as well). And it declares the hash algorithm used so that plain-text passwords are never stored.
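For illustration, the value stored as the user's password hash must be produced with the same algorithm the module is configured with. Below is a sketch of MD5 hashing with hex encoding - note that hex output is an assumption here; JBoss's UsernamePasswordLoginModule also has a hashEncoding option that must match whatever encoding you store:

```java
import java.security.MessageDigest;

// Sketch: compute an MD5 digest and encode it as lowercase hex, suitable
// for storing as a user's password hash (assuming the login module is
// configured to expect hex encoding).
public class PasswordHasher {

    public static String md5Hex(String password) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(password.getBytes("UTF-8"));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```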

 

The com.example.CustomAuthModule itself

To implement a custom authentication module that could access user credentials in dozens of exotic places, we would need a class that inherits from org.jboss.security.auth.spi.AbstractServerLoginModule. In this particular case, as we have the credentials in a database, we inherit from org.jboss.security.auth.spi.UsernamePasswordLoginModule, which is more appropriate. The code of CustomAuthModule is the following:

 

 

package com.example;

import java.security.Principal;
import java.security.acl.Group;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
 
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.security.auth.Subject;
import javax.security.auth.callback.CallbackHandler;
import javax.security.auth.login.LoginException;

import com.example.UserAccount;
import com.example.UserRole;

import org.jboss.security.SimpleGroup;
import org.jboss.security.auth.spi.UsernamePasswordLoginModule;

public class CustomAuthModule extends UsernamePasswordLoginModule {
    private EntityManager entityManager;
    private static final String EMF_JNDI_CONFIG_KEY = "entityManagerFactoryJndiName";

    /**
     * Initialize this LoginModule.
     */
    public void initialize(Subject subject, CallbackHandler callbackHandler, Map<String, ?> sharedState, Map<String, ?> options) {
        super.initialize(subject, callbackHandler, sharedState, options);
        InitialContext ctx;
        try {
            ctx = new InitialContext();
            // Get the name of EMF JNDI
            String jndiEntityManagerFactory = options.get(EMF_JNDI_CONFIG_KEY).toString();
            EntityManagerFactory factory = null;
            factory = (EntityManagerFactory) ctx.lookup(jndiEntityManagerFactory);
            entityManager = factory.createEntityManager();
        } catch (NamingException e) {
            e.printStackTrace();
        }
    }

    @Override
    protected String getUsersPassword() throws LoginException {
        String username = getIdentity().getName();
        UserAccount user = (UserAccount) entityManager.createQuery("from UserAccount where username = :username").setParameter("username", username)
                .getSingleResult();
        return user.getPasswordHash();
    }

    @Override
    protected Group[] getRoleSets() throws LoginException {
        HashMap<String, Group> groupMap = new HashMap<String, Group>();
 
        String username = getIdentity().getName();
        UserAccount user = (UserAccount) entityManager.createQuery("from UserAccount where username = :username").setParameter("username", username)
                .getSingleResult();
 
        String defaultGroupName = "Roles";
        Group defaultGroup = new SimpleGroup(defaultGroupName);
        groupMap.put(defaultGroupName, defaultGroup);

        List<UserRole> roles = user.getRoles();
        for (UserRole userRole : roles) {
            Principal p;
            try {
                p = createIdentity(userRole.getRolename());
                defaultGroup.addMember(p);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
 
        Group[] roleSets = new Group[groupMap.size()];
        groupMap.values().toArray(roleSets);
        return roleSets;
    }
}

 

We need to override three methods:

  • initialize - registers the authentication module with the JAAS infrastructure and populates the reference to the entity manager via the entity manager factory declared in the configuration. Note that to make use of the entity manager, you have to provide the appropriate persistence configuration, which is not part of this post.
  • getUsersPassword - retrieves the user's password from the database, using the name that is available via the getIdentity method.
  • getRoleSets - gets the roles of the user. This example considers only one level of roles (meaning that a role cannot contain a group of other roles).

 

The classes UserAccount and UserRole are simple entity classes: UserRole consists of a rolename, while UserAccount consists of a username, a passwordHash and a list of UserRole.
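These two entities are not shown in this post; a minimal sketch, assuming the field and accessor names used by the login module above, might look like this:

```java
@Entity
class UserRole {
    @Id
    @GeneratedValue
    private Long id;

    private String rolename;

    public String getRolename() {
        return rolename;
    }
}

@Entity
class UserAccount {
    @Id
    @GeneratedValue
    private Long id;

    private String username;
    private String passwordHash;

    @ManyToMany(fetch = FetchType.EAGER)
    private List<UserRole> roles;

    public String getPasswordHash() {
        return passwordHash;
    }

    public List<UserRole> getRoles() {
        return roles;
    }
}
```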

Enable new authentication module in Seam

To allow Seam to make use of the custom authentication module defined above, we need some configuration in components.xml, which is shipped in the WEB module of the EAR. The necessary configuration is the following.

 

 

<?xml version="1.0" encoding="UTF-8"?>
<components xmlns="http://jboss.com/products/seam/components"
    xmlns:security="http://jboss.com/products/seam/security"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://jboss.com/products/seam/components http://jboss.com/products/seam/components-2.2.xsd http://jboss.com/products/seam/security http://jboss.com/products/seam/security-2.2.xsd">
    <!-- other configuration -->
    <security:identity jaas-config-name="CustomSecurityDomain" />
</components>

 

This tells Seam to use JAAS as the authentication mechanism and names the security domain that shall be used for this purpose.

Login page

The login page is very simple and uses embedded Seam Identity. The code is self-explanatory.

 

<h:form id="loginForm">  
    <div>
        <h:outputLabel for="username">Username</h:outputLabel>
        <h:inputText id="username" value="#{credentials.username}"/>
    </div>
    <div>
        <h:outputLabel for="password">Password</h:outputLabel>
        <h:inputSecret id="password" value="#{credentials.password}"/>
    </div>
    <div>
        <h:commandButton id="submit" value="Login" action="#{identity.login}"/>
    </div>
</h:form>

 

To refresh your memory: #{credentials.username} is an EL reference to the Seam component named credentials and its property username (the same applies to #{credentials.password}). #{identity.login} is an EL invocation of the method login on the Seam built-in component identity.

 

WebService

Finally, we wanted a WebService that shares the same authentication mechanism as the WUI. With the help of annotations, this is quite straightforward and no extensive configuration is needed. Here is an example WebService that makes use of CustomSecurityDomain and allows only users in the role external to execute its methods. The code again does not require any closer explanation, as the annotations are self-descriptive.

 

 

package com.example;

import javax.annotation.security.RolesAllowed;
import javax.ejb.Stateless;
import javax.jws.WebService;
import javax.jws.soap.SOAPBinding;

import org.jboss.ejb3.annotation.SecurityDomain;
import org.jboss.wsf.spi.annotation.WebContext;

@WebContext(contextRoot = "/webservices", urlPattern = "/TestService", secureWSDLAccess = false)
@SecurityDomain(value = "CustomSecurityDomain")
@RolesAllowed("external")
@SOAPBinding(style = SOAPBinding.Style.DOCUMENT, use = SOAPBinding.Use.LITERAL, parameterStyle = SOAPBinding.ParameterStyle.WRAPPED)
@WebService(name = "TestService", targetNamespace = "urn:com:example", endpointInterface = "com.example.TestService")
@Stateless
public class TestServiceSession implements TestService {
    // WebMethods from TestService interface that gets generated from WSDL
}

 

With this you are able to reuse the authentication and authorization facility both in the web user interface and in backend webservices. By providing a different implementation of the security domain, you can change the source of your user credentials.