4 Replies Latest reply on Jun 10, 2008 3:14 AM by mvaldes

    Performance and Validation

      Our validation team has been working for a couple of weeks on setting up a test and validation infrastructure for both the XPDL and BPEL extensions. This infrastructure has allowed us to stabilize the latest release candidates of the Nova Bonita (XPDL) and Nova Orchestra (BPEL) projects on top of the PVM.

      I would like to do the same work on the PVM before releasing the 1.0 version, so in this post I'll describe our test infrastructure, the hardware and software configuration, as well as the nature of those tests. Please let me know what you think...

      Objectives:

      1.- To obtain detailed information about the project's behaviour and system resource consumption (CPU, network, GC, memory...) for a particular load injection. Starting from a particular configuration, we will push the projects to their limits in terms of the number of simultaneous users in parallel.

      2.- To check project stability over time while supporting a constant load during a long period of time.

      Software and hardware configuration

      1.- The load injection framework we use is CLIF (http://clif.ow2.org/). Clif is a framework written in Java that provides performance-testing capabilities for any type of system. Clif's main features are:
      - deployment, remote management and monitoring capabilities in a distributed environment (both injectors and probes)
      - collection of data related to resources consumption

      2.- Hardware configuration is the following: 3 servers + 1 PC
      - Clif load injector(s): 2 CPU Intel Xeon, 3 GHz with 8 GB RAM, running RedHat Linux AS
      - Project server (either Nova Bonita or Nova Orchestra): 4 CPU Intel Xeon, 4 GHz with 8 GB RAM, running Debian with both the BEA and Sun JDK 1.5
      - Database: 4 CPU IA-64/Itanium 2, 1.5 GHz with 6 GB RAM, running Debian with PostgreSQL 8.3.0-1, Oracle 10.2.0.3 and MySQL 5.0.51a-3 databases
      - Clif supervision console: PC

      A Clif injector is a Java-based class (called a plugin) that generates a sequence of queries with a defined think time in between. While loops, if statements and other conditional constructs can also be added.

      HTTP, RMI, DNS, JDBC, TCP/IP, LDAP and other protocols can be used from within the Clif injectors.
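      To make the injector idea concrete, here is a minimal sketch of what such a scenario does conceptually: issue queries in sequence with a think time between them. This is not the real CLIF plugin API; the class and method names are illustrative stand-ins.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only; the real CLIF plugin interface differs.
public class InjectorSketch {
    private final long thinkTimeMs;
    private final List<String> log = new ArrayList<>();

    public InjectorSketch(long thinkTimeMs) {
        this.thinkTimeMs = thinkTimeMs;
    }

    // Pause between queries (the "think time" of a virtual user).
    private void think() {
        try {
            Thread.sleep(thinkTimeMs);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Run a fixed sequence of queries; loops and conditionals
    // could be woven into this sequence as the post describes.
    public List<String> runScenario(String[] queries) {
        for (String query : queries) {
            log.add("sent: " + query); // stand-in for a real protocol call (HTTP, JDBC...)
            think();
        }
        return log;
    }

    public static void main(String[] args) {
        InjectorSketch injector = new InjectorSketch(10);
        List<String> result = injector.runScenario(
                new String[] {"createInstance", "getTodo", "execute"});
        System.out.println(result.size()); // 3 queries sent
    }
}
```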

      Test scenarios

      Our validation team has developed two Clif injectors (one for Nova Bonita and the other for Nova Orchestra):

      - The Nova Bonita injector leverages the Java API (accessible either as POJOs or as Session Beans). After a workflow process is deployed, each Clif scenario (virtual user) is composed of the following operations: workflow instance creation, getTodo queries and manual activity execution. A scenario represents the complete creation and execution of a workflow instance by the same user. Think times can be added between each of those operations.

      - In Nova Orchestra, the injector uses the HTTP protocol to call a web service that creates an instance of a BPEL process.

      Different XPDL and BPEL processes are currently used in those scenarios.
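      The shape of one Nova Bonita scenario (create an instance, poll the todo list, execute manual activities until the instance completes) can be sketched as follows. The workflow API here is a hypothetical in-memory stand-in; the actual Nova Bonita method names and signatures may differ.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BonitaScenarioSketch {
    // Hypothetical in-memory stand-in for the workflow API;
    // not the real Nova Bonita interface.
    static class FakeWorkflowApi {
        private final Deque<String> todo = new ArrayDeque<>();

        void createInstance(String process) {
            // Pretend the process definition yields two manual tasks.
            todo.add(process + ":task1");
            todo.add(process + ":task2");
        }

        String getTodo() { return todo.peek(); }

        void executeActivity(String activity) { todo.remove(activity); }

        boolean finished() { return todo.isEmpty(); }
    }

    // One virtual-user scenario: instance creation, then repeated
    // getTodo queries and manual activity executions until done.
    public static int runScenario(FakeWorkflowApi api, String process) {
        api.createInstance(process);          // workflow instance creation
        int executed = 0;
        while (!api.finished()) {
            String activity = api.getTodo();  // getTodo query
            api.executeActivity(activity);    // manual activity execution
            executed++;
        }
        return executed;
    }

    public static void main(String[] args) {
        System.out.println(runScenario(new FakeWorkflowApi(), "approval")); // 2
    }
}
```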

      Load profile

      We have basically defined two different kinds of tests:
      - Heavy load tests: short test scenarios (around 1 h.) in which the load is increased after each defined period of time
      - Stability and robustness tests: long-running test scenarios in which the injector maintains a constant load for several hours.
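      The two profiles differ only in how the number of virtual users evolves over time. A small sketch, with purely illustrative numbers:

```java
public class LoadProfileSketch {
    // Heavy load test: step-wise ramp, adding usersPerStep virtual
    // users every stepSeconds.
    public static int rampUsers(int baseUsers, int usersPerStep,
                                int stepSeconds, int elapsedSeconds) {
        return baseUsers + (elapsedSeconds / stepSeconds) * usersPerStep;
    }

    // Stability test: the same load for the whole run.
    public static int constantUsers(int users, int elapsedSeconds) {
        return users;
    }

    public static void main(String[] args) {
        // After 10 minutes, starting at 100 users with +50 every 5 minutes:
        System.out.println(rampUsers(100, 50, 300, 600)); // 200
        // A stability run holds its load regardless of elapsed time:
        System.out.println(constantUsers(100, 7200));     // 100
    }
}
```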

      Measurements

      - CPU consumption for the 3 servers: injector, project and database
      - Average query execution time
      - Queries/second
      - 1 query is considered a complete execution of an XPDL/BPEL instance
      - Status of other resources such as hard disks and the network
      - JVM behaviour through GC analysis and the percentage of memory used, which allows us to easily find memory leaks
      - DB-related data: top 10 database queries, both by execution time and by execution frequency…
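      The two headline numbers above are derived straightforwardly from raw timings, where one "query" is a complete execution of an XPDL/BPEL instance as defined in the post. A minimal sketch:

```java
public class MeasurementsSketch {
    // Average query execution time, from per-instance durations in ms.
    public static double averageMs(long[] durationsMs) {
        long total = 0;
        for (long d : durationsMs) {
            total += d;
        }
        return (double) total / durationsMs.length;
    }

    // Throughput: completed instance executions per second.
    public static double queriesPerSecond(int completedQueries, double elapsedSeconds) {
        return completedQueries / elapsedSeconds;
    }

    public static void main(String[] args) {
        long[] durations = {120, 80, 100};               // ms per completed instance
        System.out.println(averageMs(durations));        // 100.0
        System.out.println(queriesPerSecond(300, 60.0)); // 5.0
    }
}
```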

      best regards,
      Miguel Valdes

        • 1. Re: Performance and Validation
          tom.baeyens

          awesome!

          the examples that i'm working out now are ideal candidates for testing the pvm.

          we should group all persistent usages in the examples and see how we can address them from clif injectors.

          what do you need from me to get this going ?

          • 2. Re: Performance and Validation

            first, Pascal and Charles (Guillaume will join next week) will take a look at the latest PVM updates...

            We will also try to update our extensions to those changes and check whether any issues/bugs appear at that point (and let you know, for sure :-)

            After that, I suggest we define the main scenarios together (conf call), and then we will create the new Clif scenarios.

            regards,
            Miguel Valdes

            • 3. Re: Performance and Validation
              tom.baeyens

              some last refinements in the api-internal split are still pending.

              one that i still have to do is move the EnvironmentFactory to internal

              i'll see if i can work my way to a new PVM release by beginning of next week

              • 4. Re: Performance and Validation

                Ok, perfect.

                let me know when you are done.

                best regards,
                Miguel Valdes