
Chapter 5. Testing

Table of Contents

5.1. Overall goals
5.1.1. Specification
5.1.2. IronJacamar specific interfaces
5.1.3. IronJacamar specific implementation
5.2. Testing principle and style
5.2.1. Integration Tests
5.2.2. Unit Tests
5.3. Quality Assurance
5.3.1. Checkstyle
5.3.2. Findbugs
5.3.3. JaCoCo
5.3.4. Tattletale
5.4. Performance testing
5.4.1. JProfiler
5.4.2. OProfile
5.4.3. Performance test suite

The overall goal of our test environment is to execute tests that ensure full coverage of the JCA specification as well as of our implementation.

The full test suite is executed using

ant test
    

A single test case can be executed using

ant -Dmodule=embedded -Dtest=org.jboss.jca.embedded.unit.ShrinkWrapTestCase one-test
    

where -Dmodule specifies which module to execute the test case in. This parameter defaults to core. The -Dtest parameter specifies the test case itself.

You can also execute all test cases of a single module using

ant -Dmodule=embedded module-test
    

where -Dmodule specifies which module to execute the test cases in. This parameter defaults to core.

The build script does not fail in case of test errors or failures.

You can control this behavior with the junit.haltonerror and junit.haltonfailure properties in the main build.xml file. The default value for both is no.

You can of course change them statically in the build.xml file or temporarily on the command line, for example using -Djunit.haltonerror=yes. Other junit.* properties defined in the main build.xml can be overridden in the same way.
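For example, to make a full test run stop at the first error or failure:

ant -Djunit.haltonerror=yes -Djunit.haltonfailure=yes test
    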

Our tests follow the Behavior Driven Development (BDD) technique. In BDD you focus on specifying the behaviors of a class and write code (tests) that verify that behavior.

You may be thinking that BDD sounds awfully similar to Test Driven Development (TDD). In some ways they are similar: they both encourage writing the tests first and providing full coverage of the code. However, TDD doesn't really provide guidance on which kind of tests you should be writing.

BDD provides you with guidance on how to do testing by focusing on what the behavior of a class is supposed to be. We introduce BDD to our testing environment by extending the standard JUnit 4.x test framework with BDD capabilities using assertion and mocking frameworks.

The BDD tests should specify the expected behavior of a class using the given-when-then structure described below.

We are using two different kinds of tests:

  • Integration Tests: The goal of these test cases is to validate the whole deployment process and the interaction with a sub-system by simulating critical conditions.

  • Unit Tests: The goal of these test cases is to test internal behaviour in isolation, mocking collaborating classes to reproduce the exact conditions under test.

The integration tests simulate real conditions using deployment artifacts packaged as resource adapters.

The resource adapters are created using either the main build environment or ShrinkWrap, as sketched below. Using resource adapters within the test cases allows you to debug both the resource adapters themselves and the JCA container.
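A minimal sketch of building a resource adapter archive with ShrinkWrap; the archive names, the classes added and the ra.xml classpath resource are illustrative, not taken from an actual IronJacamar test:

import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.jboss.shrinkwrap.api.spec.ResourceAdapterArchive;

// Bundle the (hypothetical) resource adapter classes in a .jar inside the .rar
JavaArchive ja = ShrinkWrap.create(JavaArchive.class, "simple.jar");
ja.addClasses(SimpleResourceAdapter.class, SimpleManagedConnectionFactory.class);

ResourceAdapterArchive raa = ShrinkWrap.create(ResourceAdapterArchive.class, "simple.rar");
raa.addAsLibrary(ja);
// Add a deployment descriptor from the test classpath as META-INF/ra.xml
raa.addAsManifestResource("simple-ra.xml", "ra.xml");
      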

The resource adapters represent the [given] facts of our BDD tests, their deployment represents the [when] phase, and the [then] phase is verified by assertions.

Note that some tests treat an exception as the expected outcome, using the JUnit 4.x @Test(expected = SomeException.class) annotation to identify and verify this situation.
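A minimal sketch of this pattern; the scenario uses only JDK collections and is illustrative, not an actual IronJacamar test:

@Test(expected = UnsupportedOperationException.class)
public void unmodifiableListShouldRejectAddingADeployment() throws Throwable
{
   // given: an unmodifiable list of deployment names
   List<String> deployments = Collections.unmodifiableList(new ArrayList<String>());

   // when: the list is modified
   // then: JUnit verifies that the expected exception is thrown
   deployments.add("some.rar");
}
      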

We are mocking our input/output conditions in our unit tests using the Mockito framework to verify class and method behaviors.

An example:



@Test
public void printFailuresLogShouldReturnNotEmptyStringForWarning() throws Throwable
{
   //given
   RADeployer deployer = new RADeployer();
   File mockedDirectory = mock(File.class);
   given(mockedDirectory.exists()).willReturn(false);
   Failure failure = mock(Failure.class);
   given(failure.getSeverity()).willReturn(Severity.WARNING);
   List<Failure> failures = Arrays.asList(failure);
   FailureHelper fh = mock(FailureHelper.class);
   given(fh.asText((ResourceBundle) anyObject())).willReturn("myText");
  
   deployer.setArchiveValidationFailOnWarn(true);
  
   //when
   String returnValue = deployer.printFailuresLog(null, mock(Validator.class), 
                                                  failures, mockedDirectory, fh);
  
   //then
   assertThat(returnValue, is("myText"));
}
      

As you can see, the BDD style is reflected in the test method name and in the given-when-then sequence followed within the method.

In addition to the test suite, the IronJacamar project uses various tools to increase the stability of the project.

The following sections will describe each of these tools.

Performance testing can identify areas that need to be improved or completely replaced.

IronJacamar features a basic performance test suite that tests interaction with a transaction manager.

The test suite is executed by

ant perf-test
      

which will run the tests, and output its data into the generated JUnit output.

The setup of the performance test suite is controlled in the

org.jboss.jca.core.tx.perf.Performance
      

class, where the settings for the test runs can be altered.

A report can be generated using

org.jboss.jca.core.tx.perf.PerfReport
      

which takes three arguments: the output from the NoopTS run, the output from the Narayana/MEM run, and the output from the Narayana/FILE run.
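A sketch of such an invocation; the classpath and the file names are placeholders, not actual names from the build:

java -cp <ironjacamar-test-classpath> org.jboss.jca.core.tx.perf.PerfReport noopts.txt narayana-mem.txt narayana-file.txt
      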

The data is presented on the console, and a Gnuplot script is generated.

The plot can be generated using

gnuplot perf.plot
      

which will generate a perf.png file with the graphs.

Performance reports can be averaged using

org.jboss.jca.core.tx.perf.AvgReport
      

which takes the .dat files from PerfReport runs and generates a perf-avg.dat and a perf-avg.plot file.

There is integration with JProfiler through the

ant jprofiler
      

task. The JProfiler installation directory and the session id must be defined before the task is executed.

The Bash scripts perf-jprofiler.sh and perf-flightrecorder.sh, both located in core/src/test/resource, can be used as templates for command line based runs.