Chapter 5. Testing

5.1. Overall goals

The overall goal of our test environment is to execute tests that ensure full coverage of the JCA specification as well as of our implementation.

The full test suite is executed using

ant test
    

A single test case can be executed using

ant -Dmodule=embedded -Dtest=org.jboss.jca.embedded.unit.ShrinkWrapTestCase one-test
    

where -Dmodule specifies which module to execute the test case in. This parameter defaults to core. The -Dtest parameter specifies the test case itself.

You can also execute all test cases of a single module using

ant -Dmodule=embedded module-test
    

where -Dmodule specifies which module to execute the test cases in. This parameter defaults to core.

The build script does not fail in case of test errors or failures.

You can control this behavior by using the junit.haltonerror and junit.haltonfailure properties in the main build.xml file. The default value for both is no.

You can of course change them statically in the build.xml file, or temporarily by passing for example -Djunit.haltonerror=yes on the command line. There are other junit.* properties defined in the main build.xml that can be controlled in the same way.

5.1.1. Specification

The purpose of the specification tests is to test our implementation against the actual specification text.

Each test can only depend on:

  • The official Java Connector Architecture API (javax.resource)

  • Interfaces and classes in the test suite that extend/implement the official API

The test cases should be created in such a way that they are easily identified by chapter, section and paragraph. For example:

org.jboss.jca.core.spec.chapter10.section3
      
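As an illustration, a specification test case in that package could look like the following sketch. The class name, the requirement being checked and the assertion are hypothetical; the sketch only shows the package convention and the allowed dependencies, namely the official javax.resource API and JUnit:

package org.jboss.jca.core.spec.chapter10.section3;

import javax.resource.spi.work.Work;

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class WorkInterfaceTestCase
{
   @Test
   public void shouldExposeWorkAsRunnable() throws Throwable
   {
      // given: a Work implementation defined inside the test suite
      Work work = new Work()
      {
         public void run()
         {
         }

         public void release()
         {
         }
      };

      // then: the specification defines Work as an extension of java.lang.Runnable
      assertTrue(work instanceof Runnable);
   }
}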

5.1.2. JBoss specific interfaces

The purpose of the JBoss specific interfaces tests is to test our specific interfaces.

Each test can depend on:

  • The official Java Connector Architecture API (javax.resource)

  • The JBoss JCA specific APIs (org.jboss.jca.xxx.api)

  • Interfaces and classes in the test suite that extend/implement these APIs

The test cases live in a package whose name identifies the component under test. For example:

org.jboss.jca.core.workmanager
      

These test cases can either use the embedded JCA environment or be implemented as standard POJO-based JUnit test cases.
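
For test cases that use the embedded JCA environment, the typical shape is a create, startup, deploy, undeploy, shutdown lifecycle around the assertions. The following is only a sketch: it assumes the embedded container is driven through the org.jboss.jca.embedded.EmbeddedFactory/Embedded API and that a resource adapter archive named simple.rar is available on the test classpath; both are assumptions for illustration, not the definitive API usage.

import java.net.URL;

import org.jboss.jca.embedded.Embedded;
import org.jboss.jca.embedded.EmbeddedFactory;

import org.junit.Test;

public class EmbeddedLifecycleTestCase
{
   @Test
   public void shouldDeployAndUndeployResourceAdapter() throws Throwable
   {
      // Assumption: the embedded container is obtained from EmbeddedFactory
      // and driven through startup/deploy/undeploy/shutdown
      Embedded embedded = EmbeddedFactory.create();
      embedded.startup();

      // Assumption: the resource adapter archive built by the main build
      // environment is available on the test classpath as simple.rar
      URL deployment = getClass().getClassLoader().getResource("simple.rar");

      try
      {
         // when: deploying the archive
         embedded.deploy(deployment);

         // then: reaching this point without an exception is the expected outcome
      }
      finally
      {
         embedded.undeploy(deployment);
         embedded.shutdown();
      }
   }
}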

5.1.3. JBoss specific implementation

The purpose of the JBoss specific implementation tests is to test our specific implementation. These tests should cover all methods that are not exposed through the interfaces.

Each test can depend on:

  • The official Java Connector Architecture API (javax.resource)

  • The JBoss JCA specific APIs (org.jboss.jca.xxx.api)

  • The JBoss JCA specific implementation (org.jboss.jca.xxx.yyy)

  • Interfaces and classes in the test suite

The test cases live in a package whose name identifies the component under test. For example:

org.jboss.jca.core.workmanager
      

These test cases can either use the embedded JCA environment or be implemented as standard POJO-based JUnit test cases.

5.2. Testing principle and style

Our tests follow the Behavior Driven Development (BDD) technique. In BDD you focus on specifying the behaviors of a class and write code (tests) that verifies that behavior.

You may be thinking that BDD sounds awfully similar to Test Driven Development (TDD). In some ways they are similar: both encourage writing the tests first and providing full coverage of the code. However, TDD doesn't really provide guidance on which kinds of tests you should be writing.

BDD provides you with guidance on how to do testing by focusing on what the behavior of a class is supposed to be. We introduce BDD to our testing environment by extending the standard JUnit 4.x test framework with BDD capabilities using assertion and mocking frameworks.

The BDD tests should

  • Clearly define given-when-then conditions

  • Use method names that state the expected behavior, e.g. shouldReturnFalseIfMethodXIsCalledWithNullString() (see the sketch after this list)

  • Make the assertions easy to read by using Hamcrest matchers

  • Use given facts whenever possible to make the test case more readable. This could be the name of the deployed resource adapter, or using the BDDMockito class to mock the fact.
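
A minimal, self-contained sketch of this style, using a plain java.util.List as the class under test purely for illustration, could look like:

import static org.hamcrest.CoreMatchers.is;
import static org.junit.Assert.assertThat;

import java.util.ArrayList;
import java.util.List;

import org.junit.Test;

public class DeploymentListTestCase
{
   @Test
   public void shouldReturnFalseIfContainsIsCalledOnEmptyList() throws Throwable
   {
      // given: an empty list of deployment names
      List<String> deployments = new ArrayList<String>();

      // when
      boolean result = deployments.contains("example.rar");

      // then: the Hamcrest matcher keeps the assertion readable
      assertThat(result, is(false));
   }
}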

We are using two different kinds of tests:

  • Integration Tests: The goal of these test cases is to validate the whole process of deployment and interaction with a sub-system by simulating a critical condition.

  • Unit Tests: The goal of these test cases is to stress test some internal behaviour by mocking classes, reproducing exactly the conditions under test.

5.2.1. Integration Tests

The integration tests simulate real conditions using particular deployment artifacts packaged as resource adapters.

The resource adapters are created using either the main build environment or ShrinkWrap. Using resource adapters within the test cases allows you to debug both the resource adapters themselves and the JCA container.
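
A sketch of how such an archive could be assembled with ShrinkWrap is shown below. The archive names, the connector classes and the ra.xml classpath resource are placeholders for whatever the actual test case needs:

import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.jboss.shrinkwrap.api.spec.ResourceAdapterArchive;

// given: a resource adapter archive assembled programmatically for the test
ResourceAdapterArchive raa = ShrinkWrap.create(ResourceAdapterArchive.class, "simple.rar");

// Placeholder connector classes; replace with the classes of the resource
// adapter under test
JavaArchive ja = ShrinkWrap.create(JavaArchive.class, "simple.jar");
ja.addClasses(SimpleResourceAdapter.class, SimpleManagedConnectionFactory.class);

// Package the classes and the deployment descriptor into the .rar
raa.addAsLibrary(ja);
raa.addAsManifestResource("simple-ra.xml", "ra.xml");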

The resource adapters represent the [given] facts of our BDD tests, the deployment of the resource adapters represents the [when] phase, while the [then] phase is verified by assertions.

Note that some tests consider an exception a normal output condition, using the JUnit 4.x @Test(expected = SomeException.class) annotation to identify and verify this situation.
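
For example, using java.lang.Integer purely for illustration, such a test reads:

@Test(expected = NumberFormatException.class)
public void shouldThrowNumberFormatExceptionIfParseIntIsCalledWithNullString() throws Throwable
{
   // when: the exception itself is the expected outcome, so no assertion is needed
   Integer.parseInt(null);
}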

5.2.2. Unit Tests

We are mocking our input/output conditions in our unit tests using the Mockito framework to verify class and method behaviors.

An example:

@Test
public void printFailuresLogShouldReturnNotEmptyStringForWarning() throws Throwable
{
   //given
   RADeployer deployer = new RADeployer();
   File mockedDirectory = mock(File.class);
   given(mockedDirectory.exists()).willReturn(false);

   Failure failure = mock(Failure.class);
   given(failure.getSeverity()).willReturn(Severity.WARNING);

   List<Failure> failures = Arrays.asList(failure);
   FailureHelper fh = mock(FailureHelper.class);
   given(fh.asText((ResourceBundle) anyObject())).willReturn("myText");
  
   deployer.setArchiveValidationFailOnWarn(true);
  
   //when
   String returnValue = deployer.printFailuresLog(null, mock(Validator.class), 
                                                  failures, mockedDirectory, fh);
  
   //then
   assertThat(returnValue, is("myText"));
}
      

As you can see, the BDD style is reflected in the test method name and in following the given-when-then sequence in order.

5.3. Quality Assurance

In addition to the test suite, the JBoss JCA project uses various tools to increase the stability of the project.

The following sections will describe each of these tools.

5.3.1. Checkstyle

Checkstyle is a tool that verifies that the formatting of the source code in the project is consistent.

This allows for easier readability and a consistent feel of the project.

The goal is to have zero errors in the report. The checkstyle report is generated using

ant checkstyle
      

The report is generated into

reports/checkstyle
      

The home of checkstyle is located here: http://checkstyle.sourceforge.net/.

5.3.2. Findbugs

Findbugs is a tool that scans your project for bugs and provides reports based on its findings.

This tool helps lower the number of bugs in the JBoss JCA project.

The goal is to have zero errors in the report and as few exclusions in the filter as possible. The findbugs report is generated using

ant findbugs
      

The report is generated into

reports/findbugs
      

The home of findbugs is located here: http://findbugs.sourceforge.net/.

5.3.3. Cobertura

Cobertura generates a test suite matrix for your project which helps you identify where you need additional test coverage.

The reports that the tool provides help ensure that the JBoss JCA project has the correct test coverage.

The goal is to have as high code coverage as possible in all areas. The Cobertura report is generated using

ant cobertura
      

The report is generated into

reports/cobertura
      

The home of Cobertura is located here: http://cobertura.sourceforge.net/.

5.3.4. Tattletale

Tattletale generates reports about different quality metrics of the dependencies within the project.

The reports that the tool provides help ensure that the JBoss JCA project doesn't, for example, have cyclic dependencies within the project.

The goal is to have no issues flagged by the tool. The Tattletale reports are generated using

ant tattletale
      

The reports are generated into

reports/tattletale
      

The home of Tattletale is located here: http://www.jboss.org/tattletale.