Software Integration Guide

From David Vernon's Wiki
Revision as of 01:06, 1 September 2014

== Overview ==

The software integration guide describes the procedures for unit testing of individual components and submitting them for integration.


Each component submitted for integration will be checked against the list of quality assurance criteria described in Deliverable 3.3. If it passes all tests, it will then be moved to the DREAM release directory and subjected to a white-box system test. This test will suppress the port I/O in the relevant placeholder system component for all the ports used by the component being integrated.


A description of the system architecture, i.e. the placeholder system components and their associated ports, is included in the System Architecture Guide for reference. This is derived from Deliverable D3.1.

== Component Black-box Unit Testing ==

* Black-box testing is a testing strategy that checks the behaviour of a software unit — in this case a component — without being concerned with what is going on inside the unit; hence the term “black-box”. Typically, it does this by providing a representative sample of input data (both valid and invalid), by describing the expected results, and then by running the component against this test data to see whether or not the expected results are achieved.

'''Component software developers must provide a unit test with every component submitted for integration into the DREAM release.''' This test comprises two parts: an XML application and a test description.

=== The Component Test Application ===

* The application file should be named after the component but with the suffix TEST, for example <code>protoComponentTEST.xml</code>.
* The XML application should launch the component being tested and connect it through its ports to a data source and a data sink.
* The sources and sinks can be files or driver and stub components written specifically by the developer to provide the test data.
* Source and sink data files should be located in the component's <code>config</code> directory and, if they are used, driver and stub components should be located in the <code>src</code> directory so that they will be compiled and linked along with the component being tested.
* The test application file should be located in the <code>app</code> directory.
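Taken together, the points above suggest a test application of roughly the following shape. This is a minimal sketch only, assuming a yarpmanager-style application XML; the module names <code>protoComponentDriver</code> and <code>protoComponentStub</code> and the port names are hypothetical illustrations, not part of any actual release:

<pre>
<application>
  <name>protoComponentTEST</name>

  <!-- the component under test -->
  <module>
    <name>protoComponent</name>
    <node>localhost</node>
  </module>

  <!-- hypothetical driver (data source) and stub (data sink) -->
  <module>
    <name>protoComponentDriver</name>
    <node>localhost</node>
  </module>
  <module>
    <name>protoComponentStub</name>
    <node>localhost</node>
  </module>

  <!-- connect source -> component -> sink -->
  <connection>
    <from>/protoComponentDriver/data:o</from>
    <to>/protoComponent/data:i</to>
    <protocol>tcp</protocol>
  </connection>
  <connection>
    <from>/protoComponent/data:o</from>
    <to>/protoComponentStub/data:i</to>
    <protocol>tcp</protocol>
  </connection>
</application>
</pre>

Alternatively, if the test data is read from and written to files rather than driver and stub components, the corresponding modules and connections would simply be omitted.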

=== The Component Test Description ===

* Instructions on how to run the test should be included in a README.txt file, also located in the component's <code>app</code> directory.
* In general, these instructions should describe the nature of the test data and the expected results, and explain how these results validate the expected behaviour of the component. In particular, they should explain how the four Cs of communication, configuration, computation, and coordination are exercised by the test.
* Validation of communication and computation functionality will typically be established by describing the (sink) output data that will be produced from the (source) input data.
* Validation of configuration functionality will typically be established by stating what changes in behaviour should occur if the values for the component parameters in the component configuration (.ini) file are altered.
* Validation of coordination functionality will typically be established by stating what changes in behaviour should occur when commands are issued interactively by the user to the component using the port named after the component itself.
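As a concrete illustration, a test description covering the four Cs might read as follows. This is a hypothetical sketch for the fictitious <code>protoComponent</code>; the parameter name, command vocabulary, and data file names are invented for illustration and should be replaced by the component's real ones:

<pre>
protoComponentTEST -- test description

Running the test:
  launch protoComponentTEST.xml (e.g. with yarpmanager), start all
  modules, and connect all ports.

Communication and computation:
  the source supplies the sample input data in config/testInput.txt;
  the component should produce output on its output port matching
  config/expectedOutput.txt, line for line.

Configuration:
  doubling the (hypothetical) threshold parameter in protoComponent.ini
  should halve the number of events reported in the output.

Coordination:
  sending the (hypothetical) command "suspend" interactively to the
  /protoComponent port should stop all output; "resume" should restart it.
</pre>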

== System White-box Testing ==

White-box testing is a testing strategy that checks the behaviour of a software unit, or a collection of units in a (sub-)system, by exercising all the possible execution paths within that unit or system. Thus, white-box testing differs fundamentally from black-box testing in that it is explicitly concerned with what is going on inside the unit.


In DREAM we will perform white-box testing at the system level only. Components will not be subject to white-box testing when being integrated into the DREAM software release, although developers themselves are encouraged to use white-box testing before submitting a component for integration.


White-box testing will be effected by removing the driver and stub functions that simulate the output and input of data in the top-level system architecture, allowing that source and sink functionality to be provided instead by the component being integrated. This will establish whether or not the component in question adheres to the required data-flow protocol. Thus, this white-box test will establish the validity of the inter-component communication but not its functionality (which is tested using the black-box unit test).