Standards for Testing
DREAM software will be subject to a spectrum of test procedures, including black-box unit tests, white-box structural tests, regression tests, and acceptance tests. Passing a black-box test forms part of the criteria for integration into the DREAM release.
Black-box Unit Tests
Black-box testing is a testing strategy that checks the behaviour of a software unit — in this case a component — without being concerned with what is going on inside the unit; hence the term “black-box”. Typically, it does this by providing a representative sample of input data (both valid and invalid), by describing the expected results, and then by running the component against this test data to see whether or not the expected results are achieved.
Component software developers must provide a unit test with every component submitted for integration into the DREAM release. This test comprises two parts: an XML application and a test description.
- The XML application should launch the component being tested and connect it through its ports to a data source and a data sink (a sketch of such an application file is given after this list).
- The sources and sinks can be files or driver and stub components written specifically by the developer to provide the test data.
- The application file should be named after the component but with the suffix TEST, for example protoComponentTEST.xml.
- Source and sink data files should be located in the config directory and, if they are used, driver and stub components should be located in the src directory so that they will be compiled and linked along with the component being tested.
- The test application file should be located in the app directory.
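For illustration, the following is a minimal sketch of what such a test application file might look like. It assumes a yarpmanager-style XML application format; the module, parameter, and port names (protoComponent, sourceDriver, sinkStub, and the data/result ports) are hypothetical placeholders chosen for this example, not prescribed names.

<!-- app/protoComponentTEST.xml: hypothetical black-box test application.   -->
<!-- It launches the component under test together with a driver and a stub, -->
<!-- and wires the component's ports to that data source and data sink.      -->
<application>
  <name>protoComponentTEST</name>
  <module>
    <name>sourceDriver</name>    <!-- driver component supplying the test input data -->
    <node>localhost</node>
  </module>
  <module>
    <name>protoComponent</name>  <!-- the component being tested -->
    <parameters>--from protoComponent.ini</parameters>  <!-- hypothetical configuration file in config -->
    <node>localhost</node>
  </module>
  <module>
    <name>sinkStub</name>        <!-- stub component recording the output for inspection -->
    <node>localhost</node>
  </module>
  <connection>
    <from>/sourceDriver/data:o</from>
    <to>/protoComponent/data:i</to>
    <protocol>tcp</protocol>
  </connection>
  <connection>
    <from>/protoComponent/result:o</from>
    <to>/sinkStub/result:i</to>
    <protocol>tcp</protocol>
  </connection>
</application>

Running this application and comparing the data arriving at the sink with the expected results described in the README.txt constitutes the black-box test.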
Instructions on how to run the test should be included in a README.txt file, also located in the app directory.
In general, these instructions should describe the nature of the test data and the expected results, and they should explain how these results validate the expected behaviour of the component. In particular, these instructions should explain how the four Cs of communication, configuration, computation, and coordination are exercised by the test.
- Validation of communication and computation functionality will typically be established by describing the (sink) output data that will be produced from the (source) input data.
- Validation of configuration functionality will typically be established by stating what changes in behaviour should occur if the values for the component parameters in the component configuration (.ini) file are altered (an example configuration file is sketched after this list).
- Validation of coordination functionality will typically be established by stating what changes in behaviour should occur when commands are issued interactively by the user to the component using the port named after the component itself.
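As an illustration of the configuration point above, a component configuration file might expose parameters along the following lines. The file name, parameter names, and values here are hypothetical and serve only to show the kind of settings a tester would alter to observe a change in behaviour; the exact syntax depends on the configuration parser used by the component.

# config/protoComponent.ini: hypothetical configuration file for the component under test.
name        protoComponent
outputRate  10        # number of output messages per second written to the output port
threshold   0.75      # detection threshold applied to the input data
verbose     on        # print diagnostic messages while the test runs

The README.txt should state, for example, that halving outputRate should halve the rate at which data arrives at the sink, or that raising threshold should suppress detections in the supplied test data.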
White-box Structural Tests
White-box testing is a testing strategy that checks the behaviour of a software unit, or a collection of units in a (sub-)system, by exercising all the possible execution paths within that unit or system. Thus, white-box testing differs fundamentally from black-box testing in that it is explicitly concerned with what is going on inside the unit.
In DREAM we will perform white-box testing at the system level only. Components will not be subject to white-box testing when being integrated into the DREAM software release, although developers themselves are encouraged to use white-box testing before submitting the component for integration.
White-box testing will be effected by removing the driver and stub functions that simulate the output and input of data in the top-level system architecture, allowing that source and sink functionality to be provided instead by the component being integrated. This will establish whether or not the component in question adheres to the required data-flow protocol. Thus, this white-box test will establish the validity of the inter-component communication but not its functionality (which is tested using the black-box unit test).
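For illustration, the following fragment sketches what this substitution might look like, again assuming a yarpmanager-style top-level application file. The module and port names (behaviourStub, behaviourComponent, /userModel/state:i) are hypothetical placeholders, not actual DREAM modules.

<!-- Hypothetical fragment of the top-level system application file. -->
<!-- Before integration, a stub module supplies the data:            -->
<connection>
  <from>/behaviourStub/state:o</from>
  <to>/userModel/state:i</to>
  <protocol>tcp</protocol>
</connection>

<!-- For the white-box structural test, the stub connection is replaced so -->
<!-- that the newly integrated component provides the same output instead: -->
<connection>
  <from>/behaviourComponent/state:o</from>
  <to>/userModel/state:i</to>
  <protocol>tcp</protocol>
</connection>

If the component adheres to the required data-flow protocol, the rest of the system should run unchanged after the substitution.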
Regression Tests
Regression testing refers to the practice of re-running all integration tests, black-box and white-box, periodically to ensure that no unintentional changes have been introduced during the ongoing development of the DREAM software release. These tests check for backward compatibility, ensuring that what used to work in the past remains working. Regression tests will be carried out on all software in the DREAM release every two months.
Acceptance Tests
The DREAM software will be subject to periodic qualitative assessment by the DREAM psychotherapist practitioners. The specific goal of these tests will be to validate the behaviour and performance of the system against the user requirements set out in deliverables D1.1, D1.2, and D1.3. These tests will take place whenever a new version of the DREAM software release is made available to the practitioners.