From: Matthias Maier Date: Thu, 20 Aug 2015 20:28:50 +0000 (-0500) Subject: Documentation: Document user testsuite feature X-Git-Tag: v8.4.0-rc2~567^2~1 X-Git-Url: https://gitweb.dealii.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=50c7635220d8cb80cb76cd35ecbacf31d25f3609;p=dealii.git Documentation: Document user testsuite feature This adds a bunch of documentation on how to use the testsuite facilities in user projects. Closes #1249 In reference to #1246 --- diff --git a/cmake/macros/macro_deal_ii_pickup_tests.cmake b/cmake/macros/macro_deal_ii_pickup_tests.cmake index 2a86367355..5909566eaa 100644 --- a/cmake/macros/macro_deal_ii_pickup_tests.cmake +++ b/cmake/macros/macro_deal_ii_pickup_tests.cmake @@ -29,6 +29,10 @@ # TEST_LIBRARIES_RELEASE # - specifying additional libraries (and targets) to link against. # +# TEST_TARGET or +# TEST_TARGET_DEBUG and TEST_TARGET_RELEASE +# - specifying a test target to be executed for a parameter run. +# # TEST_TIME_LIMIT # - specifying the maximal wall clock time in seconds a test is # allowed to run diff --git a/doc/documentation.html b/doc/documentation.html index 96274862d8..b61fb7f41a 100644 --- a/doc/documentation.html +++ b/doc/documentation.html @@ -30,6 +30,7 @@ installation instructions
  • CMake documentation
  • CMake in user projects
  • +
  • Setting up a testsuite in user projects
  • Tutorial
  • Manual
  • Wolfgang's lectures
  • diff --git a/doc/users/testsuite.html b/doc/users/testsuite.html new file mode 100644 index 0000000000..544195ebed --- /dev/null +++ b/doc/users/testsuite.html @@ -0,0 +1,291 @@ + + + + + The deal.II Testsuite + + + + + + + + +

    Setting up a testsuite in user projects

    + +

    + This page provides details about how to set up a testsuite in a user + project similar to the one that is used to test deal.II itself. +

    + +
    +
      +
    1. Overview
    2. Examples
       1. Simple configuration
       2. Advanced configuration
    +
    + + +

    Overview

    + +

    + deal.II features an extensive testsuite to ensure + consistent, well-defined behavior of its building blocks during + development and for releases. But the larger a user program/project + becomes, the more important it is to also check user code for + continued correctness during development. This is mainly done via + unit and + regression testing. +

    + +

    + deal.II provides a mechanism to conveniently set up unit and + regression tests in a user project (very much like they are handled + in the library itself). At its heart, a test consists of a small executable + that is invoked and an output file against which the executable's output + is compared. The executable + that should be run can be defined in two different ways: either as a + source file in conjunction with a comparison file: +

    +my_test_1.cc
    +my_test_1.output
    + In this case my_test_1.cc contains a full executable (with + a main function) that produces some output. The screen output is then + compared against my_test_1.output. Alternatively, a + parameter file together with a comparison file can be provided: +
    +my_test_2.prm
    +my_test_2.output
    + In this case an already built executable (defined by a CMake + variable) is invoked with the path of my_test_2.prm as + its first argument. Again, its screen output is compared against + my_test_2.output. +
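    + As an illustration, a test of the first kind could look like the following minimal sketch. The grid computation and the printed text are purely hypothetical and not taken from an actual test; any translation unit with a main function that produces deterministic screen output will do:
    +// my_test_1.cc -- minimal sketch of a test of the first kind
    +#include <deal.II/grid/tria.h>
    +#include <deal.II/grid/grid_generator.h>
    +
    +#include <iostream>
    +
    +int main()
    +{
    +  // Set up a small, deterministic computation ...
    +  dealii::Triangulation<2> triangulation;
    +  dealii::GridGenerator::hyper_cube(triangulation);
    +  triangulation.refine_global(2);
    +
    +  // ... and print the result. Everything written to stdout/stderr is
    +  // compared against my_test_1.output by the test driver:
    +  std::cout << "Number of active cells: "
    +            << triangulation.n_active_cells() << std::endl;
    +}
    + The matching comparison file my_test_1.output would then simply contain the single line "Number of active cells: 16".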

    + + +

    Examples

    + +

    + This section presents two different examples of how to use the + testsuite facilities. Possible directory layouts together with the + necessary CMake configuration are discussed. +

    + + +

    Simple configuration

    + +

    + For the purpose of an example, let us pretend that step-1 could read + input files (defined on the command line) and do some computation + based on their contents. Then, we can set up tests for the expected + output for a given configuration file. + + This can be done by creating a subdirectory tests and + adding a test of the second type (i.e., parameter file and comparison + file). In detail, the directory and file layout is as follows: +

    +CMakeLists.txt
    +step-1.cc
    +tests/CMakeLists.txt
    +tests/my_test.output
    +tests/my_test.prm
    + In order to enable testing, the top-level CMakeLists.txt + file has to be augmented by a call to ENABLE_TESTING() + and a subsequent descent into the tests/ subdirectory via + ADD_SUBDIRECTORY(tests). For convenience, here is the + full top-level CMakeLists.txt file: +
    +SET(TARGET step-1)
    +SET(TARGET_SRC step-1.cc)
    +
    +CMAKE_MINIMUM_REQUIRED(VERSION 2.8.8)
    +FIND_PACKAGE(deal.II 8.3 REQUIRED
    +  HINTS ${deal.II_DIR} ${DEAL_II_DIR} ../ ../../ $ENV{DEAL_II_DIR}
    +  )
    +
    +DEAL_II_INITIALIZE_CACHED_VARIABLES()
    +PROJECT(${TARGET})
    +DEAL_II_INVOKE_AUTOPILOT()
    +
    +# Enable testing and descend into the tests/ subdirectory:
    +ENABLE_TESTING()
    +ADD_SUBDIRECTORY(tests)
    + The corresponding file tests/CMakeLists.txt contains + only two statements: +
    +SET(TEST_TARGET ${TARGET})
    +DEAL_II_PICKUP_TESTS()
    + The first statement sets the variable TEST_TARGET to the + executable that should be invoked (in our case the contents of the + variable TARGET). The second statement is a call to a + deal.II macro that will go through the directory contents and define + all test targets. +

    + +

    Because step-1 produces only two lines of output and + parses no parameters, we can set up a somewhat silly test by just + providing the comparison file and an empty parameter file: +

    +$ touch tests/my_test.prm
    +$ echo "Grid written to grid-1.eps" >  tests/my_test.output
    +$ echo "Grid written to grid-2.eps" >> tests/my_test.output
    + After that, reconfigure and call the test driver ctest: +
    +$ cmake .
    +[...]
    +$ ctest
    +Test project .../examples/step-1
    +    Start 1: tests/my_test.debug
    +1/1 Test #1: tests/my_test.debug ..............   Passed    1.72 sec
    +
    +100% tests passed, 0 tests failed out of 1
    +
    +Total Test time (real) =   1.72 sec
    +

    + +

    + Remark: The test driver will compare the combined output + stream of stdout and stderr against the comparison file. If the test + creates a file named output and writes to it, the comparison + file is compared against this output file instead; in this case + stdout and stderr are discarded. +
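    + For example, a (hypothetical) test could redirect its results into such a file as follows; the computed value is just a placeholder:
    +// Sketch of a test that writes into a file named "output". The test
    +// driver then compares this file against the comparison file and
    +// discards anything printed to stdout/stderr.
    +#include <fstream>
    +
    +int main()
    +{
    +  std::ofstream out("output");
    +  out << "some deterministic result: " << 42 << "\n";
    +}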

    + + +

    Advanced configuration

    + +

    + The above setup is too inflexible for larger projects that might consist + of individual libraries and an independent main program. Therefore, + as a second example a project is presented that consists of a support + library "support" and an executable "step". The task is to + provide unit tests for the library "support" and simple parameter-file + tests (i.e., tests of the second type) for "step". In detail: +

    +CMakeLists.txt
    +
    +src/CMakeLists.txt
    +src/step.cc
    +src/support.cc
    +
    +tests/step/CMakeLists.txt
    +tests/step/my_test_1.prm
    +tests/step/my_test_1.output
    +
    +tests/support/CMakeLists.txt
    +tests/support/my_test_2.cc
    +tests/support/my_test_2.output
    +

    + +

    + Again, we want to use the "autopilot" configuration for user projects + (see the cmake documentation for + details). The top-level CMakeLists.txt is now solely + responsible for finding deal.II, enabling testing, and descending into + subdirectories: +

    +# top-level CMakeLists.txt
    +
    +CMAKE_MINIMUM_REQUIRED(VERSION 2.8.8)
    +FIND_PACKAGE(deal.II 8.3 REQUIRED
    +  HINTS ${deal.II_DIR} ${DEAL_II_DIR} ../ ../../ $ENV{DEAL_II_DIR}
    +  )
    +
    +DEAL_II_INITIALIZE_CACHED_VARIABLES()
    +PROJECT(step)
    +ENABLE_TESTING()
    +
    +ADD_SUBDIRECTORY(src)
    +ADD_SUBDIRECTORY(tests/step)
    +ADD_SUBDIRECTORY(tests/support)
    + The library and executable are defined in + src/CMakeLists.txt: +
    +# src/CMakeLists.txt
    +
    +# set up shared library by hand:
    +ADD_LIBRARY(support SHARED support.cc)
    +DEAL_II_SETUP_TARGET(support)
    +
    +# set up executable with autopilot macro:
    +SET(TARGET "step")
    +SET(TARGET_SRC step.cc)
    +DEAL_II_INVOKE_AUTOPILOT()
    +TARGET_LINK_LIBRARIES(${TARGET} support)
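    + What the support library actually contains is, of course, project specific. As a purely hypothetical sketch (the function make_coarse_grid() is invented for this example and would normally also be declared in a header, not shown in the layout above), src/support.cc might provide a small helper:
    +// src/support.cc -- hypothetical contents of the "support" library
    +#include <deal.II/grid/tria.h>
    +#include <deal.II/grid/grid_generator.h>
    +
    +// Create a globally refined unit-square mesh (purely illustrative).
    +void make_coarse_grid(dealii::Triangulation<2> &triangulation,
    +                      const unsigned int n_refinements)
    +{
    +  dealii::GridGenerator::hyper_cube(triangulation);
    +  triangulation.refine_global(n_refinements);
    +}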
    + + Similarly to the first example, setting up tests for the executable + "step" is just a matter of defining a variable and a call to a macro: +
    +# tests/step/CMakeLists.txt
    +
    +SET(TEST_TARGET step)
    +DEAL_II_PICKUP_TESTS()
    + In contrast, the tests for the support library consist of a source file + that has a main function. The object file generated from this source + file will be linked against deal.II and every library listed in + TEST_LIBRARIES: +
    +# tests/support/CMakeLists.txt
    +
    +SET(TEST_LIBRARIES support)
    +DEAL_II_PICKUP_TESTS()
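    + Just like a test of the first type, my_test_2.cc is an ordinary translation unit with a main function. Building on the hypothetical make_coarse_grid() helper sketched above, it could look like this:
    +// tests/support/my_test_2.cc -- hypothetical unit test for the
    +// "support" library; it is linked against deal.II and against every
    +// library listed in TEST_LIBRARIES (here: support).
    +#include <deal.II/grid/tria.h>
    +
    +#include <iostream>
    +
    +// Declaration of the helper provided by the support library (would
    +// normally come from a header of that library):
    +void make_coarse_grid(dealii::Triangulation<2> &triangulation,
    +                      const unsigned int n_refinements);
    +
    +int main()
    +{
    +  dealii::Triangulation<2> triangulation;
    +  make_coarse_grid(triangulation, 2);
    +
    +  std::cout << "Number of active cells: "
    +            << triangulation.n_active_cells() << std::endl;
    +}
    + The corresponding comparison file tests/support/my_test_2.output would then contain the line "Number of active cells: 16".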
    + Again, reconfigure and run ctest: +
    +$ cmake .
    +$ ctest
    +Test project .../examples/step
    +    Start 1: step/my_test_1.debug
    +1/2 Test #1: step/my_test_1.debug .............   Passed    0.21 sec
    +    Start 2: support/my_test_2.debug
    +2/2 Test #2: support/my_test_2.debug ..........   Passed    0.22 sec
    +
    +100% tests passed, 0 tests failed out of 2
    +
    +Total Test time (real) =   0.43 sec
    +

    + +

    + Remark: For further information consult the testsuite documentation for + the library. With the sole exception of the testsuite setup (which + happens unconditionally in user testsuites), this documentation also + applies to user testsuites. +

    + +

    + Remark: The full list of configuration options for + DEAL_II_PICKUP_TESTS() is: +

    +
    +TEST_LIBRARIES
    +  - specifying additional libraries (and targets) to link against
    +TEST_LIBRARIES_DEBUG
    +  - additionally used for tests with debug configuration
    +TEST_LIBRARIES_RELEASE
    +  - additionally used for tests with release configuration
    +
    +TEST_TARGET
    +  - specifying a test target to be executed for a parameter run
    +TEST_TARGET_DEBUG and TEST_TARGET_RELEASE
    +  - used instead of TEST_TARGET for debug/release configuration
    +
    +NUMDIFF_EXECUTABLE, DIFF_EXECUTABLE
    +  - pointing to valid diff executables. If NUMDIFF_EXECUTABLE is not
    +    "numdiff" it will be ignored and DIFF_EXECUTABLE is used instead
    +
    +TEST_TIME_LIMIT
    +  - specifying the maximal wall clock time in seconds a test is allowed
    +    to run
    +
    +
    + + +