Introduction

The CGAL test suite is a way to test the compilation and execution of CGAL programs automatically (i.e. without user interaction) on a number of different platforms. A platform in this context refers to a combination of an operating system, a compiler, and a set of flags provided in the compilation command-line. Examples of the latter are '-g' for compiling with g++ in the so-called debug mode, and '-O2 -DNDEBUG' for compiling with g++ in the so-called release mode. Developers should, naturally, thoroughly test their code on their own development platform(s) before committing it. The test suite is not intended for initial testing of code, and it serves mainly as a way to test on additional platforms not available to the developer.

The test suite helps the developer(s) of a package to

  • detect compilation problems on the various platforms,
  • detect runtime problems, and
  • check the correctness of the algorithms in the package

Requirements:

  • Test your code thoroughly before committing it.
  • Obey the directory structure detailed in this subsection.
  • Check the test suite results for your package regularly.

Recommendations:

  • Cover the complete code of the package; every (member) function should be called at least once (see this section for a description of a tool you can use to test code coverage).
  • Use more than one instantiation of templated functions or classes.
  • A lot of classes in CGAL can be parameterized by traits classes, so that they are usable with different kernels. In such cases more than one kernel should be used for testing.
  • Recall that the CGAL checks, such as CGAL_precondition and CGAL_postcondition, are disabled by the CGAL_NDEBUG macro, which is set when the flag '-DNDEBUG' appears on the compilation command-line; see the CGAL Developer Manual. Thus, in the test-suite code itself you should use assert rather than CGAL_assertion if you want the check to persist when your test is run in a mode where CGAL_assertion is disabled (but assert is not). Naturally, you are encouraged to use pre- and postcondition checks wherever possible in the tested code.
  • The test directory should not contain a makefile unless it needs to do something very special to compile or link. If you find you want to do something very special in your makefile, think long and hard about whether it is really necessary. See also this section.
  • The test directory should not contain the script cgal_test_with_cmake. In the special case where it must, see this section.
  • Set the CMake option CGAL_DEV_MODE while developing CGAL.

Test suite directory

The test suite is located in the directory test/ of the internal releases (see also this wiki page on the creation of internal releases). The test/ directory is not part of external releases. It contains:

  • a script run_testsuite_with_cmake that is (not surprisingly) used to run the test suite.
  • a subdirectory for every package included in the internal release. These subdirectories are created from the test/ directories of the packages by copying the source, include, and input files from these directories and adding makefiles and cgal_test_with_cmake scripts where needed. See this subsection for more information about the proper structure of the test/ directory for a package.
  • a subdirectory with a name that ends in _Examples/ for every package that was submitted with an examples directory.
  • a subdirectory with a name that ends in _Demo/ for every package that was submitted with a demo directory.

The test suite will attempt to compile all the programs in the subdirectories of test/ and to run all except the demo programs (which usually require user interaction) by using the cgal_test_with_cmake scripts (Sections test subdirectory and create_cgal_test) and will save the results in files in the package subdirectories. Even if a program fails to compile or run, the test suite will continue.

Test suite input

Input to programs in the test suite can be supplied in three different ways:

Data files

The data files in the data/ directory: As described in Section test subdirectory, a package's test/ directory may contain a subdirectory data/ that contains input files for the test programs.

*.cin files

If a test program program.cpp requires input from standard input (i.e. cin), you should put a file called program.cin in the test directory. The test suite will then execute the program using the command ./program < program.cin
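
For illustration, assume a hypothetical test program test_reader.cpp that reads its input from std::cin; the layout and the resulting invocation would look like this:

# hypothetical layout in the package's test directory:
#   test_reader.cpp   reads its input from std::cin
#   test_reader.cin   contains the input data
# the test suite then effectively runs:
./test_reader < test_reader.cin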

*.cmd files

If a test program program.cpp requires command-line arguments (e.g. -f file.off), you should put a file called program.cmd in the test directory. For each line in program.cmd, the test suite will then execute the program with that full line as its arguments.
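
For illustration, a hypothetical test program test_mesher.cpp taking command-line arguments could come with a test_mesher.cmd containing two lines, which results in two test runs:

# hypothetical contents of test_mesher.cmd (one run per line):
#   -f data/input1.off -s 0.1
#   -f data/input2.off -s 0.05
# the test suite then effectively runs:
./test_mesher -f data/input1.off -s 0.1
./test_mesher -f data/input2.off -s 0.05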

Command-line arguments

Command-line arguments should be supplied in the cgal_test_with_cmake script: You are discouraged from using this option to give input values to your programs since it requires you to edit and submit a cgal_test_with_cmake script; see create_cgal_test. However, if a test program program.cpp absolutely requires command-line parameters, you should do the following. Use create_cgal_test_with_cmake to create the script cgal_test_with_cmake. This file contains an entry of the form

compile_and_run program

Just put the command-line parameters for your program at the end of this line:

compile_and_run program  arg1 arg2 ..

The test suite will then execute the program using the command

./program <arg1> <arg2> ...

Using CTest

CTest is a testing tool shipped with CMake. Compared to the testsuite infrastructure of CGAL (the shell script cgal_test), its major benefit is the ability to run tests in parallel. For the moment, our nightly test suites still use the old shell scripts, but in the future we plan to use CTest. CTest is already supported in CGAL, in the master branch.

Note that CMake version 3.4 is required to use CTest with CGAL.

You can use CTest to test a specific example/test directory or the whole thing. First, you need a CGAL build with examples and tests.

cd CGAL
mkdir build
cd build

Note: an existing build directory can be reused without erasing its content; even the current CMakeCache.txt can be kept.

Then run CMake with the options:

  • BUILD_TESTING set to ON, to enable CTest,
  • WITH_examples and WITH_tests set to ON, to enable the building of examples and tests in the same build directory.

With the command line tool cmake, you can run the following command:

cmake -DBUILD_TESTING=ON -DWITH_examples=ON -DWITH_tests=ON ..

but you can also set those options using the CMake GUI.

If you want to test a particular directory, do:

cd test/Kernel_23

# Note that it is a sub-directory of your build-directory, not the sources.

ctest -j4

There are test entries that build the executables and others that run the actual tests; dependencies ensure that the binaries are built before they are run.

Use with multiple build configurations (Release/Debug)

If your CMake build directory uses a multi-configuration generator, CTest needs to know which configuration you want to test. In that case, in all the following command-line examples, you need to add the -C option, like this:

ctest -j4 -C Release

Run Tests of all CGAL

If you want to test the whole of CGAL, just run ctest at the root of your build directory:

[~/CGAL/build/] > ctest -j4

Note: the -j4 option of ctest sets the maximal number of jobs to run in parallel. Set it according to your number of CPU cores.

Run only Specific Tests

With the -R Option

If you want to run only one test, or a selection of them, you can use the -R option of CTest:

ctest -R <pattern>

will run all tests matching the regular expression <pattern>.

The option -R <pattern> can be used several times in the same CTest run.
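
For example, the following command runs all test entries whose names contain voronoi, using four parallel jobs (the pattern is only illustrative):

ctest -R voronoi -j4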

With Labels and the -L Option

Alternatively, you can use labels. All CGAL tests have been assigned labels. All tests in a sub-directory have a label related to the name of the directory. For example, tests in test/Triangulation_2 have the label Triangulation_2_Tests, and tests in examples/Triangulation_2 have the label Triangulation_2_Examples.

With the -L option, you can run tests having a label matching a regular expression. For example:

ctest -L Mesh_2_Tests

will test all of test/Mesh_2/. Another example:

ctest -L Sweep

will run all examples and tests of the Sweep_line_2 package. A last example, with a more complex regular expression:

ctest -L 'Tr|Mesh_'

will run all tests from the following directories:

examples/Mesh_2
examples/Mesh_3
examples/RangeSegmentTrees
examples/Triangulation
examples/Triangulation_2
examples/Triangulation_3
test/Mesh_2
test/Mesh_3
test/RangeSegmentTrees
test/Triangulation
test/Triangulation_2
test/Triangulation_3

Verbose Output

If you pass the option -V (or --verbose), you will see the command lines used to run the tests, and the output of the tests. Otherwise, the output is only in the file Testing/Temporary/LastTest.log of your current build directory.

Dry-run of CTest

If you pass -N to the ctest command line, CTest performs a dry-run: it displays the list of all tests it would launch, instead of launching them. You can pass both options -N -V to also see the command lines CTest would use.

Example

Here is an example of a partial test of Mesh_3 (restricted to the tests and examples having features in their names), run at the root of the build directory $HOME/Git/cgal/build. The CTest options used in that run are:

  • -j3 to run three jobs in parallel,
  • -L Mesh_3 to run only tests with a label matching Mesh_3,
  • -R features to further restrict the list of test cases to those having features in their names.

That run was done with a pre-release of CGAL-4.12, without the header-only mode.

$ ctest -j3 -L Mesh_3 -R feature                                                                                      
Test project /home/lrineau/Git/cgal/build
      Start 2793: compilation_of__test_meshing_polyhedron_with_features
      Start 747: compilation_of__mesh_3D_image_with_features
      Start 725: compilation_of__mesh_cubes_intersection_with_features
 1/18 Test #747: compilation_of__mesh_3D_image_with_features ............................   Passed  344.91 sec
 2/18 Test #725: compilation_of__mesh_cubes_intersection_with_features ..................   Passed  344.91 sec
      Start 2805: compilation_of__test_mesh_polyhedral_domain_with_features_deprecated
      Start 737: compilation_of__mesh_polyhedral_domain_with_features
 3/18 Test #2793: compilation_of__test_meshing_polyhedron_with_features ..................   Passed  382.26 sec
      Start 2765: compilation_of__test_c3t3_with_features
 4/18 Test #2765: compilation_of__test_c3t3_with_features ................................   Passed    3.61 sec
      Start 2769: compilation_of__test_domain_with_polyline_features
 5/18 Test #2769: compilation_of__test_domain_with_polyline_features .....................   Passed    4.51 sec
      Start 2751: Mesh_3_Tests_SetupFixture
 6/18 Test #2751: Mesh_3_Tests_SetupFixture ..............................................   Passed    2.40 sec
      Start 713: Mesh_3_Examples_SetupFixture
 7/18 Test #713: Mesh_3_Examples_SetupFixture ...........................................   Passed    2.71 sec
      Start 2794: execution___of__test_meshing_polyhedron_with_features
 8/18 Test #2794: execution___of__test_meshing_polyhedron_with_features ..................   Passed   24.94 sec
      Start 748: execution___of__mesh_3D_image_with_features
 9/18 Test #748: execution___of__mesh_3D_image_with_features ............................   Passed    0.90 sec
      Start 726: execution___of__mesh_cubes_intersection_with_features
10/18 Test #726: execution___of__mesh_cubes_intersection_with_features ..................   Passed    0.30 sec
      Start 2766: execution___of__test_c3t3_with_features
11/18 Test #2766: execution___of__test_c3t3_with_features ................................   Passed    0.03 sec
      Start 2770: execution___of__test_domain_with_polyline_features
12/18 Test #2770: execution___of__test_domain_with_polyline_features .....................   Passed    0.30 sec
13/18 Test #2805: compilation_of__test_mesh_polyhedral_domain_with_features_deprecated ...   Passed  149.06 sec
14/18 Test #737: compilation_of__mesh_polyhedral_domain_with_features ...................   Passed  148.50 sec
      Start 738: execution___of__mesh_polyhedral_domain_with_features
      Start 2806: execution___of__test_mesh_polyhedral_domain_with_features_deprecated
15/18 Test #2806: execution___of__test_mesh_polyhedral_domain_with_features_deprecated ...   Passed    0.20 sec
      Start 2752: Mesh_3_Tests_CleanupFixture
16/18 Test #2752: Mesh_3_Tests_CleanupFixture ............................................   Passed    0.10 sec
17/18 Test #738: execution___of__mesh_polyhedral_domain_with_features ...................   Passed    1.42 sec
      Start 714: Mesh_3_Examples_CleanupFixture
18/18 Test #714: Mesh_3_Examples_CleanupFixture .........................................   Passed    0.01 sec

100% tests passed, 0 tests failed out of 18

Label Time Summary:
Mesh_3_Examples    = 843.66 sec (8 tests)
Mesh_3_Tests       = 567.41 sec (10 tests)

Total Test time (real) = 496.78 sec

The output of CTest is a list of Start/Passed lines, one pair per test (tests run in parallel), followed by a summary at the end.

  • The test entries compilation_of__* are the compilations of the test binaries.
  • The test entries execution___of__* are the actual runs of the tests.
  • The entries *_SetupFixture and *_CleanupFixture are special test entries that prepare and clean up the execution directory of the tests.

The following run shows that the execution directory of tests is /home/lrineau/Git/cgal/build/examples/Mesh_3/__exec_test_dir.

$ ctest -V -R Mesh_3_Examples_CleanupFixture
[...]
test 714
    Start 714: Mesh_3_Examples_CleanupFixture

714: Test command: /usr/bin/cmake "-E" "remove_directory" "/home/lrineau/Git/cgal/build/examples/Mesh_3/__exec_test_dir"
714: Test timeout computed to be: 1500
1/1 Test #714: Mesh_3_Examples_CleanupFixture ...   Passed    0.01 sec
[...]

Automatic dependencies ensure that a compilation entry is run before the corresponding execution entry, that the SetupFixture is run before the first test, and that the CleanupFixture is run after all the tests have completed.

Example of a Test Failure

This is an example of a CTest run with a test failure.

The CTest options used in that run are:

  • -j7 to run seven jobs in parallel,
  • -L 'Triangulation_2|Mesh_2' to run only tests whose labels match the regular expression Triangulation_2|Mesh_2, that is, the tests and examples of Triangulation_2 and Mesh_2,
  • --output-on-failure to see the output of the tests that fail (use -V instead to see all the output)
$ ctest --output-on-failure -L 'Mesh_2|Triangulation_2' -j7
Test project /home/lrineau/Git/cgal/build
[...]
 43/110 Test #1667: compilation_of__voronoi .......................................***Failed   16.09 sec
Built target CGAL
Scanning dependencies of target voronoi
Building CXX object examples/Triangulation_2/CMakeFiles/voronoi.dir/voronoi.cpp.o
/home/lrineau/Git/cgal/Triangulation_2/examples/Triangulation_2/voronoi.cpp: In function ‘int main()’:
/home/lrineau/Git/cgal/Triangulation_2/examples/Triangulation_2/voronoi.cpp:22:24: error: ‘Triangulation {aka class CGAL::Delaunay_triangulation_2<CGAL::Epick>}’ has no member named ‘edge_begin’; did you mean ‘edges_begin’?
   Edge_iterator eit =T.edge_begin();
                        ^~~~~~~~~~
                        edges_begin
gmake[3]: *** [examples/Triangulation_2/CMakeFiles/voronoi.dir/build.make:63: examples/Triangulation_2/CMakeFiles/voronoi.dir/voronoi.cpp.o] Error 1
gmake[2]: *** [CMakeFiles/Makefile2:29539: examples/Triangulation_2/CMakeFiles/voronoi.dir/all] Error 2
gmake[1]: *** [CMakeFiles/Makefile2:29546: examples/Triangulation_2/CMakeFiles/voronoi.dir/rule] Error 2
gmake: *** [Makefile:9925: voronoi] Error 2

MEM: 529812     TIME: 15.99     /usr/bin/cmake --build /home/lrineau/Git/cgal/build --target voronoi

[...]
        Start 1668: execution___of__voronoi
Failed test dependencies: compilation_of__voronoi
 86/110 Test #1668: execution___of__voronoi .......................................***Not Run   0.00 sec
[...]
        Start 3696: Triangulation_2_Tests_CleanupFixture
110/110 Test #3696: Triangulation_2_Tests_CleanupFixture ..........................   Passed    0.01 sec

98% tests passed, 2 tests failed out of 110

Label Time Summary:
Mesh_2_Examples             =  20.40 sec (12 tests)
Mesh_2_Tests                =  43.83 sec (24 tests)
Triangulation_2_Examples    =  84.17 sec (40 tests)
Triangulation_2_Tests       =  63.40 sec (34 tests)

Total Test time (real) =  32.53 sec

The following tests FAILED:
        1667 - compilation_of__voronoi (Failed)
        1668 - execution___of__voronoi (Not Run)
Errors while running CTest

We can see that:

  • the test entry compilation_of__voronoi has failed (because of a compilation error that was inserted on purpose for this tutorial), and that
  • the test entry execution___of__voronoi was not run, because it depends on compilation_of__voronoi.

Checking Headers

A GitHub action compiles separately each header that is referenced in the manual. This checks that each header includes everything it needs (all necessary #include directives). It additionally checks, for each package Pkg, whether Pkg/package_info/Pkg/dependencies corresponds to the packages Pkg depends on.

In order to perform the check locally you need g++ or clang, and you have to enable the CMake option CGAL_ENABLE_CHECK_HEADERS.

make help | grep check shows you all targets, one per package, plus the target check_headers, which checks the headers of all packages.
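
A minimal sketch of such a local run, assuming a fresh build directory (the exact cmake invocation may differ on your setup):

cd CGAL
mkdir -p build && cd build
cmake -DCGAL_ENABLE_CHECK_HEADERS=ON ..
make help | grep check      # lists check_headers plus one check target per package
make check_headers          # checks the headers of all packages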

How To Set Up an Automatic Nightly Testsuite

On Linux, most of our testsuite platforms run on Docker. See the section Using the CGAL Docker images below.

On non-Linux platforms, most of our testsuite platforms run with our legacy test script autotest_cgal.

Some of our platforms now use CTest directly to manage the nightly testsuite. See the section Setup a testsuite with ctest.

Internals of the Legacy Testsuite Process

The following sections describe the testsuite process that uses the CGAL-made shell scripts instead of CTest. They are kept here as an explanation of the internals of the testsuite process. If you want to test packages, or all CGAL, locally, you should use CTest instead.

Running the test suite

The test suite is run using the run_testsuite_with_cmake script that is distributed with every internal release in the test/ directory. There are several ways you can customize this script to meet your needs:

  • Add additional compiler and linker flags by setting the variables TESTSUITE_CXXFLAGS and TESTSUITE_LDFLAGS at the top of the script. These variables are prepended to CXX_FLAGS and LDFLAGS, respectively, in the test suite makefiles. So, for example, if you have a directory experimental/include/CGAL containing new or experimental CGAL files, you can set TESTSUITE_CXXFLAGS="-Iexperimental/include" and in this way test with your new files without overwriting the originals.
  • Export additional environment variables by adding lines to the run_testsuite_with_cmake script (a sketch of such edits follows this list). As an example, here is how to export LD_LIBRARY_PATH.
    • Add the line LD_LIBRARY_PATH=<your library path> to the script.
    • Append LD_LIBRARY_PATH to the line export PLATFORM CGAL_MAKEFILE TESTSUITE_CXXFLAGS TESTSUITE_LDFLAGS in the script. After this, the programs from the test suite will be run using the LD_LIBRARY_PATH that was specified in the first step.
  • Run the test suite on more than one platform by adding a line of the form run_testsuite_with_cmake <include makefile> at the bottom of the script for every platform that you wish to test. Just substitute for <include makefile> the appropriate include makefiles that were generated during installation (don't forget to use the full path name for the makefile). By default, the last line in the file is run_testsuite_with_cmake $CGAL_MAKEFILE, so you need not make any changes if you run the testsuite on only one platform and have set the CGAL_MAKEFILE environment variable properly.
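
For illustration, a sketch of the lines one might end up with at the top of run_testsuite_with_cmake; the paths are placeholders, and the value of TESTSUITE_LDFLAGS is purely hypothetical:

TESTSUITE_CXXFLAGS="-Iexperimental/include"
TESTSUITE_LDFLAGS="-L/path/to/extra/libs"
LD_LIBRARY_PATH=<your library path>
export PLATFORM CGAL_MAKEFILE TESTSUITE_CXXFLAGS TESTSUITE_LDFLAGS LD_LIBRARY_PATH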

After these steps you are ready to run the test suite. It can be run in two different ways:

./run_testsuite_with_cmake

The test suite will run the tests from all test directories. This may take a considerable amount of time.

./run_testsuite_with_cmake <dir1> <dir2> ...

The test suite will run only the test programs in the test directories <dir1> <dir2> ...

To run an entire CGAL test suite automatically, including downloading of an internal release, configuration, and installation of the library, you can use the autotest_cgal script described in Section autotest_cgal.

Files generated by the test suite

The testsuite will generate the following output files:

  • <testdir>/ErrorOutput_<platform> This file contains two lines for every program that was tested on platform <platform> in the test directory <testdir>. The first line tells whether the compilation was successful and the second line tells whether the execution was successful (i.e. the program returned the value 0); see Section test for more details.
  • <testdir>/ProgramOutput.<program>.<platform> This file contains the console output of the test program <program>.cpp run on platform <platform>.
  • <testdir>/CompilerOutput_<platform> This file contains the compiler output from platform <platform> for all programs.
  • error.txt This is just a concatenation of all the ErrorOutput files that were generated during the last run of the test suite.

Test suite results

The results of test suites run on the various supported or soon-to-be-supported platforms are posted on the test suite results page.

The results of the tests are presented in a table (y = success, w = warning, n = failure, r = a requirement is not found).

The r result is triggered by text found in the output of cmake or of a test program; the regular expression used is NOTICE: .*(need|require|incompatible).*and.*will not be
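
For illustration, one can check a saved program output against that expression with grep; the file name below follows the ProgramOutput naming scheme described above and is not part of the official tooling:

grep -E 'NOTICE: .*(need|require|incompatible).*and.*will not be' ProgramOutput.<program>.<platform>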

Running the testsuite locally on a single package

Before merging into integration it is good practice to run the testsuite of the package you worked on. Here is what you need (a consolidated command sketch follows the list):

  • A shell such as bash (on Windows, install Cygwin)
  • optional: add Scripts/scripts and Scripts/developer_scripts to your PATH environment variable. Alternatively, you can call the scripts mentioned below using their full or relative path
  • define the environment variable CGAL_DIR. It should be the directory where you built CGAL.
  • optional: define the environment variables for Boost, GMP, and any optional third party lib, e.g. Eigen.
  • On Windows: define the environment variable MAKE_CMD (put the line export MAKE_CMD=nmake in your $HOME/.bashrc for VC++)
  • On Windows: define the environment variable CMAKE_GENERATOR (put the line export CMAKE_GENERATOR='-GNMake Makefiles' in your $HOME/.bashrc for VC++)
  • Go into the directory you want to test
  • Run cgal_create_CMakeLists (or ../../../Scripts/scripts/cgal_create_CMakeLists) in case there is not yet a file CMakeLists.txt
  • Run create_cgal_test_with_cmake in case there is not yet a file cgal_test_with_cmake
  • Run cgal_test_with_cmake. This should run CMake, then compile and run the tests; you can see what happened in the generated file error.txt.
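
For orientation, a consolidated sketch of these steps on Linux; all paths are placeholders (adapt CGAL_DIR, the location of the Scripts package, and the test directory to your setup):

export CGAL_DIR=$HOME/CGAL/build                       # where you built CGAL (placeholder)
export PATH=$PATH:$HOME/cgal/Scripts/scripts:$HOME/cgal/Scripts/developer_scripts
cd $HOME/cgal/Triangulation_2/test/Triangulation_2     # the directory you want to test (example)
cgal_create_CMakeLists           # only if there is no CMakeLists.txt yet
create_cgal_test_with_cmake      # only if there is no cgal_test_with_cmake yet
sh cgal_test_with_cmake          # runs CMake, compiles and runs; check the generated error.txt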

Running the test suite on a branch

Note that this section will probably soon be updated to take into account running a testsuite on a branch-build branch.

We describe here how to test a full copy of master or integration (or the original) by first creating a flat release.

The creation of the flat release is done using the script create_internal_release located in the directory developer_scripts of the package Scripts. Running the script with no argument prints its complete usage; we only describe one way of using it here.

The prerequisite is to have a checkout of master or integration. In the example, it is located in ~/Git/cgal/.

First, go into the directory where the flat release will be created:

 > cd /tmp

Then the script create_internal_release is run:

 >  create_internal_release -r CGAL-I-FOO -a ~/Git/cgal/

The directory CGAL-I-FOO now contains the flat release of the branch in ~/Git/cgal/. Then you need to compile this flat release and set CGAL_DIR accordingly. Refer to the installation manual of CGAL to do this.
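
A minimal sketch of this compilation step, assuming default options and an out-of-source build directory (refer to the installation manual for the options relevant to your platform):

cd /tmp/CGAL-I-FOO
mkdir -p build && cd build
cmake .. && make
export CGAL_DIR=/tmp/CGAL-I-FOO/build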

To run the test-suite simply do:

> cd CGAL-I-FOO/test
> ./run_testsuite_with_cmake

and wait for the results to be written to the file error.txt, or follow the instructions given in Running the test suite.

Custom test script

In special cases, you may want to provide a custom cgal_test_with_cmake script to fit special needs. The script should rely on four environment variables:

  • $CGAL_MAKEFILE (an include makefile, given with its full path name!)
  • $PLATFORM (the extension of this makefile, that will be used as an extension to the output files)
  • $TESTSUITE_CXXFLAGS (additional compiler flags)
  • $TESTSUITE_LDFLAGS (additional linker flags)

The latter two flags must be passed to the makefile and they should precede all other compiler and linker flags. The script then performs all tests using the makefile $CGAL_MAKEFILE.

To indicate whether the tests are successful or not, the script writes two one-line messages (one for compilation, one for execution) to a file called error.txt for each target. If something goes wrong during compilation or execution, the corresponding message starts with the keyword ERROR:; otherwise the message indicates successful compilation or execution. Running the script cgal_test_with_cmake must not require any user interaction, and the script must clean up after itself by removing object files and executables (usually with the command make clean).
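
For orientation, here is a minimal, hypothetical sketch of such a script. It only illustrates the expected protocol; the target name my_test_program, the make invocation, and the exact wording of the messages are assumptions and do not reproduce what create_cgal_test_with_cmake generates.

#!/bin/sh
# Hypothetical sketch of a custom cgal_test_with_cmake; real generated scripts
# are more elaborate, and the make invocation and message wording below are
# illustrative only.
ERRORFILE=error.txt

compile_and_run()
{
  # TESTSUITE_CXXFLAGS / TESTSUITE_LDFLAGS must precede all other flags
  if make CGAL_MAKEFILE="$CGAL_MAKEFILE" \
          CXXFLAGS="$TESTSUITE_CXXFLAGS" LDFLAGS="$TESTSUITE_LDFLAGS" "$1" \
          >> CompilerOutput_$PLATFORM 2>&1
  then
    echo "   compilation of $1 succeeded" >> $ERRORFILE
  else
    echo "ERROR: compilation of $1" >> $ERRORFILE
    return
  fi
  if ./"$1" > ProgramOutput.$1.$PLATFORM 2>&1
  then
    echo "   execution of $1 succeeded" >> $ERRORFILE
  else
    echo "ERROR: execution of $1" >> $ERRORFILE
  fi
}

compile_and_run my_test_program    # hypothetical target name
make clean                         # the script must clean up after itself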

Automated testing

In addition to local testing, the compilation and execution of CGAL programs are also tested automatically and independently:

  • when a pull request is opened, using Github Actions;
  • every day, using nightly builds.

Github Actions Continuous Integration

GitHub Actions is a quick way to check the sanity of a pull request: some basic cosmetic checks (trailing whitespace, UTF-8, ...) are run and the Polyhedron demo is compiled.

Nightly test suite

Every day at 9 pm, Paris local time (CET in winter, CEST in summer), a new internal release is created. The branch that is tested by the internal release is not always the same; see also this wiki page for more information.

The results of the test suites are available on the test suite results page. Differences between two test suites can be obtained using this page.

Using the code coverage tool gcov

The tool gcov can be used together with the GNU C++ compiler to test for code coverage in your programs and may be helpful when you create your CGAL test suite programs. You can find a complete guide to this tool in the GNU on-line documentation. If you want to use gcov, you have to compile your programs with the option --coverage; this generates a file called your_program.gcno. Then you run the program, which generates a file your_program.gcda. Finally, you can run gcov your_program.cpp. This will generate a number of files with the ending .gcov which contain annotated source code; view them in a text editor. Here is a simple example:

       #include<iostream>

       using namespace std;

       void fu(int val)
       {
        int w,v=0;
        if (val==0) {
         cout << "val == 0!\n";
         for(w=0;w<100;w++) v=v+w;
        }
        else {
         cout << "val != 0!\n";
         for(w=0;w<10;w++) v=v+w;  
        }
     
        cout << "v:" << v << "\n";
       }

       int main()
       {
         fu(0);
         return 0;
       }

First you have to compile the example program test.cpp with the --coverage option. Then you have to execute it, and, after this, gcov can be used.

g++ --coverage -o test test.cpp
./test
gcov test.cpp

gcov will create a file test.cpp.gcov containing the following output (or very similar):

#include<iostream>

using namespace std;

void fu(int val)
1    {
1     int w,v=0;
1     if (val==0) {
1      cout << "val == 0!\n";
1      for(w=0;w<100;w++) v=v+w;
 }
######     else {
######      cout << "val != 0!\n";
######      for(w=0;w<10;w++) v=v+w;  
}
 
1     cout << "v:" << v << "\n";
}

int main()
1    {
1      fu(0);
1      return 0;
}

The lines that were not executed are marked with ######, so you can see what should be added to the (CGAL) test suite programs. When doing multiple runs of your program, keep in mind that the execution counts per line accumulate in the .gcda file, so delete it if you want to reset them.

CMake

When using CMake, to pass the --coverage flag both to the compiler and to the linker, use a command like this:

cmake -DCGAL_DONT_OVERRIDE_CMAKE_FLAGS=TRUE -DCMAKE_CXX_FLAGS="--coverage" -DCMAKE_EXE_LINKER_FLAGS="--coverage" <path-to-source>

Then make and run your program/test. The .gcda and .gcno files are created within the directory CMakeFiles/<executable>.dir/ relative to your project path, which is why you have to invoke gcov like this:

gcov <executable>.cpp -o CMakeFiles/<executable>.dir/<executable>.cpp.gcno

Within CGAL, it is often useful to pass the additional -p option to gcov to preserve the source files' full paths.
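
Putting the pieces together, a sketch for a single executable; my_test is a placeholder name, and the exact paths depend on your generator and project layout:

cd build
cmake -DCGAL_DONT_OVERRIDE_CMAKE_FLAGS=TRUE -DCMAKE_CXX_FLAGS="--coverage" -DCMAKE_EXE_LINKER_FLAGS="--coverage" ..
make my_test
./my_test
gcov -p my_test.cpp -o CMakeFiles/my_test.dir/my_test.cpp.gcno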

Using the CGAL Docker images

Many of the platforms used to run the daily test suite are now provided as Docker images. The CGAL project provides many images in which all the CGAL dependencies are installed (see https://hub.docker.com/r/cgal/testsuite-docker/tags/ for the list of images).

To simplify the resolution of problems on a given platform, it can be useful to pull the corresponding Docker image and to compile and debug a CGAL program inside it.

For that, you first need to install a Docker daemon (see https://www.docker.com/) and be able to run Docker images.

Suppose the CGAL root directory you want to test is ${HOME}/mycgal/ and you want to compile it with the fedora-strict-ansi Docker image. This can be done with the following command in a terminal:

docker run --rm -i -t -v ${HOME}/mycgal/:/mnt/CGAL/ docker.io/cgal/testsuite-docker:fedora-strict-ansi /bin/bash

This downloads the given Docker image if needed, mounts the ${HOME}/mycgal/ directory as /mnt/CGAL, and runs an interactive bash shell.

Inside the container, you can then navigate in this directory and compile and run the various CGAL programs as on your local system.
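
A sketch of a possible session inside the container (the CMake options shown are just the ones discussed earlier on this page; adapt them to what you want to test):

cd /mnt/CGAL
mkdir -p build && cd build
cmake -DBUILD_TESTING=ON -DWITH_examples=ON -DWITH_tests=ON ..
make
ctest -j4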

When finished, press Ctrl+D to leave the interactive shell and return to your local system.
