ctest_test
Perform the CTest Test Step as a Dashboard Client.
ctest_test([BUILD <build-dir>] [APPEND]
[START <start-number>]
[END <end-number>]
[STRIDE <stride-number>]
[EXCLUDE <exclude-regex>]
[INCLUDE <include-regex>]
[EXCLUDE_LABEL <label-exclude-regex>]
[INCLUDE_LABEL <label-include-regex>]
[EXCLUDE_FROM_FILE <filename>]
[INCLUDE_FROM_FILE <filename>]
[EXCLUDE_FIXTURE <regex>]
[EXCLUDE_FIXTURE_SETUP <regex>]
[EXCLUDE_FIXTURE_CLEANUP <regex>]
[PARALLEL_LEVEL [<level>]]
[RESOURCE_SPEC_FILE <file>]
[TEST_LOAD <threshold>]
[SCHEDULE_RANDOM <ON|OFF>]
[STOP_ON_FAILURE]
[STOP_TIME <time-of-day>]
[RETURN_VALUE <result-var>]
[CAPTURE_CMAKE_ERROR <result-var>]
[REPEAT <mode>:<n>]
[OUTPUT_JUNIT <file>]
[QUIET]
)
Run tests in the project build tree and store results in Test.xml for submission with the ctest_submit() command.
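For context, a minimal dashboard client script might look like the following. This is a sketch that assumes the build tree has already been configured and built; the paths and names are hypothetical placeholders:

# Minimal dashboard client script; run with: ctest -S my_dashboard.cmake
set(CTEST_SITE "my.machine.example")           # hypothetical site name
set(CTEST_BUILD_NAME "linux-gcc-example")      # hypothetical build name
set(CTEST_SOURCE_DIRECTORY "/path/to/source")  # hypothetical source directory
set(CTEST_BINARY_DIRECTORY "/path/to/build")   # hypothetical build directory
ctest_start(Experimental)
ctest_test()    # run the tests and write Test.xml
ctest_submit()  # upload the results to the dashboard server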
The options are:
BUILD <build-dir>
Specify the top-level build directory. If not given, the CTEST_BINARY_DIRECTORY variable is used.
APPEND
Mark Test.xml for append to results previously submitted to a dashboard server since the last ctest_start() call. Append semantics are defined by the dashboard server in use. This does not cause results to be appended to a .xml file produced by a previous call to this command.
START <start-number>
Specify the beginning of a range of test numbers.
END <end-number>
Specify the end of a range of test numbers.
STRIDE <stride-number>
Specify the stride by which to step across a range of test numbers.
EXCLUDE <exclude-regex>
Specify a regular expression matching test names to exclude.
INCLUDE <include-regex>
Specify a regular expression matching test names to include. Tests not matching this expression are excluded.
EXCLUDE_LABEL <label-exclude-regex>
Specify a regular expression matching test labels to exclude.
INCLUDE_LABEL <label-include-regex>
Specify a regular expression matching test labels to include. Tests not matching this expression are excluded.
EXCLUDE_FROM_FILE <filename>
Added in version 3.29.
Do NOT run tests listed with their exact name in the given file.
INCLUDE_FROM_FILE <filename>
Added in version 3.29.
Only run the tests listed with their exact name in the given file.
EXCLUDE_FIXTURE <regex>
Added in version 3.7.
If a test in the set of tests to be executed requires a particular fixture, that fixture's setup and cleanup tests would normally be added to the test set automatically. This option prevents adding setup or cleanup tests for fixtures matching the <regex>. Note that all other fixture behavior is retained, including test dependencies and skipping tests that have fixture setup tests that fail.
EXCLUDE_FIXTURE_SETUP <regex>
Added in version 3.7.
Same as EXCLUDE_FIXTURE except only matching setup tests are excluded.
EXCLUDE_FIXTURE_CLEANUP <regex>
Added in version 3.7.
Same as EXCLUDE_FIXTURE except only matching cleanup tests are excluded.
PARALLEL_LEVEL [<level>]
Run tests in parallel, limited to a given level of parallelism.
Added in version 3.29: The <level> may be omitted, or 0, to let ctest use a default level of parallelism, or unbounded parallelism, respectively, as documented by the ctest --parallel option.
RESOURCE_SPEC_FILE <file>
Added in version 3.16.
Specify a resource specification file. See Resource Allocation for more information.
TEST_LOAD <threshold>
Added in version 3.4.
While running tests in parallel, try not to start tests when they may cause the CPU load to pass above a given threshold. If not specified, the CTEST_TEST_LOAD variable will be checked, and then the --test-load command-line argument to ctest(1). See also the TestLoad setting in the CTest Test Step.
REPEAT <mode>:<n>
Added in version 3.17.
Run tests repeatedly based on the given <mode> up to <n> times. The modes are:
UNTIL_FAIL
Require each test to run <n> times without failing in order to pass. This is useful in finding sporadic failures in test cases.
UNTIL_PASS
Allow each test to run up to <n> times in order to pass. Repeats tests if they fail for any reason. This is useful in tolerating sporadic failures in test cases.
AFTER_TIMEOUT
Allow each test to run up to <n> times in order to pass. Repeats tests only if they time out. This is useful in tolerating sporadic timeouts in test cases on busy machines.
SCHEDULE_RANDOM <ON|OFF>
Launch tests in a random order. This may be useful for detecting implicit test dependencies.
STOP_ON_FAILURE
Added in version 3.18.
Stop the execution of the tests once one has failed.
STOP_TIME <time-of-day>
Specify a time of day at which the tests should all stop running.
RETURN_VALUE <result-var>
Store in the <result-var> variable 0 if all tests passed. Store non-zero if anything went wrong; see the example after this list.
CAPTURE_CMAKE_ERROR <result-var>
Added in version 3.7.
Store in the <result-var> variable -1 if there are any errors running the command and prevent ctest from returning non-zero if an error occurs.
OUTPUT_JUNIT <file>
Added in version 3.21.
Write test results to <file> in JUnit XML format. If <file> is a relative path, it will be placed in the build directory. If <file> already exists, it will be overwritten. Note that the resulting JUnit XML file is not uploaded to CDash because it would be redundant with CTest's Test.xml file.
QUIET
Added in version 3.3.
Suppress any CTest-specific non-error messages that would have otherwise been printed to the console. Output from the underlying test command is not affected. Summary info detailing the percentage of passing tests is also unaffected by the QUIET option.
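As an illustration of the options above, a dashboard script might filter, parallelize, and retry the test run and then inspect the outcome. This is a sketch; the regular expression, label, parallel level, and file name are arbitrary examples, not part of any project:

ctest_test(
  INCLUDE "^unit_"            # run only tests whose names start with "unit_"
  EXCLUDE_LABEL "nightly"     # skip tests with a label matching "nightly"
  PARALLEL_LEVEL 4            # run up to four tests at once
  REPEAT UNTIL_PASS:3         # retry failing tests up to three times
  OUTPUT_JUNIT "results.xml"  # relative path, so written into the build directory
  RETURN_VALUE test_result
  CAPTURE_CMAKE_ERROR cmake_error
)
if(cmake_error EQUAL -1)
  message(WARNING "ctest_test() encountered an internal error")
elseif(NOT test_result EQUAL 0)
  message(WARNING "some tests failed")
endif()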
See also the CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE, CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE, and CTEST_CUSTOM_TEST_OUTPUT_TRUNCATION variables, along with their corresponding ctest(1) command-line options --test-output-size-passed, --test-output-size-failed, and --test-output-truncation.
Additional Test Measurements
CTest can parse the output of your tests for extra measurements to report to CDash.
When run as a Dashboard Client, CTest will include these custom measurements in the Test.xml file that gets uploaded to CDash.
Check the CDash test measurement documentation for more information on the types of test measurements that CDash recognizes.
The following example demonstrates how to output a variety of custom test measurements.
std::cout <<
"<CTestMeasurement type=\"numeric/double\" name=\"score\">28.3</CTestMeasurement>"
<< std::endl;
std::cout <<
"<CTestMeasurement type=\"text/string\" name=\"color\">red</CTestMeasurement>"
<< std::endl;
std::cout <<
"<CTestMeasurement type=\"text/link\" name=\"CMake URL\">https://cmake.org</CTestMeasurement>"
<< std::endl;
std::cout <<
"<CTestMeasurement type=\"text/preformatted\" name=\"Console Output\">" <<
"line 1.\n" <<
" \033[31;1m line 2. Bold red, and indented!\033[0;0ml\n" <<
"line 3. Not bold or indented...\n" <<
"</CTestMeasurement>" << std::endl;
Image Measurements
The following example demonstrates how to upload test images to CDash.
std::cout <<
"<CTestMeasurementFile type=\"image/jpg\" name=\"TestImage\">" <<
"/dir/to/test_img.jpg</CTestMeasurementFile>" << std::endl;
std::cout <<
"<CTestMeasurementFile type=\"image/gif\" name=\"ValidImage\">" <<
"/dir/to/valid_img.gif</CTestMeasurementFile>" << std::endl;
std::cout <<
"<CTestMeasurementFile type=\"image/png\" name=\"AlgoResult\">" <<
"/dir/to/img.png</CTestMeasurementFile>"
<< std::endl;
Images will be displayed together in an interactive comparison mode on CDash if they are provided with two or more of the following names.
TestImage
ValidImage
BaselineImage
DifferenceImage2
By convention, TestImage is the image generated by your test, and ValidImage (or BaselineImage) is the basis of comparison used to determine if the test passed or failed.
If another image name is used, it will be displayed by CDash as a static image separate from the interactive comparison UI.
Attached Files
Added in version 3.21.
The following example demonstrates how to upload non-image files to CDash.
std::cout <<
"<CTestMeasurementFile type=\"file\" name=\"TestInputData1\">" <<
"/dir/to/data1.csv</CTestMeasurementFile>\n" <<
"<CTestMeasurementFile type=\"file\" name=\"TestInputData2\">" <<
"/dir/to/data2.csv</CTestMeasurementFile>" << std::endl;
If the name of the file to upload is known at configure time, you can use the ATTACHED_FILES or ATTACHED_FILES_ON_FAIL test properties instead.
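For example, a project's CMakeLists.txt might attach a data file to a test at configure time. This is a sketch; the test name and file path are hypothetical:

add_test(NAME data_test COMMAND data_test_exe)   # hypothetical test
set_tests_properties(data_test PROPERTIES
  ATTACHED_FILES "${CMAKE_CURRENT_SOURCE_DIR}/data1.csv")  # hypothetical file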
Custom Details
Added in version 3.21.
The following example demonstrates how to specify a custom value for the
Test Details
field displayed on CDash.
std::cout <<
"<CTestDetails>My Custom Details Value</CTestDetails>" << std::endl;
Additional Labels
Added in version 3.22.
The following example demonstrates how to add additional labels to a test at runtime.
std::cout <<
"<CTestLabel>Custom Label 1</CTestLabel>\n" <<
"<CTestLabel>Custom Label 2</CTestLabel>" << std::endl;
Use the LABELS test property instead for labels that can be determined at configure time.
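For example, labels known at configure time can be set in CMakeLists.txt. This is a sketch; the test name and labels are hypothetical:

add_test(NAME my_test COMMAND my_test_exe)   # hypothetical test
set_tests_properties(my_test PROPERTIES LABELS "Unit;Fast")  # hypothetical labels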