
NAME

Slay::Makefile::Gress - Use Slay::Makefile for software regression testing

DESCRIPTION

This module provides support for running a set of regression tests from a .t file by doing builds with a Slay::Makefile file.

USAGE

To use this module, the .t file calling it should contain something like:

  use Slay::Makefile::Gress qw(do_tests);
  do_tests("SlayMakefile", @ARGV);

OVERVIEW

The basic functionality of this module is to copy an initialization subdirectory (by default ending in ".init") to a run directory (by default ending in ".dir"), chdir into the run directory, and then parse a Slay::Makefile to define and run the tests. A reasonable way to accomplish this is to have the Slay::Makefile create a file with the extension ".ok" that is empty if the test passes; frequently this file can be the output of a diff between the test's output and an expect file. The Slay::Makefile can be in a parent directory so that it can be shared by more than one suite of tests (.t file). A shared Slay::Makefile can use the include mechanism to bring in a local Slay::Makefile file in the run directory. You can even use this methodology for developing families of test suites, and so on.
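
As a concrete illustration, the effect of such a rule can be sketched in plain Perl (this is not SlayMakefile syntax, and the program and file names are hypothetical): run the code under test, diff its output against the expect file, and leave the diff in the .ok file, so that an empty .ok file means the test passed.

  # Plain-Perl sketch of what a typical test rule accomplishes; the real
  # rule lives in the SlayMakefile, and these names are made up.
  system("./run_algebra > algebra.out 2>&1");           # produce the output
  system("diff algebra.exp algebra.out > algebra.ok");  # empty diff => pass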

ROUTINES

do_tests($makefile[, @tests] [, \%options ])

Runs a series of tests using $makefile as the Slay::Makefile input. If @tests is specified, it contains the list of tests, each of which is a target that is built in order; otherwise the dependencies of the test target are built as the list of tests. The following options are recognized (an example invocation follows the list):

init

The extension for the initialization directory. Default is '.init'.

opts

A hash reference to be passed to Slay::Makefile::new as its options list.

pretest

The name of the target to be built prior to running tests to set everything up. Default is 'pretest'.

run

The extension for the run directory. Default is '.run'.

skip

The name of a perl script to run to check whether all tests should be skipped. The name is also used as an extension for a test's base name to see whether an individual test should be skipped. Default is 'skip.pl'.

test

The name of the target whose dependencies gives the list of tests. Default is 'test'.
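
For example (a sketch only: the target names and the '.tmp' extension are made up for illustration, and the option keys are the ones documented above), a .t file that runs just two targets from a shared SlayMakefile one directory up from the run directory, with a non-default run-directory extension, might contain:

  use Slay::Makefile::Gress qw(do_tests);

  # Build only the two named targets; recall that the SlayMakefile path is
  # resolved from inside the run directory (see PROCESSING, step 4).
  do_tests('../SlayMakefile',
           'algebra.ok', 'calculus.ok',
           { run  => '.tmp',
             opts => {} });   # opts is handed to Slay::Makefile::new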

PROCESSING

Processing proceeds by the following steps:

  1. Search for an initialization directory with the same base name as the .t file invoking do_tests and an extension equal to the init option. Croaks if there is no such directory. For example, if the file invoking do_tests is cmdline.t and the default initialization extension is used, it looks for the directory cmdline.init.

  2. Copy the initialization directory to a run directory and cd into that directory.

  3. Check for a script with the name of the skip option. If it exists, execute it. If it returns a non-zero exit code, skip all the tests. The text this script prints becomes the reason for skipping the tests (see the skip-script sketch after this list).

  4. Use Slay::Makefile to parse the $makefile file. Note that the working directory is the run directory when this file is processed.

  5. Do a Slay::Makefile::make of the pretest target, if it exists. The name of the pretest target is 'pretest' unless specified in the options.

  6. If @tests is empty, create a list of tests to execute by getting the dependencies of the test target. The name of the test target is 'test' unless specified in the options.

  7. For each test t,

    a. Check for a script with the same base name as t and an extension consisting of '.' followed by the value of the skip option. For example, if the default value of the skip option is used, a test algebra.ok would use a script called algebra.skip.pl. If the script exists, execute it and skip the test if it returns a non-zero exit code. The text this script prints becomes the reason for skipping the test.

    b. Run Slay::Makefile::make for target t.

    c. If no file t was generated, report the test as failing to build the file. If t was generated, it should be empty for a passing test; any text in the file is returned as the reason for the test's failure.
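
To illustrate the skip mechanism of steps 3 and 7a, here is a minimal sketch of a skip script; the module it checks for is hypothetical, and the same shape serves both for the suite-wide script named after the skip option and for a per-test script:

  #!/usr/bin/perl
  # Skip-script sketch: a non-zero exit code means "skip", and whatever
  # is printed becomes the reason reported for skipping.
  use strict;
  use warnings;

  unless (eval { require Some::Optional::Module; 1 }) {
      print "Some::Optional::Module is not installed\n";
      exit 1;    # non-zero => skip
  }
  exit 0;        # zero => run the test(s)

And as a rough, simplified approximation of steps 4, 5 and 7b (the actual calls inside do_tests may differ; error handling, skipping, and the gathering of the test list in step 6 are omitted, and the values below are placeholders), the Slay::Makefile side of the processing amounts to:

  use Slay::Makefile;

  my $makefile = 'SlayMakefile';     # as passed to do_tests
  my $opts     = {};                 # the 'opts' option, if any
  my @tests    = ('algebra.ok');     # the list produced by step 6

  my $maker = Slay::Makefile->new($opts);
  $maker->parse($makefile);          # step 4: parse the SlayMakefile
  $maker->make('pretest');           # step 5: build the pretest target
  $maker->make($_) for @tests;       # step 7b: build each test target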