wtools
ESO waf tools
wtools is a library that extends waf with helpers and implementations of many default features. Specifically, it allows a user to declare a waf project and the corresponding modules in a simplified way.
wtools has built-in support for several languages and module types; see the table of supported module types below.
A project is made up of modules in an optional hierarchy of packages. wtools uses fully qualified names (FQN) for modules and packages, defined as the dot-separated path from the root leading to the package or module. This means that a project can have leaf nodes with the same name; e.g. a project could have the modules foo.bar and baz.bar without collisions.
Examples:

```
./foo/         # Module:  Name=foo
./bar/         # Package: Name=bar
./bar/baz      # Package: Name=bar.baz
./bar/baz/foo  # Module:  Name=bar.baz.foo
```
Note: When declaring dependencies on a module with the use variable, the FQN of the module must be used.
For setting up a new project, cf. New project.
waf has built-in help shown with the --help option:
$ waf --help
Before a package can be built it needs to be configured. In this step all external dependencies to the package are found.
$ waf configure [options]
To specify a build-time location use the -b <path> option:

$ waf configure -b /path/to/build

The prefix can also be specified without the -b <path> option by setting the PREFIX environment variable. Note, however, that if -b is specified, it takes precedence.
An important option that can be passed to configure is the build mode, specified with --mode <buildmode>, which sets the project in one of the supported modes, currently being:

- debug: (default) all the artefacts are prepared with debugging information enabled
- release: all the artefacts are prepared with release build optimizations
- coverage: all the artefacts are prepared with coverage instrumentation and tests are run with coverage instrumentation enabled

To build the project run:

$ waf build [options]
or simply
$ waf
Note: Vanilla waf also runs unit tests as part of the build command, but when using wtools this has been changed so that no tests are executed during build. The wtools equivalent of waf build is waf test, which builds and runs the unit tests.
$ waf test
Additional flags can be passed by the user to the tests if needed. The flags depend on the programming language used: different test runners are used for different languages, and a flag understood by one runner may make another runner fail. Passing flags usually only makes sense when running the command at module level, since certain flags (e.g. test-filtering ones) apply to a single module. The currently implemented variables are:
Specifically for C++ unit tests: CPPTESTOPT, e.g.:

$ CPPTESTOPT='--gmock_verbose=info --gtest_filter=Test1' waf test

Specifically for Java unit tests: JAVATESTOPT, e.g.:

$ JAVATESTOPT='-groups pippo' waf test

Specifically for Python unit tests: PYTESTOPT, e.g.:

$ PYTESTOPT='--verbosity=42' waf test
To run tests with memory leak checking, pass the --valgrind option as well:
$ waf test --valgrind
Running with --valgrind will generate XML files containing the valgrind findings. These files can then be examined and published in a CI environment. The files have the same name as the test executed, with a .valgrind file extension.
It is also possible to pass additional flags to the valgrind execution using the VALGRINDOPT
environment variable. The passed flags will be appended to the standard options.
$ VALGRINDOPT='--track-origins=yes' waf test --valgrind
By default, successful tests are not run again until their inputs change. To force execution of all tests, run with the --alltests argument:
$ waf test --alltests
Note that the test command will also update the necessary binaries, like the build command, so there is no need to run build first; i.e. run waf test instead of waf build test.
When tests are run and the configuration mode has been set to coverage, additional information about test code coverage will be generated. The generated information depends on the programming language of the artefact:

- C/C++: artefacts have been built with GCOV, so when executed the executables will generate .gcno/.gcda files that can be examined with tools such as gcov or gcovr present in the development environment. wtools will also automatically execute the gcovr tool at the end of the test runs to generate an HTML coverage report placed in a file named coverage.html in the test execution directory.
- Python: tests, executed via nosetests, will be instructed to generate coverage information and store it in the build directories, both in an XML file named coverage.xml and in an HTML hierarchy under the subdirectory cover/.
- Java: tests, executed via TestNG, will additionally be executed with JaCoCo as agent. This generates a jacoco.exec binary file that contains the coverage information. Additionally, wtools will automatically convert the binary information to HTML reports using jacococli and place them in the test execution directory under a directory called jacoco.

When tests are run in coverage mode an additional module parameter coverage_opt (a list of strings) is taken into account and its contents are added to the coverage test run. How these additional parameters are passed is language and test runner specific.
Example for Java: the two strings in the coverage_opt list will be appended, comma separated, to the JaCoCo agent invocation.
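A minimal sketch of what such a module declaration could look like. The module name and option strings are illustrative, and the exact declare_jar signature is an assumption; only coverage_opt and declare_jar themselves come from the text (includes/excludes are real JaCoCo agent options):

```python
# Hypothetical module wscript; everything here is illustrative except
# the coverage_opt parameter described in the text.
from wtools.module import declare_jar

declare_jar(
    target='myJavaModule',
    # Appended, comma separated, to the JaCoCo agent invocation:
    coverage_opt=['includes=com.example.*', 'excludes=*Test'],
)
```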
$ waf lint
Runs the linter tools at the current level.
By default the linter tool will not run for up-to-date targets until their inputs change. To force linting of all targets, run with the --lintall option:
$ waf lint --lintall
Note that the lint command will also update the necessary binaries, like the build command, so there is no need to run build first; i.e. run waf lint instead of waf build lint.
$ waf check
Executes all code verification commands (currently waf test and waf lint).
Note that the same arguments for those commands can be used here.
$ waf eclipse
Generate the Eclipse IDE project and support files from waf wscripts.
This simplifies the usage of the Eclipse IDE by automatically adding various waf calls (to configure, build and clean for example) and by automatically configuring search paths for dependencies.
Support files for C/C++, Python and Java are generated. The execution of the command will overwrite previously present Eclipse configuration files. The command can be run multiple times if the source tree changes.
$ waf install
Install the built artefacts to a destination directory structure. The default destination directory is /usr/local. The destination directory can be changed by defining the $PREFIX variable at the configuration step:
$ PREFIX=/home/user/my/destination/dir waf configure
In the destination directory a UNIX-like directory structure will be created (bin/ for binaries, lib64/ for 64-bit libraries, lib/python-3.5 for Python modules, include/ for C/C++ includes, and so on) and populated with the respective artefacts generated during the build phase.
To effectively use the artefacts from the destination directory the user should set, most likely via LMOD configuration, the correct environment variables to point to this directory structure. These variables include for example:

- PATH, to find the executables without having to specify the whole path to the binary, should include $PREFIX/bin
- LD_LIBRARY_PATH, to find the dynamically linked libraries, should include $PREFIX/lib64
- PYTHONPATH, to find the Python modules, should include $PREFIX/lib/python-3.5/site-packages
- CLASSPATH, to find the generated JARs, should include $PREFIX/lib and, if needed, the JAR files explicitly installed there

$ waf docs
Will generate the doxygen documentation for the project. If a file doxy.conf is found in the root of the project tree then it will be used as the doxygen configuration. If a configuration file is not found, one will be generated automatically with a standard set of configuration options. The generated file can be customized and stored together with the project for future use.
There are four variants of build scripts when using wtools.

- Project declaration: top-level script that declares the project using wtools.project.declare_project.
- Package declaration (optional): intermediate namespace levels that group modules by using wtools.package.declare_package.
- Module declaration: module-level build script that declares what type of primary artefact to create by using wtools.module.
- Recurse script: recursion-only script that descends into subdirectories using wtools.recurse (see below).
The structure for a package looks like this. At the root, the top-level wscript declares the package and the directories to recurse into:

```
.
`-- wscript
```

Module directories contain a single wscript in the module root level:

```
[<package>]
|-- <module>
|   `-- wscript   # module wscript
|-- ...
`-- wscript       # package or recurse wscript
```

Recursion directories can be used to collect modules in logical groups, and can appear between the top level and a module:

```
top
`-- <recursion>        # optional recursion directory
    |-- <recursion>    # which can contain other recursion directories
    |-- <module>       # or module(s)
    `-- wscript        # recursion wscript
```
The top-level wscript is used to define the project name, version, required features and the location of all of the project's modules and packages. For this purpose, the declare_project method is used. It is also the place where changes to the configure method should be made. The following code is an example of a project declaration:
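A sketch of such a top-level wscript. Only declare_project, the configure hook and check_cfg come from the text; the project name, version, the exact declare_project signature and the cfitsio dependency are illustrative assumptions:

```python
# Hypothetical top-level wscript; the declare_project arguments are an
# assumption based on the text (name, version, module locations).
from wtools.project import declare_project

def configure(cnf):
    # External dependencies are declared here, before the project
    # declaration (see the Dependencies section).
    cnf.check_cfg(package='cfitsio', args='--cflags --libs',
                  uselib_store='cfitsio', mandatory=True)

declare_project('myProject', '1.0.0', recurse='src')
```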
In the aforementioned example, a configure method is declared that defines a set of external dependencies. More about dependencies can be found in the Dependencies section. It is important to note that the configure definition in the top-level script must be placed before the project declaration (done with declare_project as explained before); otherwise the wtools-internal configure will be fully overridden instead of just augmented.
See declare_package.
The name of the primary deliverable, the "target" of a module, must be declared by the user, since the fully qualified name of the module can be very unfriendly to interact with, e.g. as the name of a command-line tool.
The following module types are supported
Name | Description |
---|---|
declare_cprogram | C/C++ program |
declare_cshlib | C/C++ shared library |
declare_cstlib | C/C++ static library |
declare_qt5cprogram | Qt5 C/C++ program |
declare_qt5cshlib | Qt5 C++ library |
declare_cprotobuf | C++ protobuf shared (default) or static library |
declare_crtidds | C++ RTI-DDS shared (default) or static library |
declare_jar | Java Archive (jar) library |
declare_jrtidds | Java RTI-DDS (jar) library |
declare_jprotobuf | Java protobuf (jar) library |
declare_config | Configuration only module |
declare_pyprogram | Python program |
declare_pypackage | Python package |
declare_pyqt5program | Python program with Qt5 usage |
declare_pyqt5package | Python package with Qt5 usage |
declare_pyprotobuf | Python protobuf module |
declare_custom | Custom module (build function must be defined by hand) |
declare_malicd | MAL ICD module |
declare_malicd_topics | MAL ICD Topics module |
There are two kinds of dependencies that can be tracked by wtools: internal project dependencies and external dependencies. External dependencies can be third-party libraries or modules from other wtools projects.
Internal dependencies are described at module level with the argument use, which can be a space-separated list of fully qualified module names (cf. Modules and packages) like 'package1.foo bar', or a Python list ['package1.foo', 'bar'].
The following is an example where a C/C++ program fooBar has a dependency on the library bazBar in the same source tree.
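A sketch of what the fooBar module wscript could look like, assuming bazBar sits at the project root so that its FQN is simply bazBar:

```python
# Hypothetical module wscript for fooBar; 'use' must hold the FQN of
# bazBar, e.g. 'package1.bazBar' if it lives inside a package.
from wtools.module import declare_cprogram

declare_cprogram(target='fooBar', use='bazBar')
```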
In this case waf can figure out what to do with the dependency, since it knows what it is (because it is in the source tree).
If the dependency is outside the source tree, the information about it comes from different possible sources. There are three ways to declare those dependencies in the configure phase.
The first mechanism, check, is designed for C and C++. The developer declares a package dependency in the configure method of the project-level script, before the project definition, using check_cfg. The uselib_store parameter specifies the use-name used by the modules that want to use this dependency.
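A sketch of such a configure step, using the cfitsio package from the example below (check_cfg is standard waf and queries pkg-config by default):

```python
# Top-level wscript sketch: declare the external cfitsio dependency
# before the project declaration, storing it under use-name 'cfitsio'.
def configure(cnf):
    cnf.check_cfg(package='cfitsio', args='--cflags --libs',
                  uselib_store='cfitsio', mandatory=True)
```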
Then the dependency can be used in the module like in the example:
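For instance, a hypothetical module wscript consuming the dependency via its use-name (the target name is illustrative):

```python
from wtools.module import declare_cprogram

# 'cfitsio' is the use-name stored via uselib_store at configure time.
declare_cprogram(target='fooBar', use='cfitsio')
```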
The example resembles what the old automatic dependencies were doing: it requests the C and library flags for the cfitsio external package and stores them under the use-name cfitsio.
The mandatory parameter specifies whether this is a strict configuration requisite. If mandatory=False and the package is not found, the configuration step will still succeed; with mandatory=True, which is the default, it would fail.
Since wscript files are effectively Python code, a more compact solution can be used to check a whole list of dependencies. For example, to check for multiple C++ libraries, something like the following can be used:
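A sketch with illustrative library names:

```python
# Loop over a list of pkg-config packages and store each one under
# its own use-name.
def configure(cnf):
    for pkg in ['cfitsio', 'zlib', 'gsl']:
        cnf.check_cfg(package=pkg, args='--cflags --libs',
                      uselib_store=pkg, mandatory=True)
```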
The second mechanism is provided by wtools, and it allows declaring C/C++, Java and Python dependencies. The dependency description is kept in a wdep file, which contains the information needed to consume the dependency. By default, wtools looks for wdep files in /usr/share/wdep. It is possible to change the search path by setting the WDEPPATH environment variable. To define a dependency, the check_wdep method is used in a similar way to the check methods described in the previous subsection. The dependency should be declared in the configure method of the project-level script, before the project definition. For example:
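A sketch of such a declaration. The exact check_wdep signature is an assumption; the wdep name commons matches the text below:

```python
# Top-level wscript sketch: consume the 'commons' wdep file, storing
# it under the use-name 'commons'. The parameter names are assumed.
def configure(cnf):
    cnf.check_wdep(wdep_name='commons', uselib_store='commons',
                   mandatory=True)
```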
Here the dependency is stored under the use-name commons. If uselib_store is not provided, the uppercased module name will be used as the name. The mandatory parameter indicates whether the dependency is optional; the default is mandatory=True.

It is possible to generate wdep files for a module, so that it can easily be consumed by other projects. To generate a wdep file, the wdep feature needs to be added in the module declaration. The feature is automatically added for the following module types:
During the installation step, wdep files are installed in the ${PREFIX}/share/wdep directory.
Note: This is not recommended or best practice, because the paths are stored in the project and might not be the same when the same project is built on another host.
It is also possible to declare waf environment variables to make a dependency available during the build. These are not system environment variables (as in $LD_LIBRARY_PATH) but the internal environment used by waf. For example, setting the waf environment variable CLASSPATH_PROTOBUF makes a JAR available under the use-name PROTOBUF (the use-name is whatever follows CLASSPATH_). Similarly, the configuration can search for jacocoagent.jar in a list of directories and set the result in a variable so it can be used with the use-name JACOCO.
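A sketch of such a configure step. The JAR paths and directory lists are illustrative; find_file is standard waf and returns the full path of the first match:

```python
def configure(cnf):
    # Internal waf environment, not the process environment:
    # everything after CLASSPATH_ becomes the use-name (PROTOBUF).
    cnf.env.CLASSPATH_PROTOBUF = ['/opt/protobuf/java/protobuf.jar']

    # Search a list of directories for jacocoagent.jar and expose it
    # under the use-name JACOCO.
    jar = cnf.find_file('jacocoagent.jar',
                        ['/usr/share/java', '/opt/jacoco/lib'])
    cnf.env.CLASSPATH_JACOCO = [jar]
```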
To recurse from one set of directories to the next without declaring anything in the current level, use the wtools.recurse module.
Recurse takes a list or space separated string of patterns:
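A sketch of a recursion-only wscript; the exact call form and the directory patterns are assumptions:

```python
# Hypothetical recurse wscript: descend into the matching directories.
from wtools import recurse

recurse('module1 module2 lib*')
```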
Attribute | Type | Meaning |
---|---|---|
includes | strings/list | Specifies private include directories relative to the current script. |
export_includes | strings/list | Specifies public include directories relative to the current script. |
defines | strings/list | Specifies macro definition as 'KEY=VALUE' pairs. |
cxxflags | strings/list | Specifies C++ compiler flags, e.g. '-Wall'. |
cflags | strings/list | Like cxxflags but for C. |
See the waf documentation for more attributes: https://waf.io/book/.
Note: 'foo bar' is equivalent to ['foo', 'bar'].

Example: Change of include directory
wtools will almost always provide sane defaults if you follow the standard conventions. However, the user can provide their own attributes where it is necessary to customize this behaviour. One example is the case where a VLTSW module is also used in ELT, using different build systems. The user can then provide customized locations for the headers. For example, the VLTSW module structure has both public and private headers in the directory <root>/include, not separated as specified for wtools, where public headers are located in <src>/include and private headers in <src>. In this case we simply override the defaults and tell wtools where the headers are located. The export_includes attribute is used to forward the include directory to dependent modules as well as taking care of installing them.
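A sketch of such an override; the module name and type are illustrative:

```python
# Hypothetical module wscript for a VLTSW-style layout where all
# headers live in ./include relative to this script.
from wtools.module import declare_cshlib

declare_cshlib(
    target='myLib',
    includes='include',          # private headers for this module
    export_includes='include',   # public headers, visible to dependents
)
```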