wtools  3.2.0-pre1
ESO waf tools
Main Page

Introduction

wtools is a library that extends the Waf build system with additional tools and introduces the concepts of project, package, and module. This chapter provides basic information about Waf and a high-level description of how its functionality is extended by wtools.

Basics about Waf

Waf is an open-source build system written in Python. It can also be considered a framework for creating more specialized build systems. The main features provided by Waf are:

  • Running compilers and programs as distinct processes.
  • Tracking code changes and running build of only affected sources.
  • Computing build order and running processes in parallel.
  • Tracking external dependencies.
  • Designed for extensibility. Support for new languages and development tools can be added as extensions.

The documentation of Waf is available in the form of the Waf Book and the API docs. In the subjective opinion of the authors of this document, the Waf documentation doesn't cover all important topics and often doesn't help with finding a solution to a problem. Frequently, an effective way of finding answers and solutions is simply looking into the source code of Waf. Fortunately, for version 2.0.x the Waf framework sources have only about 5000 lines of code, which are quite readable and well commented.

Waf is a general-purpose build system/framework and is not tied to any language or ecosystem. Waf comes with a set of extensions called tools that provide support for the most common languages like C/C++, Java, and Python. The tools' usage documentation can be found in the source code, and examples can be found in the demos and playground folders of the Waf source tree. There is also a set of tools that are not yet part of the official Waf distribution and are in an evaluation state; those tools can be found in the extras folder.

From a user perspective, there are two important concepts introduced by Waf: the command and the script. A Waf command indicates an action that the user wants to perform on the project. Waf provides some fundamental predefined commands, the most important of which are:

  • configure - is used to configure an environment for build execution. At this step the project's external dependencies and the programs required for the build (like compilers) are located. It's mandatory to perform the configure step before the build step. The result of the configure step is persistent, and the configure step needs to be re-executed only if there is a change in the dependency list or project structure.
  • build - is used to execute compilers or generators that turn sources into targets. Waf tracks sources and targets, so if there are changes to the source code only the affected targets will be rebuilt. By default Waf will also try to execute the build in parallel if possible.
  • install - makes build results available for usage. Usually this means copying targets into designated locations of a file system. By convention, the installation location can be adjusted by setting the PREFIX variable at the configuration step. Specific modules can be defined as local, i.e. visible only to the project and not installed for general consumption, by explicitly setting the install_path attribute to None.
  • clean - is used to remove files and targets created during the build. It is worth mentioning that the clean step removes neither installed targets nor the result of the configure step.
  • distclean - is used to remove all files and targets created during the build as well as the result of the configure step.

Users can create new Waf commands and extend or replace existing ones. More details about Waf commands can be found in the chapter Usage.

The second important concept is the Waf script, called wscript. A wscript is a Python module that can contain any Python code; Waf recognizes and uses specific classes and functions defined in it. The goal of a wscript is to hold project-specific details. Usually wscripts are stored and versioned together with the source code of the project, in a file called wscript. A comprehensive guide on how to write wscripts can be found in the Waf Book, and information on how to write wscripts with the wtools extension can be found in the Writing wscripts chapter.

High-level overview on wtools

Waf provides a generic build framework and gives a lot of freedom in how a project is structured and builds are executed. This flexibility can bring additional complexity and confusion within a project. The goal of wtools is to constrain the project structure and hide some of Waf's complexity. The main feature introduced by wtools is the concept of project, package and module. This concept defines the tree structure of the project with the following types of nodes:

  • project - the root node and parent of all other nodes. There must be exactly one project node in the tree. The project wscript is called top-level, and it's the place where the user sets the project's name, version and required features. Feature is a term introduced by wtools; it's simply a set of tools that will be loaded by wtools. The top-level wscript is also the place where the user should put all settings for the configure step. More about project declaration can be found in the Project declaration section.
  • module - modules are the leaves of the tree. They contain the project's source code and tests. There are many types of modules, like cshlib for C/C++ shared libraries or pyprogram for Python programs. Modules provide a unified interface to build programs and libraries written in different languages. There can be one or more modules in the project tree, and a project can contain modules of many types. More about modules can be found in the Module declaration section.
  • package - a node whose goal is to encapsulate a group of modules and other packages. Packages can be considered the branches of the tree. They don't contain any code or tests. A package should contain one or more dependent packages or modules. Correct usage of packages avoids naming conflicts and makes navigating the project tree easier. More about packages can be found in the Package declaration section.

The directory tree should correspond to the project tree, where each node is a subdirectory. The naming convention for projects, packages and modules is that the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters.
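
This naming rule can be checked with a short regular expression. The sketch below is only an illustration and is not taken from the wtools sources:

```python
import re

# Name must start with a letter, then may contain letters, digits, "-" or "_".
NAME_RE = re.compile(r'^[A-Za-z][A-Za-z0-9_-]*$')

def is_valid_name(name):
    """Return True if `name` is a valid project/package/module name."""
    return NAME_RE.match(name) is not None

print(is_valid_name('module_a'))   # True
print(is_valid_name('2fast'))      # False: starts with a digit
```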

Example of project tree:

WTOOLS PROJECT TREE

              example_project
              /             \
      module_a             package_foo
                           /         \
                    module_b        package_bar
                                    /         \
                             module_c        module_d

And corresponding example of directory tree:

DIRECTORY TREE
example_project/
├── module_a
└── package_foo
    ├── module_b
    └── package_bar
        ├── module_c
        └── module_d

Besides introducing the concepts of project, package and module, wtools also provides additional tools and commands. The most important commands introduced by wtools are:

  • test - executes tests,
  • lint - executes static code analysis,
  • eclipse - generates an Eclipse IDE project.

Troubleshooting tips

Waf isn't a mainstream build system, and sometimes it's not possible to find answers or solutions to a problem on the Internet. In that case users are on their own, and some troubleshooting knowledge can be useful.

Configure step details

It is often useful to check the results of the configure step. The configure log is stored in the build/config.log file. Information about environment variables can be found in the build/c4che/_cache.py file. Please note that the build directory is located at the project-level node.

Logging level

It's possible to increase Waf verbosity by using -v, -vv or -vvv options.

$ waf -v

It's also possible to enable debug logging for a specific zone:

$ waf --zones=wtools,build

Information about the available zones can be found in the Logging section of the Waf Book. Additionally, wtools introduces the wtools zone.

Listing targets (taskgens)

It is possible to list all defined targets, which can be used to verify that a particular wscript has been parsed correctly and results in the expected targets being generated.

$ waf list

Source code

Sometimes it can be very useful to take a look at Waf and wtools source code.

Version used

When trying to understand or report an issue, it is very useful to include the versions of waf and wtools used. These can be obtained with the --wtools-version option, for example:

$ waf --wtools-version
waf 2.0.19 (e83405712e95b47c040763fdfa468c04dfe72e4b)
wtools 1.0.6-dev+19327

Frequently asked questions

It may happen that the problem is a common one and an answer can be found in the Frequently Asked Questions chapter of this document.

New project

For setting up a new project, see New project.

Usage

waf has built-in help shown with the --help option:

$ waf --help

Configuring

Before a package can be built it needs to be configured. In this step all external dependencies of the package are located.

$ waf configure [options]

To specify a build-time location use the -b <path> option:

$ waf configure -b /path/to/build

The prefix can also be specified without the -b <path> option by setting the PREFIX environment variable; note that if -b is specified, it takes precedence.

A series of environment variables is supported by waf to set common options for the whole project, for example the C++ compiler flags (variable CXXFLAGS), linker flags (LDFLAGS) or include paths (INCLUDES). These values can then be overridden per module using task attribute customization. A list of such environment variables, specific to the C/C++ language, can be found in Table 1 of the Waf Book. All these variables are taken into account at the waf configuration stage, therefore any change to them needs to be followed by a call to waf configure to become effective.
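
At module level, such overrides are plain task-generator attributes. A hedged sketch (the target name and flag values are illustrative; cxxflags and includes follow standard waf task-generator usage):

```python
# Module wscript: override the project-wide C++ flags for this module only.
from wtools import module

module.declare_cshlib(target='fastmath',   # illustrative target name
                      cxxflags=['-O3'],    # replaces the configured CXXFLAGS here
                      includes='include')
```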

An important option that can be passed to configure is the build mode, specified with --mode <buildmode>, which sets the project to one of the supported modes, currently:

  • debug: (default) all the artefacts are prepared with debugging information enabled
  • release: all the artefacts are prepared with release build optimizations
  • coverage: all the artefacts are prepared with coverage instrumentation, tests are run with coverage instrumentation enabled and a coverage report is created for each test.

For C/C++ modules, building with various types of sanitizers is supported using the --sanitize=<SANITIZER> option. More than one sanitizer can be specified at the same time, separated by commas. Some sanitizer combinations may not be supported by the compiler (e.g. gcc does not support using thread and address at the same time), but this is not checked by wtools as it is compiler specific.

$ waf configure --sanitize=address


$ waf configure --sanitize=address,leak,undefined

Currently supported sanitizers are:

  • address
  • leak
  • thread
  • undefined

Sanitizing is done by default on all C/C++ modules if this option is passed at configure time. To explicitly exclude a module from sanitizing, the boolean module-level option sanitize can be set to False.
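
Excluding a single module from sanitizing could then look like the sketch below (the target name is illustrative):

```python
# Module wscript: this library is never built with sanitizers,
# even if `waf configure --sanitize=...` was used.
from wtools import module

module.declare_cshlib(target='legacy', sanitize=False)
```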

Runtime options for the various sanitizers can be passed via environment variables (see SanitizerCommonFlags and the per-sanitizer pages that follow it) when running unit tests or running the artefacts in any other way.

Building

$ waf build [options]

or simply

$ waf

Note: Vanilla waf also runs unit tests as part of the build command, but with wtools this has been changed so that no tests are executed during build. The wtools equivalent of waf build is waf test, which builds and runs the unit tests.

Running tests

$ waf test

Additional flags can be passed by the user to the tests if needed. The flags depend on the programming language used, as different test runners are used for different languages and flags may not be understood generically by all of them, making the test runner fail. Passing flags mostly makes sense when running the command at module level, as certain flags (e.g. test-filtering ones) only apply to a single module. The currently implemented variables are:

  • Specifically for C++ unit tests: CPPTESTOPT, e.g.:

    $ CPPTESTOPT='--gmock_verbose=info --gtest_filter=Test1' waf test

  • Specifically for C++ unit tests using the catch2 test runner: CATCH2TOXML will tell catch2 to store the test results in a JUnit-compatible XML file instead of standard output (this is due to a current limitation of catch2 regarding multiple output streams), e.g.:

    $ CATCH2TOXML=1 waf test

  • Specifically for Java unit tests: JAVATESTOPT, e.g.:

    $ JAVATESTOPT='-groups pippo' waf test

  • Specifically for Python unit tests: PYTESTOPT, e.g.:

    $ PYTESTOPT='--verbosity=42' waf test

For some languages (currently C++ and Python), multiple test runners are supported by wtools; they can be selected by specifying the with_ut_lib attribute on the specific module. The currently supported test frameworks are:

  • C++:
    • Google test (with_ut_lib=gtest)
    • Qt test (with_ut_lib=qt5test)
    • Catch2 (with_ut_lib=catch2)
    • Google test with Google benchmark (with_ut_lib=gbench)
  • Python:
    • nosetests (with_ut_lib=nosetests)
    • pytest (with_ut_lib=pytest)
  • Java:
    • TestNG (always default)

It is important to note that in recent wtools versions the test frameworks used must be explicitly set in the requires attribute of the project definition.
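
Putting the two together, a sketch of selecting the Catch2 runner for one module could look as follows (project and module names are illustrative; the two snippets belong to two different wscript files):

```python
# Top-level wscript: the test framework must be listed in `requires`.
from wtools.project import declare_project
declare_project(name='example', version='1.0-dev',
                requires='cxx catch2',
                recurse='mod')

# mod/wscript: select the Catch2 runner for this module's unit tests.
from wtools import module
module.declare_cshlib(target='mod', with_ut_lib='catch2')
```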

If the unit test task requires attributes that are not exactly the same as those of the primary artifact, for example when additional libraries are needed only for the unit test and not for the actual module (in waf terms, the use for the test task differs from the one for the build task), these can be overridden for all languages by passing ut_attrs: a dictionary containing the attributes the test task should override, for example:

uses = '...'
declare_cshlib(..., use=uses, ut_attrs=dict(use=uses + ' extraTestLib'))

The Python pytest test runner also supports passing a custom configuration file to further customize the options provided by this tool. It can be passed either at project level via the pytest_config entry of the python attribute dict (as a path relative to the project root or an absolute path) or at module level (as a path relative to the module root) using the pytest_config attribute.
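
As a sketch, the two variants could look like this (paths and names are illustrative, and declare_pyprogram is inferred from the pyprogram module type mentioned earlier; the snippets belong to two different wscript files):

```python
# Top-level wscript: the pytest configuration file is passed in the python dict.
from wtools.project import declare_project
declare_project(name='example', version='1.0-dev',
                requires='python pytest',
                recurse='pymod',
                python=dict(pytest_config='config/pytest.ini'))

# pymod/wscript: the same attribute at module level, relative to the module root.
from wtools import module
module.declare_pyprogram(target='pymod', with_ut_lib='pytest',
                         pytest_config='pytest.ini')
```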

To run tests with memory leak checking, pass the --valgrind option as well:

$ waf test --valgrind

Running with --valgrind will generate XML files containing the valgrind findings. These files can then be examined and published in a CI environment. The files have the same name as the executed test, with a .valgrind file extension.

It is also possible to pass additional flags to the valgrind execution using the VALGRINDOPT environment variable. The passed flags will be appended to the standard options.

$ VALGRINDOPT='--track-origins=yes' waf test --valgrind

Valgrind suppression files (see also: https://wiki.wxwidgets.org/Valgrind_Suppression_File_Howto) are supported by passing a list of suppression files to the module declaration via the valgrind_suppression attribute. The files are either defined with absolute paths, if they start with the OS path separator, or are otherwise relative to the wscript file. This enables the usage of shared (project- or system-wide) valgrind suppression files as well as local ones.

declare_cshlib(...,
               valgrind_suppression=["test/local.supp",
                                     "/elt/mal/share/mal.supp",
                                     "../othermod/test/othermod.supp"])

By default, successful tests are not run again until their inputs change. To force execution of all tests, run with the argument --alltests.

$ waf test --alltests

Note that the test command will also update the necessary binaries, like the build command does, so there is no need to run build first; i.e. run waf test instead of waf build test.

Java tests are run through the TestNG test framework. A wrapper has been created that also redirects the standard output and standard error of the TestNG execution into two files, testng-stdout.txt and testng-stderr.txt, in each test subdirectory, to support test debugging. To disable this behavior and leave standard output and error unredirected, set the environment variable ETNGNOREDIR to some value. To only redirect to the files without also printing to standard output, set the environment variable ETNGNOOUTREDIR to some value.

Running tests in coverage mode

When tests are run and the configuration mode has been set to coverage additional information about test code coverage will be generated. The generated information depends on the programming language of the artifact:

  • C/C++: artefacts have been built with GCOV, and as a consequence the executables generate .gcno/.gcda files when executed, which can be examined with tools such as gcov or gcovr present in the development environment. wtools will also automatically execute the gcovr tool at the end of the test runs to generate HTML coverage reports, placed in a file named coverage.html (by default) in the test execution directory. Note that due to a limitation of gcovr, execution of unit tests may contribute to coverage outside the module, which may introduce indeterminism.
  • Python: tests, executed via nosetests, will be instructed to generate coverage information and store it in the build directories, both in an XML file named coverage.xml and in an HTML hierarchy under the subdirectory cover/.
  • Java: tests, executed via TestNG, will be additionally executed with JaCoCo as agent. This will generate a jacoco.exec binary file that contains the coverage information. Additionally wtools will automatically convert the binary information to HTML reports using jacococli and place them in the test execution directory under a directory called jacoco.

When tests are run in coverage mode, an additional module parameter coverage_opt (a list of strings) is taken into account, and its contents are added to the coverage test run. How these additional parameters are passed is language and test runner specific.

Example for Java:

declare_jar(target='jarEx', manifest='src/manifest', coverage_opt=['excludes=**/Identifiers', 'sessionid=12345666'])

The two strings in the coverage_opt list will be appended, comma separated, to the JaCoCo agent invocation.

Specifically for tests in C/C++, which therefore use gcovr to generate output, additional command line options are available:

  • --gcovr-no-generated: to not include generated code in results (by default they are included)
  • --gcovr-html-medium: to set the threshold for visualization in HTML reports for the medium level (default: 75)
  • --gcovr-html-high: to set the threshold for visualization in HTML reports for the high level (default: 90)

The same options can also be set at project level by adding them as parameters to the declare_project of the top-level wscript in the cxx options, for example:

declare_project(...,
                cxx=dict(...,
                         gcovr_no_generated=True,
                         gcovr_html_medium='42.3',
                         gcovr_html_high='69.7'),
                ...)

Running lint

$ waf lint

Runs linter tools at the current level of the project tree. Different linter tools are executed for different programming languages:

Linter tool   Programming language
clang-tidy    C/C++
checkstyle    Java
pylint        Python

The linter tools are run against both module sources and test sources.

By default, the linter tools use the configuration created for the ESO ELT project. To use a different configuration file, the options --clang-tidy-config, --checkstyle-config, and --pylint-config can be used during the project configuration phase. An absolute or relative path to the configuration file should be given as the parameter. Please note that in the case of clang-tidy the configuration file must be in YAML format. For example:

$ waf configure --clang-tidy-config=./alt_clang-tidy.yml

It is also possible to set configuration files and other options for the linter tools by passing appropriate arguments to the wtools.project.declare_project method. For example:

wtools.project.declare_project('waf-test', '0.1-dev',
                               recurse='cpp java python',
                               requires='cxx java python qt5 pyqt5 boost',
                               boost=dict(
                                   libs='program_options',
                                   force_static=False,
                               ),
                               cxx=dict(
                                   clang_tidy_config='./alt_clang-tidy.yml',
                               ),
                               java=dict(
                                   checkstyle_config='./alt_checks.xml'
                               ),
                               python=dict(
                                   pylint_config='./alt_pylintrc'
                               ))

By default, the linter tool will not run for up-to-date targets until their inputs change. To force execution on all targets, run with the option --lintall:

$ waf lint --lintall

Note that the lint command will also update the necessary binaries, like the build command does, so there is no need to run build first; i.e. run waf lint instead of waf build lint.

Additional configuration to the clang-tidy tool

The clang-tidy tool is by default configured to include diagnostics from headers in the same module as the source file being linted (using the includes attribute). This behavior may be changed by setting the regular expression for the header filter manually with the attributes clang_tidy_header_filter and/or clang_tidy_line_filter when declaring the module. These parameters will be passed directly as --header-filter and --line-filter options to the clang-tidy tool. More information about the usage of these options may be found in clang-tidy documentation.
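
A hedged sketch of a module declaration using these attributes (the target name and the regular expression are only illustrations):

```python
# Module wscript: widen clang-tidy diagnostics to this module's public headers.
from wtools import module

module.declare_cshlib(target='mymod',
                      clang_tidy_header_filter='.*/mymod/include/.*')
```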

Running all code verification commands

$ waf check

Executes all code verification commands (currently waf test and waf lint).

Note that the same arguments for those commands can be used here.

Generating Eclipse IDE projects

$ waf eclipse

Generate the Eclipse IDE project and support files from waf wscripts.

This simplifies the usage of the Eclipse IDE by automatically adding various waf calls (to configure, build and clean for example) and by automatically configuring search paths for dependencies.

Support files for C/C++, Python and Java are generated. The execution of the command will overwrite previously present Eclipse configuration files. The command can be run multiple times if the source tree changes.

Installing files

$ waf install

Install the built artefacts to a destination directory structure. The default destination directory is /usr/local. The destination directory can be changed by defining the $PREFIX variable at the configuration step:

$ PREFIX=/home/user/my/destination/dir waf configure

In the destination directory a UNIX-like directory structure will be created (bin/ for binaries, lib64/ for 64-bit libraries, lib/python-3.5 for Python modules, include/ for C/C++ includes and so on) and populated with the respective artefacts generated during the build phase.

To effectively use the artefacts from the destination directory, the user should set the correct environment variables (most likely via LMOD configuration) to point to this directory structure. These variables include for example:

  • PATH, to find the executables without having to specify the whole path to the binary, should include $PREFIX/bin
  • LD_LIBRARY_PATH, to find the dynamically linked libraries, should include $PREFIX/lib64
  • PYTHONPATH, to find the Python modules, should include $PREFIX/lib/python-3.5/site-packages
  • CLASSPATH, to find the generated JARs, should include $PREFIX/lib and possibly the installed JAR files explicitly
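
The mapping above can be derived mechanically from $PREFIX. A small sketch (the exact subdirectories, e.g. the Python version, are installation-specific):

```python
import os

def env_entries(prefix):
    """Map environment variables to the $PREFIX subdirectory each should include."""
    return {
        'PATH': os.path.join(prefix, 'bin'),
        'LD_LIBRARY_PATH': os.path.join(prefix, 'lib64'),
        'PYTHONPATH': os.path.join(prefix, 'lib', 'python-3.5', 'site-packages'),
        'CLASSPATH': os.path.join(prefix, 'lib'),
    }

for var, path in env_entries('/home/user/my/destination/dir').items():
    print(f'{var}={path}')
```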

Generating documentation

$ waf build --with-docs

This will generate the doxygen documentation for the project and build the sphinx documentation modules. If a file doxy.conf is found in the root of the project tree, it will be used as the doxygen configuration. If no configuration file is found, one will be generated automatically with a standard set of configuration options. The generated file can be customized and stored together with the project for future use.

In a similar way to install the documentation:

$ waf install --with-docs

Some options can be passed from project declaration to the Doxygen execution via a dictionary, as per example below:

wtools.project.declare_project('waf-test', '0.1-dev',
                               recurse='cpp java python',
                               doxygen=dict(
                                   install_path='${DOCDIR}/custom/docs'
                               ))

Current configurable options are:

  • install_path, the directory where to install the documentation (default $DOCDIR/doxygen)

Additionally, the configuration option --with-sphinx-formats can be used to filter the sphinx formats/builders that are executed, overriding the definitions in the respective declare_sphinx() modules. This can be useful to avoid generating types of documentation that require tools not available on the system, for example PDF files, which require a whole LaTeX environment (e.g. --with-sphinx-formats=html,man will allow, if requested by any module, generating the html and man formats but no others, such as latexpdf).

Auditing dependencies

$ waf audit
$ waf audit --target=java.jarExPb

This audits the dependencies in the project: all the defined dependencies, internal or external, are printed together with their status according to the audit, either ok or not found. For found dependencies, an indication of where the dependency comes from is also printed:

  • use_store, an explicit definition as a configuration option (usually therefore an external dependency)
  • taskgen, a dynamically generated task in the build process of the current project

A not found entry should prompt the user to consider whether a dependency is missing, for example due to a typo or a missing configuration in the project-level wscript. If the whole project builds, passes tests and works even with a not found dependency, then it is possible that the dependency is redundant or that the needed files are somehow picked up from a system directory.

An example output may look like:

Auditing use names for "java.jarEx-test" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarEx"
Checking use name `fofofof`      : not found (possible error)
Checking use name `java.jarEx`   : ok (taskgen)
Checking use name `JAVATEST`     : ok (use store)
-----------------------------------------------------------------------------------------------------------
Auditing use names for "java.jarExPb" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarExPb"
Checking use name `PROTOBUF`   : ok (use store)
-----------------------------------------------------------------------------------------------------------------
Auditing use names for "java.jarExPbDep" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarExPbDep"
Checking use name `java.jarExPb`   : ok (taskgen)
Checking use name `PROTOBUF`       : ok (use store)

In the first module we can see that a missing dependency fofofof is highlighted, while the other dependencies are all satisfied, either by an external configuration definition (PROTOBUF) or by internal dependencies.

Waf debugging shell

$ waf shell --targets=proj.level.mod

Spawns a shell with an environment very close to the one in which waf executes tasks.

This is useful for debugging failing tests or programs by hand, without the need to manually set up the environment (e.g. the various PATH variables pointing to the correct dependencies) so that things can actually run.

The taskgens to be used are passed via --targets (comma-separated if multiple). If not passed, the shell will include all the taskgens available at the current source level (i.e. the module or package where waf shell was run).

Features that want to expose something in waf shell should put that information in their taskgen; wafshell can then retrieve and handle it (see the linting example, where the lint tool puts the lint command lines in the taskgen and wafshell prints them if available).

In the executed wafshell, the tool provides some bash functions to ease access to the most common operations for the requested taskgens, such as:

  • Executing unit tests. This is done via the bash function UT_RUN_{taskgen}, which executes the unit tests for the given taskgen in a waf-like environment.
  • Executing linting. This is done via the bash function LINT_RUN_{taskgen}, which executes the linting operations for the given taskgen with waf-like parameters. Note: this bash function receives as parameters the files (or directories) to lint.

Enable build caching

Build times can be greatly reduced by using the wafcache extra, which tries to cache every artefact in a waf build, reducing the rebuilds needed between different users and projects whenever waf determines that the artefact inputs are exactly identical. The cache can be either on a local file system or in the cloud, supporting services such as AWS S3, Google Cloud and MinIO S3.

The wafcache extra tool can be enabled in wtools simply by exporting the environment variable WAFCACHE_ENABLE to some value during the configuration stage (and disabled by unsetting it).

A series of other environment variables defined by the standard wafcache extra controls the behaviour of the cache, for example whether it is on a local filesystem or remote. These variables are defined in the wafcache extra; see its documentation.

The default is to write the cache files to the user's home directory; be aware that in some cases (e.g. a shared NFS mount) this may not be very performant. Also, the default cache sizes are rather generous and could create problems with small partitions or mounts with quota limitations, so be aware of them and set the parameters accordingly.

Again: review the WAFCACHE variable values to get performance improvements without creating problems on your system. Using a personal and/or system-wide wafcache LMOD module file can be a good way to simplify and unify such settings.

wafcache usage statistics can be displayed after each waf run by setting the WAFCACHE_STATS environment variable to some value. In this case the statistics will be displayed after the execution of the command:

wafcache stats: requests: 42, hits: 36, ratio: 85.71%, writes: 6

Writing wscripts

There are four variants of build scripts when using wtools.

  1. Project declaration

    Top level script that declares the project using wtools.project.declare_project.

  2. Package declaration (optional)

    Intermediate namespace levels that group modules by using wtools.package.declare_package.

  3. Module declaration

    Module level build script that declares what type of primary artefact to create by using wtools.module.

  4. Recursing

    And finally, optional levels between top and module which simply recurse to the next levels by using wtools.recurse, without declaring a package namespace level.

The structure for a package looks like this:

At the root the top-level wscript declares the package and directories to recurse into

    .
    `-- wscript

Module directories contain a single wscript in the module root level:

    [<package>]
    |-- <module>
    |   `-- wscript     # module wscript
    |-- ...
    `-- wscript         # package or recurse wscript

Recursion directories can be used to collect modules in logical groups, and can appear between the top level and module.

    top
    `-- <recursion>         # optional recursion directory
        |-- <recursion>     # Which can contain other recursion directories
        |-- <module>        # or module(s)
        `-- wscript         # recursion wscript

Project declaration

The top-level wscript is used to define the project name, version, required features and the location of all the project's modules and packages. For this purpose, the declare_project method is used. It's also the place where changes to the configure method should be made. Please note that the following convention applies to the project name: the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters. The following code is an example of a project declaration:

# Top level wscript
from wtools.project import declare_project

def configure(cnf):
    """Configure external dependencies."""
    pkgs = 'CCfits cfitsio hiredis ifw_sia libzmq protobuf xerces-c yaml-cpp'
    for pkg in pkgs.split():
        cnf.check_cfg(package=pkg, uselib_store=pkg, args='--cflags --libs')
    cnf.check(lib='m', cflags='-Wall', defines=['var=foo'], uselib_store='m', mandatory=False)
    cnf.check_wdep(wdep_name='apache.commons', uselib_store='commons', mandatory=True)
    cnf.check_wdep(wdep_name='mal.base', uselib_store='base')
    cnf.check_wdep(wdep_name='mal.base-cxx-zpb', uselib_store='cxx-zpb')
    cnf.env.CLASSPATH_JACOCO = cnf.find_file('jacocoagent.jar',
                                             ['/opt/jacoco/lib/', '/usr/local/lib', '/opt/java/jars'])

declare_project(name='project-name',
                version='1.0-dev',
                requires='cxx qt5 python java gtest testng nosetests',  # required features the project needs
                recurse='pkg1 pkg2')  # recurse into pkg1 and pkg2

In the aforementioned example, a configure method is declared that defines a set of external dependencies. More about dependencies can be found in the Dependencies section. It's important to notice that the configure definition in the top-level script must be placed before the project declaration (done with declare_project as explained before), otherwise the wtools internal configure will be fully overridden instead of just augmented.

Package declaration

# Package level wscript
from wtools import package
package.declare_package(recurse='*')

See declare_package. Please note that the following convention applies to the package name: the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters.

Module declaration

The name of the primary deliverable, the "target" of a module, must be declared by the user, since the fully qualified name of the module can be very unfriendly to interact with, e.g. from a command line tool.

# Module level wscript
from wtools import module
module.declare_cprogram(target='example')

The following module types are supported:

Name                   Description
declare_cprogram       C/C++ program
declare_cshlib         C/C++ shared library
declare_cstlib         C/C++ static library
declare_clib           C/C++ shared and static library
declare_cobjects       Collection of C/C++ source and/or header files
declare_qt5cprogram    Qt5 C/C++ program
declare_qt5cshlib      Qt5 C++ library
declare_cprotobuf      C++ protobuf shared (default) or static library
declare_cfastdds       C++ Fast DDS shared (default) or static library
declare_jar            Java Archive (jar) library
declare_jprotobuf      Java protobuf (jar) library
declare_config         Configuration-only module
declare_pyprogram      Python program
declare_pypackage      Python package
declare_pyqt5program   Python program with Qt5 usage
declare_pyqt5package   Python package with Qt5 usage
declare_pyprotobuf     Python protobuf module
declare_custom         Custom module (build function must be defined by hand)
declare_malicd         MAL ICD module
declare_malicd_topics  MAL ICD Topics module
declare_sphinx         Sphinx documentation module

Wtools uses fully qualified names (FQN) for modules and packages, defined as the dot-separated path from the root leading to the package or module. This means that a project can have leaf nodes with the same name, e.g. a project could have the modules foo.bar and baz.bar without collisions.

Examples:

./foo/        # Module:  Name=foo
./bar/        # Package: Name=bar
./bar/baz     # Package: Name=bar.baz
./bar/baz/foo # Module:  Name=bar.baz.foo

Note:

  • When declaring dependencies to a module with the use variable the FQN of the module must be used.
  • The module name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters.
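The FQN and naming rules above can be sketched in plain Python. These are hypothetical helpers for illustration only; wtools' actual implementation may differ.

```python
import re

# A name must start with a letter and may then contain letters,
# numbers, '-' or '_' (the convention stated in the note above).
NAME_RE = re.compile(r'^[A-Za-z][A-Za-z0-9_-]*$')

def is_valid_name(name):
    """Return True if the name follows the wtools naming convention."""
    return bool(NAME_RE.match(name))

def fqn_from_path(relpath):
    """Derive the fully qualified name from a path relative to the
    project root, e.g. 'bar/baz/foo' -> 'bar.baz.foo'."""
    return relpath.strip('/').replace('/', '.')

print(fqn_from_path('bar/baz/foo'))  # bar.baz.foo
print(is_valid_name('my-module_2'))  # True
print(is_valid_name('2fast'))        # False
```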

Dependencies

There are two kinds of dependencies that can be tracked by wtools: internal project dependencies and external dependencies. External dependencies can be third-party libraries or modules from other wtools projects.

Project internal dependencies

Internal dependencies are described on module level with the argument use, which can be a space-separated list of fully qualified module names (c.f. Module and packages) like 'package1.foo bar', or a Python list ['package1.foo', 'bar'].

The following is an example where a C/C++ program fooBar has a dependency on the library bazBar in the same source tree.

# fooBar/wscript
from wtools import module
module.declare_cprogram(target='fooBar', use='bazBar')

# bazBar/wscript
from wtools import module
module.declare_cshlib(target='bazBar')

In this case waf can figure out what to do with the dependency since it knows what it is (because it's in the source tree).

External dependencies

If the dependency is outside the source tree the information about it comes from different possible sources. There are three ways to declare those dependencies in the configure phase.

Pkg-config method for C/C++

The first mechanism, check_cfg, is designed for C and C++. The developer declares a package dependency in the configure method of the project-level script, before the project definition, using check_cfg. The uselib_store parameter specifies the use name under which modules can consume this dependency.

# Top level wscript
from wtools.project import declare_project

def configure(cnf):
    """Configure external dependencies."""
    cnf.check_cfg(package='cfitsio', uselib_store='cfitsio', args='--cflags --libs', mandatory=False)

declare_project(name='project-name',
                version='1.0-dev',
                requires='cxx'
                ...)

Then the dependency can be used in the module like in the example:

# fooBar/wscript
from wtools import module
module.declare_cprogram(target='fooBar', use='cfitsio m')

The example resembles what the old automatic dependencies were doing: it requests the compiler and linker flags for the cfitsio external package and stores them under the use name cfitsio.

The mandatory parameter specifies if this is a strict configuration requisite or not. If mandatory=False and if the package is not found the configuration step will still succeed. With mandatory=True, which is the default, it would fail.

Since a wscript is effectively Python code, a more compact solution can be used to check a whole list of dependencies. For example, to check for multiple C++ libraries, something like the following can be used:

from wtools.project import declare_project

def configure(cnf):
    pkgs = 'CCfits cfitsio hiredis libczmq libzmq protobuf xerces-c yaml-cpp'
    for pkg in pkgs.split():
        cnf.check_cfg(package=pkg, uselib_store=pkg, args='--cflags --libs')

declare_project(name='project-name',
                version='1.0-dev',
                ...)

Wdep files for C/C++, Java and Python

The second mechanism is provided by wtools and allows declaring C/C++, Java and Python dependencies. The dependency description is kept in a wdep file, which contains the information needed to consume the dependency. The search path for wdep files can be set via the WDEPPATH environment variable. To define a dependency, the check_wdep method is used, in a similar way to the check methods described in the previous subsection. The dependency should be declared in the configure method of the project-level script, before the project definition. For example:

# Top level wscript
from wtools.project import declare_project

def configure(cnf):
    """Configure external dependencies."""
    # 1.
    cnf.check_wdep(wdep_name='commons.libio', uselib_store='commons', mandatory=True)

declare_project(name='project-name',
                version='1.0-dev',
                requires='java'
                ...)

# Module level wscript
from wtools import module
module.declare_jar(target='fooBar', use='commons')
  1. In the example, wtools will, during the configuration step, look for the file commons.libio.wdep and configure the project to consume the dependency. The uselib_store parameter indicates that the results will be stored under the name commons. If uselib_store is not provided, the uppercased module name is used instead. The mandatory parameter indicates whether the dependency is optional; the default is mandatory=True.

It's possible to generate wdep files for a module so that it can be easily consumed by other projects. To generate a wdep file, the wdep feature needs to be added in the module declaration. The feature is automatically added for the following module types:

  • cshlib
  • cstlib
  • qt5shlib
  • jar
  • pypackage

During the installation step, wdep files are installed in ${PREFIX}/share/wdep directory.
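Conceptually, resolving a wdep file amounts to a search along the WDEPPATH directories followed by the install location. The following standalone sketch illustrates that search order; find_wdep is a hypothetical helper for illustration, not the wtools API.

```python
import os

def find_wdep(name, wdeppath=None, prefix='/usr/local'):
    """Locate '<name>.wdep' in the directories listed in WDEPPATH
    (os.pathsep-separated), falling back to the install location
    '<prefix>/share/wdep'.  Returns the first match or None."""
    raw = wdeppath if wdeppath is not None else os.environ.get('WDEPPATH', '')
    dirs = [d for d in raw.split(os.pathsep) if d]
    dirs.append(os.path.join(prefix, 'share', 'wdep'))
    for d in dirs:
        candidate = os.path.join(d, name + '.wdep')
        if os.path.isfile(candidate):
            return candidate
    return None
```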

Defining environment variables

Note: This is not recommended or best practice, because the paths are stored in the project and might not be the same when the same project is built on another host.

It is also possible to declare waf environment variables to make a dependency available during the build. These are not system environment variables (such as $LD_LIBRARY_PATH) but the internal environment used by waf.

For example:

# Top level wscript
from wtools.project import declare_project

def configure(cnf):
    """Configure external dependencies."""
    # 1.
    cnf.env.CLASSPATH_PROTOBUF = '/opt/protobuf-3.2.0/java/protobuf-java-3.2.0.jar'
    # 2.
    cnf.env.CLASSPATH_JACOCO = cnf.find_file('jacocoagent.jar', ['/opt/jacoco/lib/', '/usr/local/lib', '/opt/java/jars'])

declare_project(name='project-name',
                version='1.0-dev',
                requires='java'
                ...)

# Module level wscript
from wtools import module
module.declare_jar(target='fooBar', use='PROTOBUF JACOCO')
  1. Defines a classpath to a JAR on the system for the use name PROTOBUF (the part following CLASSPATH_).
  2. Searches for a specific JAR named jacocoagent.jar in a list of directories and stores it in the variable, so it can be used with the use name JACOCO.
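The convention above, where the use name is whatever follows the variable prefix, can be illustrated with a tiny hypothetical helper (not part of wtools):

```python
def use_name_from_env_key(key):
    """Extract the use name from a waf environment variable key,
    e.g. 'CLASSPATH_PROTOBUF' -> 'PROTOBUF'.  The part before the
    first '_' is the variable prefix (CLASSPATH, LINKFLAGS, ...)."""
    _prefix, _sep, use = key.partition('_')
    return use

print(use_name_from_env_key('CLASSPATH_JACOCO'))    # JACOCO
print(use_name_from_env_key('CLASSPATH_PROTOBUF'))  # PROTOBUF
```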

Recursing

To recurse from one set of directories to the next without declaring anything at the current level, use the wtools.recurse module.

Recurse takes a list or space separated string of patterns:

from wtools import recurse
recurse.recurse('*')

Attributes customization

Attribute        Type          Meaning
includes         strings/list  Private include directories relative to the current script.
export_includes  strings/list  Public include directories relative to the current script.
defines          strings/list  Macro definitions as 'KEY=VALUE' pairs.
cxxflags         strings/list  C++ compiler flags, e.g. '-Wall'.
cflags           strings/list  Like cxxflags, but for C.
install_path     string/None   Path where to install artefacts, or None to request that nothing is installed (i.e. an internal module).

See the waf documentation for more attributes: https://waf.io/book/.

Note
strings/list can either be a space-separated string or a Python list of strings. E.g. 'foo bar' is equivalent to ['foo', 'bar'] .
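Waf itself ships a similar helper (waflib.Utils.to_list); the following is a minimal standalone sketch of the strings/list convention:

```python
def to_list(value):
    """Normalize an attribute value: a space-separated string is
    split into a list, a list is returned as-is (copied)."""
    if isinstance(value, str):
        return value.split()
    return list(value)

# 'foo bar' and ['foo', 'bar'] are equivalent, as the note states.
assert to_list('foo bar') == to_list(['foo', 'bar']) == ['foo', 'bar']
```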

Example: Change of include directory

wtools will almost always provide sane defaults if you follow the standard conventions. However, the user can provide their own attributes in cases where it's necessary to customize this behavior.

One example is the case when a VLTSW module is also used in ELT using different build systems. The user can then provide customized locations for where headers are located. For example, the VLTSW module structure has both public and private headers in the directory <root>/include, not separated as specified for wtools, where public headers are located in <src>/include and private headers in <src>. In this case we simply override the defaults and tell wtools where the headers are located. The export_includes attribute is used to forward the include directory to dependent modules as well as taking care of installing them.

# bazBar/wscript
from wtools import module

module.declare_cshlib(target='bazBar',
                      includes='includes',
                      export_includes='includes')

Frequently Asked Questions

My build randomly fails on the install step.

Sometimes it happens that two or more modules try to install the same file. You can use the waf -v option on the install step to check if that's the case.

$ waf install -v
+ install ...
....
* Node /INTROOT/include/m1uiDesign.h is created more than once (full message on 'waf -v -v'). The task generators are:
1. '' in /wtools/test/robot/project/cpp_pkg/mymodule/mymoduleQtlib
2. '' in /wtools/test/robot/project/cpp_pkg/mymodule/mymoduleQtplugin
If you think that this is an error, set no_errcheck_out on the task instance

If you get a similar output, you should double-check that you don't create a file with the same name and path from two different modules.
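What the check amounts to can be illustrated with a hypothetical helper (not part of wtools) that reports files claimed by more than one module:

```python
from collections import defaultdict

def duplicate_installs(module_files):
    """Given a mapping of module name -> installed file paths, return
    the files claimed by more than one module, i.e. the situation waf
    complains about in the output above."""
    owners = defaultdict(list)
    for module, files in module_files.items():
        for path in files:
            owners[path].append(module)
    return {path: mods for path, mods in owners.items() if len(mods) > 1}

print(duplicate_installs({
    'mymoduleQtlib':    ['include/m1uiDesign.h', 'lib/libqt.so'],
    'mymoduleQtplugin': ['include/m1uiDesign.h'],
}))  # {'include/m1uiDesign.h': ['mymoduleQtlib', 'mymoduleQtplugin']}
```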

Is it allowed to build a subtree of the project?

Yes, it's allowed to build, lint, test or install a subtree of the project.

Is it allowed to configure a subtree of the project?

No, it's not allowed to configure a subtree of the project. Configuration settings should be put in the top-level wscript, and the configure command can be executed only on the project-level node for the whole tree.

Recent versions of wtools will also notify you of this, should you try to do it anyway:

error: not in project level directory: have you run 'waf configure' inside a module or package?

I customized the configure/build method but it seems that it is ignored.

Make sure that the configure/build method is defined before the call to declare_XYZ in the wscript and not after.

I get errors on execution of built programs after installation, what happened?

For example, a C++ artefact could produce something like:

./MyCoolProgram: error while loading shared libraries: libsuperUtils.so: cannot open shared object file: No such file or directory

While building and executing unit tests inside the waf framework, waf takes care of setting the needed environment variables, like LD_LIBRARY_PATH, CLASSPATH and PYTHONPATH, depending on the dependencies defined in the wscripts.

But once you install your artefacts, you have to take care of setting them yourself (by hand or via LMOD scripts) so your executables are able to retrieve all the necessary dependencies.

Is there a way to instruct waf to do the build not inside the project directory?

Straight out of waf --help:

-o OUT, --out=OUT build dir for the project

So remember: waf --help is your friend!

I added a configure method but waf does not seem to use it.

Did you rerun waf configure on your project level?

What are the automatic dependency warnings at configure stage?

When you run waf configure on your project you get at the end something like:

Automatic dependency 'foobar.bazbar.simFooBarBazLib' : not found
Project relies on automatic dependency search!
Automatic dependency usage is deprecated (ELTDEV-14) and will be removed in the future!
Please explicity check for dependencies in project configuration
Warning at least one automatic dependency is not satisfied!
Check the lines starting with "Automatic dependency" in the above logs.
Build may fail or be incomplete!
'configure' finished successfully (36.666s)

The warning is telling you that in one of your wscripts you are trying to set a dependency on foobar.bazbar.simFooBarBazLib, but this module is not found by waf. Therefore, if this is a vital dependency, the further steps of your build or test execution may fail, as the dependency may be missing if it is not brought in some other way, for example via a system library import mechanism (i.e. setting PYTHONPATH by hand).

The configuration will still succeed for compatibility with older releases of wtools, but this behavior will be removed in the future, as stated by the warning message and described in the referenced ticket.

What you should do is check where the dependency is defined (e.g. by searching with grep for occurrences of that use name) and make sure there isn't, for example, a typo in the dependency name, or that the dependency has not moved (either inside the project structure or to another project that now needs to be imported in your environment). In short, you should make sure the dependency you claim to be using is indeed satisfied by your setup.
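The grep-style search suggested above can be sketched as a small standalone script; wscripts_using is a hypothetical helper, not a wtools command:

```python
import os

def wscripts_using(root, use_name):
    """Walk the source tree and list the wscript files whose text
    mentions the given use name, similar to running grep -r over the
    project to locate where a dependency is declared."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if 'wscript' in filenames:
            path = os.path.join(dirpath, 'wscript')
            with open(path, encoding='utf-8', errors='replace') as fh:
                if use_name in fh.read():
                    hits.append(path)
    return hits
```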

Additionally, it is highly recommended to use the --audit command line option to audit dependencies, as described in the Auditing dependencies section.

Why do I get import errors on source files while executing Python tests?

While running the tests of a Python module with waf test, you may get a test failure due to missing imports, which may confuse you since the import refers to files in the source code and not in the test code. The output may, for example, look like:

[29/29] Processing utest: src/fcfclib/ActuatorSetup.py src/fcfclib/AdcSetup.py src/fcfclib/DevMgrSetup.py src/fcfclib/DrotSetup.py src/fcfclib/LampSetup.py src/fcfclib/MotorSetup.py src/fcfclib/PiezoSetup.py src/fcfclib/SetupCommand.py src/fcfclib/ShutterSetup.py src/fcfclib/__init__.py test/test_setupcommand.py
Waf: Leaving directory `/home/user/ifw-hl/fcf/build'
Output from test "/home/user/ifw-hl/fcf/devmgr_cii/fcfclib/src/fcfclib/ActuatorSetup.py" (exitcode: 1)
Failure: ImportError (generic_type: type "FutureString" referenced unknown base type "elt::mal::bindings::FutureBaseDelegate") ... ERROR

Here the ImportError refers to something inside a file contained in the src directory, not a test.

The explanation is two-fold:

  1. When unit tests are run that contain doctests (enabled by default for all Python modules except pyqt5 ones), the test runner will search, and therefore try to import, all the Python files under src as well. If you are not using doctests, you may want to disable them by passing with_doctest=False to your module declaration.
  2. Nevertheless, the test code automatically gets a use dependency on the source module. Therefore, if your test imports some module from the source, it may still trigger the same problem indirectly when the test uses the source part with the specific import.

All in all, this behaviour is usually an indication that the module is missing a use dependency (either a dependency on another module in the same project or on an external project) that would, in the worst case, only show up once the project is deployed.

It is therefore highly recommended to double-check for missing use dependency declarations in the module.

The build consumes a lot of memory, makes the machine swap and it is slow. What can I do?

The waf build system tries to optimize the build for speed and by default will spawn a number of parallel tasks equal to the number of CPUs on the machine. So for example if the machine has 8 CPUs, waf will execute up to 8 tasks (compilations, linkings, code generation, documentation and so on) at the same time.

If each of these tasks consumes a lot of memory, running them in parallel may cause the system to run out of memory and start accessing swap, which is usually much slower and will have a negative impact on the overall build time. A typical task where compilation takes huge amounts of memory is, for example, pybind11 usage for Python bindings for C++, which is heavily used as of now, for example in CII/MAL. The build system of course cannot know beforehand how many resources a specific task may use.

Should such a situation occur, the user can instruct waf to limit the number of parallel tasks using the -j <value> command line flag, where a value of 1 means of course that a single operation is done at a time. The optimal value strongly depends on the number of CPUs, the amount of RAM and the type of operations requested of the build system, and therefore cannot be universally suggested.
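As a back-of-the-envelope aid (an assumption, not a waf feature), one could derive a -j value from the available RAM and an estimate of the per-task memory footprint:

```python
import os

def suggested_jobs(ram_gib, gib_per_task, cpu_count=None):
    """Rough heuristic: cap the number of parallel tasks so their
    combined memory footprint stays within available RAM, never below
    1 and never above the CPU count."""
    cpus = cpu_count or os.cpu_count() or 1
    by_memory = max(1, int(ram_gib // gib_per_task))
    return min(cpus, by_memory)

# e.g. 8 CPUs, 16 GiB RAM, ~4 GiB per heavy pybind11 compile -> waf -j 4
print(suggested_jobs(16, 4, cpu_count=8))  # 4
```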

Glossary

Term                       Description
Tool                       A Waf extension that usually brings new functionality or support for a new language.
Target                     The result file or files of the build process.
Taskgen or task generator  For each target a task generator object is created. Task generators encapsulate the creation of tasks that are later executed during the build.
Feature                    In the wtools context, a set of tools. In the Waf context, a system for extending the behavior of task generators.
Command                    An action that a user performs during the build process, for example configure.