wtools
ESO waf tools
wtools is a library that extends the Waf build system with additional tools and introduces the concepts of project, package, and module. This chapter provides basic information about Waf and a high-level description of how its functionality is extended by wtools.
Waf is an open-source build system written in Python. It can also be considered a framework for creating more specialized build systems. The main features provided by Waf are:
The documentation of Waf is available in the form of the Waf Book and the API docs. In the subjective opinion of the authors of this document, the Waf documentation doesn't cover all important topics and often doesn't help with finding a solution to a problem. Frequently, the most effective way of finding answers and solutions is simply to look into the Waf source code. Fortunately, for version 2.0.x the Waf framework sources contain only about 5000 lines of code, which are quite readable and well commented.
Waf is a general-purpose build system/framework and is not tied to any language or ecosystem. Waf comes with a set of extensions, called tools, that provide support for the most common languages such as C/C++, Java, and Python. The tools' usage documentation can be found in the source code, and examples can be found in the demos and playground folders of the Waf source tree. There is also a set of tools that are not yet part of the official Waf distribution and are in an evaluation state; those tools can be found in the extras folder.
From a user perspective, there are two important concepts introduced by Waf: the command and the script. A Waf command indicates an action that the user wants to perform on the project. Waf provides some fundamental, predefined commands, the most important being:
Users can create new Waf commands and extend or replace existing ones. More details and information about Waf commands can be found in the chapter Usage.
The second important concept is the Waf script, called wscript. A wscript is a Python module that can contain any Python code; Waf recognizes and uses specific classes and functions defined in it. The goal of a wscript is to hold project-specific details. A wscript is usually stored and versioned together with the source code of the project, in a file called wscript. A comprehensive guide on how to write wscripts can be found in the Waf Book, and information on how to write wscripts with the wtools extension can be found in the Writing wscripts chapter.
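To give an idea of the shape of a plain Waf wscript, here is a minimal sketch (file names and the target are placeholders; wtools projects instead use the declare_* helpers described later):

top = '.'
out = 'build'

def configure(conf):
    # load the C compiler support shipped with Waf
    conf.load('compiler_c')

def build(bld):
    # build a small C program from one source file
    bld.program(source='main.c', target='hello')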
Waf provides a generic build framework and gives a lot of freedom in how a project is structured and how builds are executed. This flexibility can bring additional complexity and confusion within a project. The goal of wtools is to constrain the project structure and hide some of Waf's complexity. The main feature introduced by wtools is the concept of project, package and module. This concept defines the tree structure of the project with the following types of nodes:
The directory tree should correspond to the project tree, where each node is a subdirectory. The naming convention for projects, packages and modules is that the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters.
Example of project tree:
And corresponding example of directory tree:
Besides introducing the concepts of project, package and module, wtools also provides additional tools and commands. The most important commands introduced by wtools are:
Waf isn't a mainstream build system, and sometimes it's not possible to find answers or solutions to a problem on the Internet. In that case users are on their own, and some troubleshooting knowledge can be useful.
Configure step details
It is often useful to check the results of the configure step. The configure log is stored in the build/config.log file. Information about environment variables can be found in the build/c4che/_cache.py file. Please note that the build directory is located at the project-level node.
Logging level
It's possible to increase Waf verbosity by using -v, -vv or -vvv options.
It's also possible to enable debug logging for a specific zone:
Information about available zones can be found in the Logging section of the Waf Book. Additionally, wtools introduces the wtools zone.
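For example, to enable the wtools debug zone (using waf's standard --zones option):

$ waf build --zones=wtools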
Listing targets (taskgens)
It is possible to list all defined targets, which helps verify that a particular wscript has been parsed correctly and generates the expected targets.
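With vanilla waf this can be done with the list command:

$ waf list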
Source code
Sometimes it can be very useful to take a look at Waf and wtools source code.
Version used
When trying to understand or reporting an issue it is very useful to report the version of waf and wtools used. This can be done with the --wtools-version option, for example:
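$ waf --wtools-version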
Frequently asked questions
It could happen that the problem is common and an answer can be found in the Frequently Asked Questions chapter of this document.
For setting up a new project c.f. New project.
waf has built-in help shown with the --help option:
$ waf --help
Before a package can be built it needs to be configured. In this step all external dependencies to the package are found.
$ waf configure [options]
To specify a build-time location use the -b <path> option:

$ waf configure -b /path/to/build

The prefix can also be specified without the -b <path> option by setting the PREFIX environment variable. Note that if -b is specified, it takes precedence.
An important parameter that can be passed to configure is the build mode, specified with --mode <buildmode>, which sets the project in one of the supported modes, currently:
debug: (default) all artefacts are prepared with debugging information enabled
release: all artefacts are prepared with release build optimizations
coverage: all artefacts are prepared with coverage instrumentation, tests are run with coverage instrumentation enabled and a coverage report is created for each test.

For C/C++ modules, building with various types of sanitizers is supported using the --sanitize=<SANITIZER> option. More than one sanitizer can be specified at the same time, separated by commas. Some sanitizer combinations may not be supported by the compiler (e.g. gcc does not support using thread and address at the same time), but this is not checked by wtools as it is compiler specific.
$ waf configure --sanitize=address
$ waf configure --sanitize=address,leak,undefined
Currently supported sanitizers are:
address
leak
thread
undefined
Sanitizing is done by default on all C/C++ modules if this option is passed at configure time. To explicitly exclude sanitizer usage on a module, a boolean option sanitize is available at the module level and can be set to False to disable sanitization. Runtime options for the various sanitizers can be passed via environment variables (see SanitizerCommonFlags and the following per-sanitizer pages) when running unit tests or the artefacts in any other way.
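For example, a module could opt out like this (a sketch; the import path and target name are assumptions based on the module declarations described later):

from wtools.module import declare_cprogram

# disable sanitizer instrumentation for this module only
declare_cprogram(target='fooBar', sanitize=False)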
$ waf build [options]
or simply
$ waf
Note: Vanilla waf also runs unit tests as part of the build command, but when using wtools this has been changed so that no tests are executed during build. The wtools equivalent to waf build is waf test, which builds and runs the unit tests.
$ waf test
Additional flags can be passed by the user to the tests if needed. The flags depend on the programming language used, as different test runners are used for different languages and a flag may not be understood by all of them, making the test runner fail. Passing flags mostly makes sense when running the command at the module level, as certain flags (e.g. test filtering ones) only make sense for a single module. The currently implemented variables are:
Specifically for C++ unit tests: CPPTESTOPT, e.g.:

$ CPPTESTOPT='--gmock_verbose=info --gtest_filter=Test1' waf test
Specifically for C++ unit tests using the catch2 unit test runner: CATCH2TOXML will tell catch2 to store the test results in a JUnit-compatible XML file instead of on standard output (this is due to a current limitation of catch2 in supporting multiple output streams), e.g.:

$ CATCH2TOXML=1 waf test
Specifically for Java unit tests: JAVATESTOPT, e.g.:

$ JAVATESTOPT='-groups pippo' waf test
Specifically for Python unit tests: PYTESTOPT, e.g.:

$ PYTESTOPT='--verbosity=42' waf test
For some languages (currently C++) multiple test runners are supported by wtools; they can be selected by specifying the attribute with_ut_lib on the specific module. Currently supported test frameworks are:

Google Test (with_ut_lib=gtest)
Qt5 Test (with_ut_lib=qt5test)
Catch2 (with_ut_lib=catch2)
Google Benchmark (with_ut_lib=gbench)
nosetests (with_ut_lib=nosetests)
pytest (with_ut_lib=pytest)

It is important to notice that in recent wtools versions the test frameworks used must be explicitly set in the requires attribute of the project definition.
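For example, a module could select catch2 like this (a sketch; the import path and target name are illustrative):

from wtools.module import declare_cprogram

# select catch2 as the unit test framework for this module
declare_cprogram(target='fooBar', with_ut_lib='catch2')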
If the unit test task requires attributes that are not exactly the same as those of the primary artifact, for example when additional libraries are needed only for the unit tests and not for the actual module (in waf terms, the use for the test task differs from the one for the build task), these can be overridden for all languages by passing ut_attrs, a dictionary containing the attributes that the test task wants to override, for example:
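(Sketch; the module and use names are illustrative.)

from wtools.module import declare_cshlib

# the unit tests use an extra internal module that the library itself does not need
declare_cshlib(target='fooBar',
               use='package1.bazBar',
               ut_attrs={'use': 'package1.bazBar testHelpers.mocks'})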
To run tests with memory leak checking, pass the --valgrind option as well:
$ waf test --valgrind
Running with --valgrind will generate XML files containing the valgrind findings. These files can then be examined and published in a CI environment. The files have the same name as the test executed, with a .valgrind file extension.
It is also possible to pass additional flags to the valgrind execution using the VALGRINDOPT environment variable. The passed flags will be appended to the standard options.
$ VALGRINDOPT='--track-origins=yes' waf test --valgrind
By default, successful tests are not run again until their inputs change. To force execution of all tests, run with the argument --alltests.
$ waf test --alltests
Note that the test command will also update the necessary binaries, like the build command, so there is no need to run build first; i.e. run waf test instead of waf build test.
Java tests are run through the TestNG test framework. A wrapper has been created to redirect the standard output and standard error of the TestNG execution into two files, testng-stdout.txt and testng-stderr.txt, in each test subdirectory, to support test debugging. To disable this behavior and leave standard output and error unredirected, the user can set the environment variable ETNGNOREDIR to some value.
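For example:

$ ETNGNOREDIR=1 waf test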
When tests are run and the configuration mode has been set to coverage, additional information about test code coverage will be generated. The generated information depends on the programming language of the artifact:
C/C++: artefacts have been built with GCOV and as a consequence, when executed, the executables will generate .gcno/.gcda files that can be examined with tools such as gcov or gcovr present in the development environment. wtools will also automatically execute the gcovr tool at the end of the test runs to generate HTML coverage reports, placed in a file named by default coverage.html in the test execution directory. Note: due to a limitation of gcovr, execution of unit tests may contribute to coverage outside the module, which may introduce indeterminism.

Python: tests, executed via nosetests, will be instructed to generate coverage information and store it in the build directories, both in an XML file named coverage.xml and in an HTML hierarchy under the subdirectory cover/.

Java: tests, executed via TestNG, will additionally be executed with JaCoCo as agent. This generates a jacoco.exec binary file that contains the coverage information. Additionally, wtools will automatically convert the binary information to HTML reports using jacococli and place them in the test execution directory under a directory called jacoco.

When tests are run in coverage mode an additional module parameter coverage_opt (a list of strings) is taken into account and its contents are added to the coverage test run. How these additional parameters are passed is language and test runner specific.
Example for Java:
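(Sketch; the module name and JaCoCo option values are hypothetical.)

from wtools.module import declare_jar

# the two strings are forwarded to the JaCoCo agent invocation
declare_jar(target='jarEx',
            coverage_opt=['includes=org.example.*', 'excludes=org.example.generated.*'])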
The two strings in the coverage_opt list will be appended, comma separated, to the JaCoCo agent invocation.
$ waf lint
Runs linter tools at the current level of the project tree. Different linter tools are executed for different programming languages:
Linter tool | Programming language |
---|---|
clang-tidy | C/C++ |
checkstyle | Java |
pylint | Python |
Linter tool is run against module sources and test sources.
By default, the linter tools use the configuration created for the ESO ELT project. To use a different configuration file, the options --clang-tidy-config, --checkstyle-config, and --pylint-config can be used during the project configuration phase. An absolute or relative path to the configuration file should be passed as the parameter. Please note that in the case of clang-tidy the configuration file should be in YAML format. For example:
$ waf configure --clang-tidy-config=./alt_clang-tidy.yml
It is also possible to set configuration files and other options for the linter tools by passing appropriate arguments to the wtools.project.declare_project method. For example:
By default the linter tool will not run for up-to-date targets until their inputs change. To force an execution on all targets, run with the option --lintall:
$ waf lint --lintall
Note that the lint command will also update the necessary binaries, like the build command, so there is no need to run build first; i.e. run waf lint instead of waf build lint.
The clang-tidy tool is, by default, configured to analyze all headers in the project sources. This behavior may be changed by setting the cxx.clang_tidy_header_filter and/or cxx.clang_tidy_line_filter parameters of the wtools.project.declare_project method in the top-level wscript. These parameters are passed directly as the --header-filter and --line-filter options to the clang-tidy tool. More information about the usage of these options may be found in the clang-tidy documentation.
$ waf check
Executes all code verification commands (currently waf test and waf lint).
Note that the same arguments for those commands can be used here.
$ waf eclipse
Generate the Eclipse IDE project and support files from waf wscripts.
This simplifies the usage of the Eclipse IDE by automatically adding various waf calls (to configure, build and clean for example) and by automatically configuring search paths for dependencies.
Support files for C/C++, Python and Java are generated. The execution of the command will overwrite previously present Eclipse configuration files. The command can be run multiple times if the source tree changes.
$ waf install
Install the built artefacts to a destination directory structure. The default destination directory is /usr/local. The destination directory can be changed by defining the $PREFIX variable at the configuration step:
$ PREFIX=/home/user/my/destination/dir waf configure
In the destination directory a UNIX-like directory structure will be created (bin/ for binaries, lib64/ for 64-bit libraries, lib/python-3.5 for Python modules, include/ for C/C++ includes and so on) and populated with the respective artefacts generated during the build phase.
To effectively use the artefacts from the destination directory the user should set, most likely via LMOD configuration, the correct variables in the environment to point to this directory structure. These variables include for example:
PATH, to find the executables without having to specify the whole path to the binary, should include $PREFIX/bin
LD_LIBRARY_PATH, to find the dynamically linked libraries, should include $PREFIX/lib64
PYTHONPATH, to find the Python modules, should include $PREFIX/lib/python-3.5/site-packages
CLASSPATH, to find the generated JARs, should include $PREFIX/lib and eventually explicitly the JAR files installed there

$ waf build --with-docs
Will generate the doxygen documentation for the project and build the sphinx documentation modules. If a file doxy.conf is found in the root of the project tree, it will be used as the doxygen configuration. If a configuration file is not found, one will be generated automatically with a standard set of configuration options. The file can be customized and eventually stored together with the project for the future.
$ waf audit
$ waf audit --target=java.jarExPb
Will audit the dependencies in the project. This means that all the defined dependencies, internal or external, will be printed on the output together with their status according to the audit: either ok or not found. When found, an indication of where the dependency comes from is also printed:

use_store, an explicit definition as a configuration option (usually therefore an external dependency)
taskgen, a dynamically generated task in the build process of the current project

Seeing a not found should prompt the user to consider whether a dependency is missing, due for example to a typo or a missing configuration in the project-level wscript. If the whole project builds, passes tests and works even with a not found dependency, then it is possible that the dependency is redundant or that the needed files are somehow picked up from a system directory.
An example output may look like:
Auditing use names for "java.jarEx-test" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarEx"
Checking use name `fofofof` : not found (possible error)
Checking use name `java.jarEx` : ok (taskgen)
Checking use name `JAVATEST` : ok (use store)
-----------------------------------------------------------------------------------------------------------
Auditing use names for "java.jarExPb" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarExPb"
Checking use name `PROTOBUF` : ok (use store)
-----------------------------------------------------------------------------------------------------------------
Auditing use names for "java.jarExPbDep" declared in "/root/wtools/svn/wtools/test/robot/project/java/jarExPbDep"
Checking use name `java.jarExPb` : ok (taskgen)
Checking use name `PROTOBUF` : ok (use store)
In the first module we can see that a missing dependency fofofof is highlighted, while the other dependencies are all satisfied either by an external configuration definition (PROTOBUF) or by internal dependencies.
$ waf shell --targets=proj.level.mod
Spawns a shell that contains an environment very close to the one in which waf executes tasks.

This is useful for debugging failing tests or programs by hand, without the need to manually set up the environment (e.g. the various PATH variables pointing to the correct dependencies) so that things can actually run.

The taskgens to be used are passed via --targets (separated by commas if multiple). If not passed, the shell will include all the taskgens available at the current source structure (i.e. the module or package where waf shell was run).

Features that would like to expose something in waf shell should put that information in their taskgen; wafshell can then retrieve and handle it (see the linting example, where the lint tool puts the lint command lines in the taskgen and wafshell prints them if available).
In the executed wafshell, the tool provides some bash functions to ease access to the most used operations for the requested taskgens, such as:

UT_RUN_{taskgen}, which executes the unit tests for the given taskgen in a waf-like environment
LINT_RUN_{taskgen}, which executes the linting operations for the given taskgen with waf-like parameters. Note: this bash function receives as parameters the files (or directories) to lint.

Build times can be greatly improved by using the wafcache extra, which tries to cache every artefact in a waf build, reducing the needed rebuilds between different users and projects, should waf determine that the artefact inputs are exactly identical. The cache can be either on a local file system or in the cloud, supporting services such as AWS S3, Google Cloud and MinIO S3.
The wafcache extra tool can be enabled in wtools simply by exporting the environment variable WAFCACHE_ENABLE to some value (and disabled by unsetting it) during the configuration stage.
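For example:

$ WAFCACHE_ENABLE=1 waf configure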
A series of other environment variables defined in the standard wafcache extra control the behaviour of the cache, for example whether it is on a local filesystem or remote. Such variables are defined in the wafcache extra; see this link for the documentation.
The default is to write the cache files in the user directory, so be aware that in some cases (e.g. a shared NFS mount) this may not be very performant. Also, the default cache sizes are rather generous and could create problems with small partitions or mounts with quota limitations, so be aware of them and set the parameters accordingly.
Again: you must review the WAFCACHE variable values to gain performance improvements and avoid creating problems on your system. Using a personal and/or system-wide wafcache LMOD module file can be a good way to simplify and unify such settings.
There are four variants of build scripts when using wtools.
Project declaration
Top level script that declares the project using wtools.project.declare_project.
Package declaration (optional)
Intermediate namespace levels that group modules by using wtools.package.declare_package.
Module declaration
Module level build script that declares what type of primary artefact to create by using wtools.module.
The structure for a package looks like this:
At the root, the top-level wscript declares the package and the directories to recurse into:

.
`-- wscript
Module directories contain a single wscript in the module root level:

[<package>]
|-- <module>
|   `-- wscript    # module wscript
|-- ...
`-- wscript        # package or recurse wscript
Recursion directories can be used to collect modules in logical groups, and can appear between the top level and a module.

top
`-- <recursion>        # optional recursion directory
    |-- <recursion>    # which can contain other recursion directories
    |-- <module>       # or module(s)
    `-- wscript        # recursion wscript
The top-level wscript is used to define the project name, version, required features and the location of all the project's modules and packages. For this purpose, the declare_project method is used. It is also the place where changes to the configure method should be made. Please note that the following convention applies to the project name: the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters. The following code is an example of a project declaration:
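(Sketch; the project name, version and keyword arguments below are illustrative and should be checked against the wtools API docs.)

from wtools.project import declare_project

def configure(cnf):
    # external dependencies are declared here, before the project declaration
    cnf.check_cfg(package='cfitsio', uselib_store='cfitsio', args='--cflags --libs')
    cnf.check_cfg(package='gsl', uselib_store='gsl', args='--cflags --libs')

# keyword names below are assumptions; see the wtools API docs for the real signature
declare_project('example-project', '1.0.0',
                requires='cxx pytest',
                recurse='packageA moduleB')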
In the aforementioned example, a configure method is declared that defines a set of external dependencies. More about dependencies can be found in the Dependencies section. It is important to notice that the configure definition in the top-level script is placed before the project declaration (done with declare_project as explained before); otherwise the wtools internal one would be fully overridden instead of just augmented.
See declare_package. Please note that the following convention applies to the package name: the name has to start with a letter and can then contain letters, numbers, hyphen "-" or underscore "_" characters.
The name of the primary deliverable, the "target" of a module, must be declared by the user, since the fully qualified name of the module can be very unfriendly to interact with, e.g. as a command-line tool.
The following module types are supported
Name | Description |
---|---|
declare_cprogram | C/C++ program |
declare_cshlib | C/C++ shared library |
declare_cstlib | C/C++ static library |
declare_qt5cprogram | Qt5 C/C++ program |
declare_qt5cshlib | Qt5 C++ library |
declare_cprotobuf | C++ protobuf shared (default) or static library |
declare_crtidds | C++ RTI-DDS shared (default) or static library |
declare_jar | Java Archive (jar) library |
declare_jrtidds | Java RTI-DDS (jar) library |
declare_jprotobuf | Java protobuf (jar) library |
declare_config | Configuration only module |
declare_pyprogram | Python program |
declare_pypackage | Python package |
declare_pyqt5program | Python program with Qt5 usage |
declare_pyqt5package | Python package with Qt5 usage |
declare_pyprotobuf | Python protobuf module |
declare_custom | Custom module (build function must be defined by hand) |
declare_malicd | MAL ICD module |
declare_malicd_topics | MAL ICD Topics module |
declare_sphinx | Sphinx documentation module |
wtools uses fully qualified names (FQN) for modules and packages, defined as the dot-separated path from the root leading to the package or module. This means that a project can have leaf nodes with the same name, e.g. a project could have the modules foo.bar and baz.bar without collisions.
Examples:
./foo/          # Module:  Name=foo
./bar/          # Package: Name=bar
./bar/baz       # Package: Name=bar.baz
./bar/baz/foo   # Module:  Name=bar.baz.foo
Note: in the use variable the FQN of the module must be used.

There are two kinds of dependencies that can be tracked by wtools: internal project dependencies and external dependencies. External dependencies can be third-party libraries or modules from other wtools projects.
Internal dependencies are described on the module level with the argument use, which can be a space-separated list of fully qualified module names (c.f. Module and packages) like 'package1.foo bar', or a Python list ['package1.foo', 'bar'].
The following is an example where a C/C++ program fooBar has a dependency on the library bazBar in the same source tree.
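A sketch of the fooBar module wscript (the package prefix is illustrative):

from wtools.module import declare_cprogram

# depend on the bazBar library built elsewhere in the same project, referenced by its FQN
declare_cprogram(target='fooBar', use='package1.bazBar')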
In this case waf can figure out what to do with the dependency since it knows what it is (because it's in the source tree).
If the dependency is outside the source tree, the information about it comes from different possible sources. There are three ways to declare those dependencies in the configure phase.
The first mechanism, check, is designed for C and C++. The developer declares a package dependency in the configure method of the project-level script, before the project definition, using check_cfg. The uselib_store parameter specifies the use-name used by the modules which want to use this dependency.
Then the dependency can be used in the module like in the example:
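(Sketch; the consuming module is illustrative, check_cfg is the standard waf configuration helper.)

# project-level wscript: declare the external dependency during configure
def configure(cnf):
    cnf.check_cfg(package='cfitsio', uselib_store='cfitsio',
                  args='--cflags --libs', mandatory=True)

# module wscript: consume it through the use name
from wtools.module import declare_cprogram
declare_cprogram(target='fooBar', use='cfitsio')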
The example resembles what the old automatic dependencies were doing: it requests the C and library flags for the cfitsio external package and stores them in the use name cfitsio.
The mandatory parameter specifies whether this is a strict configuration requisite. If mandatory=False and the package is not found, the configuration step will still succeed. With mandatory=True, which is the default, it would fail.
Since a wscript is effectively Python code, a more compact solution can be used to check a list of dependencies. For example, to check for multiple C++ libraries something like the following can be used:
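(Sketch; the library names are illustrative.)

def configure(cnf):
    # check several pkg-config based libraries and store each under its own use name
    for pkg in ('cfitsio', 'gsl', 'zlib'):
        cnf.check_cfg(package=pkg, uselib_store=pkg, args='--cflags --libs')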
The second mechanism is provided by wtools and allows declaring C/C++, Java and Python dependencies. The dependency description is kept in a wdep file, which contains the information needed to consume the dependency. It is possible to specify the wdep file search path via the WDEPPATH environment variable. To define a dependency, the check_wdep method is used in a similar way to the check methods described in the previous subsection. The dependency should be declared in the configure method of the project-level script, before the project definition. For example:
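(Sketch; the wdep module name and the exact keyword names are assumptions, consult the wtools API docs for the real signature.)

def configure(cnf):
    # assumed parameter names; stores the dependency under the use name 'commons'
    cnf.check_wdep(wdep_name='server.commons', uselib_store='commons', mandatory=True)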
Here the dependency is stored under the use name commons. If uselib_store is not provided, the uppercased module name is used as the name. The mandatory parameter indicates whether the dependency is optional; the default is mandatory=True.

It is possible to generate wdep files for a module, so that it can easily be consumed by other projects. To generate a wdep file, the wdep feature needs to be added in the module declaration. The feature is automatically added for the following module types:
During the installation step, wdep files are installed in the ${PREFIX}/share/wdep directory.
Note: This is not recommended or best practice, because the paths are stored in the project and might not be the same when the same project is built on another host.
It is also possible to declare waf environment variables to make a dependency available during the build. These are not system environment variables (as in $LD_LIBRARY_PATH) but the internal environment used by waf.

For example:
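(Sketch; the jar locations are hypothetical, find_file is a standard waf configure helper.)

def configure(cnf):
    # make an external protobuf jar visible to Java modules under the use name PROTOBUF
    cnf.env.CLASSPATH_PROTOBUF = ['/opt/protobuf/java/protobuf.jar']
    # locate jacocoagent.jar in a list of candidate directories and expose it as JACOCO
    jar = cnf.find_file('jacocoagent.jar', ['/usr/share/java', '/opt/jacoco/lib'])
    cnf.env.CLASSPATH_JACOCO = [jar]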
The first definition sets the Java classpath for the use name PROTOBUF (what follows CLASSPATH_). The second definition looks for jacocoagent.jar in a list of directories and sets it in the variable so it can be used with the use name JACOCO.
To recurse from one set of directories to the next without declaring anything at the current level, use the wtools.recurse module. Recurse takes a list or a space-separated string of patterns:
Attribute | Type | Meaning |
---|---|---|
includes | strings/list | Specifies private include directories relative to the current script. |
export_includes | strings/list | Specifies public include directories relative to the current script. |
defines | strings/list | Specifies macro definitions as 'KEY=VALUE' pairs. |
cxxflags | strings/list | Specifies C++ compiler flags, e.g. '-Wall'. |
cflags | strings/list | Like cxxflags but for C. |
See the waf documentation for more attributes: https://waf.io/book/.
Note that a space-separated string 'foo bar' is equivalent to the list ['foo', 'bar'].

Example: Change of include directory
wtools will almost always provide sane defaults if you follow the standard conventions. However, the user can provide their own attributes in cases where it is necessary to customize this behavior.

One example is the case where a VLTSW module is also used in ELT with a different build system. The user can then provide customized locations for where the headers are located. For example, the VLTSW module structure has both public and private headers in the directory <root>/include, not separated as specified for wtools, where public headers are located in <src>/include and private headers in <src>. In this case we simply override the defaults and tell wtools where the headers are located. The export_includes attribute is used to forward the include directory to dependent modules as well as taking care of installing them.
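A sketch of such an override (the target name is illustrative):

from wtools.module import declare_cshlib

# public and private headers both live in <root>/include in this legacy layout
declare_cshlib(target='legacyLib',
               includes='include',
               export_includes='include')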
Sometimes it happens that two or more modules try to install the same file. You can use the waf -v option on the install step to check if that is the case.

If you get a similar output, you should double-check that you don't create a file with the same name and path from two different modules.
Yes, it's allowed to build, lint, test or install a subtree of the project.
No, it's not allowed to configure a subtree of the project. Configuration settings should be put in the top-level wscript, and the configure command can be executed only on the project-level node for the whole tree.

Recent versions of wtools will also notify you of this, should you try to do it anyway:
Make sure that the configure/build method is defined before the call to declare_XYZ in the wscript and not after.
To exemplify, one could get from a C++ artefact something like:
While building and executing unit tests inside the waf framework, waf will take care of setting the needed environment variables, like LD_LIBRARY_PATH, CLASSPATH and PYTHONPATH, depending on the dependencies defined in the wscripts.
But once you install your artefacts you have to take care of setting them (by hand or via LMOD scripts) so your executables will be able to retrieve all the necessary dependencies.
Straight out of waf --help:

So remember: waf --help is your friend!
Did you rerun waf configure on your project level?
When you run waf configure on your project you get at the end something like:
The warning is telling you that in one of your wscripts you are trying to set a dependency on foobar.bazbar.simFooBarBazLib, but this module is not found by waf. Therefore, if this is a vital dependency, further steps of your build or test execution may fail, as this dependency may be missing if it is not brought in in some other way, for example via a system library import mechanism (e.g. setting PYTHONPATH by hand).
The configuration will still succeed for compatibility with older releases of wtools, but this will be removed in the future, as stated by the warning message and described in the referenced ticket.
What you should do is check where the dependency is defined (e.g. by searching with grep for the occurrence of that use) and make sure there isn't, for example, a typo in the dependency name, or that the dependency has not moved (either inside the project structure or to another project that now needs to be imported in your environment). In short, you should make sure the dependency you are claiming to use is indeed satisfied by your setup.
Additionally, it is highly recommended to use the audit command to audit dependencies, as described in the Auditing dependencies section.
While running the tests of a Python module with waf test, you may get a test failure due to missing imports, which may confuse you as the import refers to files in source and not in test code. The output may for example look like:

Where the ImportError may refer to something inside a file contained in the src directory and not a test.
The explanation is two-fold:

If the module uses doctests (enabled by default for all Python modules except pyqt5 ones), the test runner will search, and therefore try to import, also all the Python files under src for tests. If you are not using doctests you may want to disable them by passing with_doctest=False to your module.

The test code automatically gets a use dependency on the source module. Therefore, if your test imports some module from the source, it may still trigger the same problem indirectly if the test then uses the source part with the specific import.

All in all, this behaviour is usually an indication that the module is most likely missing a use dependency (either a dependency on another module in the same project or on an external project) that would, in the worst case, show up once the project is deployed.

It is therefore highly recommended to double-check for missing definitions of use dependencies for the module.
The waf build system tries to optimize the build for speed and by default will spawn a number of parallel tasks equal to the number of CPUs on the machine. So for example if the machine has 8 CPUs, waf will execute up to 8 tasks (compilations, linkings, code generation, documentation and so on) at the same time.
If each of these tasks consumes a lot of memory, running them in parallel may cause the system to run out of memory and start swapping, which is usually much slower and will have a negative impact on the overall build time. A typical case where compilation takes huge amounts of memory is for example pybind11 usage for Python bindings for C++, which is heavily used as of now, for example in CII/MAL. The build system of course cannot know beforehand how many resources a specific task will use.

Should such a situation occur, the user can instruct waf to limit the number of parallel tasks using the -j <value> command line flag, where a value of 1 means that a single operation is done at a time. The optimal value strongly depends on the number of CPUs, the amount of RAM and the type of operations requested of the build system, and can therefore not be universally suggested.
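For example, to limit waf to two parallel tasks:

$ waf test -j 2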
Term | Description |
---|---|
Tool | A Waf extension that usually brings new functionality or support for a new language. |
Target | A file or set of files resulting from the build process. |
Taskgen or task generator | For each target a task generator object is created. Task generators encapsulate the creation of the tasks that are later executed during the build. |
Feature | In the wtools context, a set of tools. In the Waf context, a system for extending the behavior of task generators. |
Command | An action that a user performs during the build process, for example configure. |