Methods
source()
Method used to retrieve the source code from an external origin, for example a remote repository like GitHub using $ git clone, or just a regular download.
For example, “exporting” the source code files, together with the conanfile.py file, can be handy if the source code is not under version control. But if the source code is available in a repository, you can directly get it from there:
from conans import ConanFile

class HelloConan(ConanFile):
    name = "hello"
    version = "0.1"
    settings = "os", "compiler", "build_type", "arch"

    def source(self):
        self.run("git clone https://github.com/conan-io/hello.git")
        # You can also change branch, commit or whatever
        # self.run("cd hello && git checkout 2fe5...")
        #
        # Or using the Git class:
        # git = tools.Git(folder="hello")
        # git.clone("https://github.com/conan-io/hello.git")
This will work, as long as git is in your current path (so on Windows you probably want to run things in msysgit, cmder, etc.). You can also use another VCS or a direct download/unzip. For that purpose, we have provided some helpers, but you can use your own code or origin as well. This is a snippet of the conanfile of the Poco library:
from conans import ConanFile
from conans.tools import download, unzip, check_md5, check_sha1, check_sha256
import os
import shutil

class PocoConan(ConanFile):
    name = "poco"
    version = "1.6.0"

    def source(self):
        zip_name = "poco-1.6.0-release.zip"
        download("https://github.com/pocoproject/poco/archive/poco-1.6.0-release.zip", zip_name)
        # check_md5(zip_name, "51e11f2c02a36689d6ed655b6fff9ec9")
        # check_sha1(zip_name, "8d87812ce591ced8ce3a022beec1df1c8b2fac87")
        # check_sha256(zip_name, "653f983c30974d292de58444626884bee84a2731989ff5a336b93a0fef168d79")
        unzip(zip_name)
        shutil.move("poco-poco-1.6.0-release", "poco")
        os.unlink(zip_name)
The download and unzip utilities can be imported from conan, but you can also use your own code here to retrieve source code from any origin. You can even create packages for pre-compiled libraries you already have, even if you don't have the source code. You can download the binaries, skip the build() method and define your package() and package_info() accordingly.

You can also use check_md5(), check_sha1() and check_sha256() from the tools module to verify that a package is downloaded correctly.
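If it fits your case, the tools.get() helper can often condense this flow, downloading, checking the hash and unzipping in a single call. A minimal sketch, reusing the URL and sha256 from the Poco example above (the extracted folder may still need renaming as shown there):

from conans import ConanFile, tools

class PocoConan(ConanFile):
    name = "poco"
    version = "1.6.0"

    def source(self):
        # download + hash check + unzip in one helper call
        tools.get("https://github.com/pocoproject/poco/archive/poco-1.6.0-release.zip",
                  sha256="653f983c30974d292de58444626884bee84a2731989ff5a336b93a0fef168d79")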
Note
It is very important to recall that the source() method will be executed just once, and the source code will be shared for all the package builds. So it is not a good idea to conditionally use settings or options to make changes or patches on the source code. Maybe the only setting that makes sense is the OS (self.settings.os), if not cross-building, for example to retrieve different sources:
import platform

def source(self):
    if platform.system() == "Windows":
        pass  # download some Win source zip
    else:
        pass  # download sources for *nix systems in a tgz
If you need to patch the source code or build scripts differently for different variants of your packages, you can do it in the build() method, which uses a different folder and source code copy for each variant.
def build(self):
    tools.patch(patch_file="0001-fix.patch")
build()
This method is used to build the source code of the recipe using the desired commands. You can use your command line tools to invoke your build system or any of the build helpers provided with Conan.
def build(self):
    cmake = CMake(self)
    self.run("cmake . %s" % cmake.command_line)
    self.run("cmake --build . %s" % cmake.build_config)
Build helpers
You can use these classes to prepare your build system’s command invocation:
CMake: Prepares the invocation of cmake command with your settings.
AutoToolsBuildEnvironment: If you are using configure/Makefile to build your project you can use this helper. Read more: Building with Autotools.
MSBuild: If you are using the Visual Studio compiler directly to build your project you can use the MSBuild() helper. For lower level control, the VisualStudioBuildEnvironment can also be used.
(Unit) Testing your library
We have seen how to run package tests with conan, but what if we want to run full unit tests on
our library before packaging, so that they are run for every build configuration?
Nothing special is required here. We can just launch the tests from the last command in our build() method:
def build(self):
    cmake = CMake(self)
    cmake.configure()
    cmake.build()
    # here you can run CTest, launch your binaries, etc.
    cmake.test()
package()
The actual creation of the package, once it is built, is done in the package() method. Using the self.copy() method, artifacts are copied from the build folder to the package folder. The syntax of self.copy inside package() is as follows:
self.copy(pattern, dst="", src="", keep_path=True, symlinks=None, excludes=None, ignore_case=True)
Returns: A list with absolute paths of the files copied in the destination folder.
- Parameters:
  - pattern (Required): A pattern following fnmatch syntax of the files you want to copy from the build to the package folders. Typically something like *.lib or *.h.
  - src (Optional, Defaulted to ""): The folder where you want to search the files in the build folder. If you know that your libraries, when you build your package, will be in build/lib, you will typically use build/lib in this parameter. Leaving it empty means the root build folder in the local cache.
  - dst (Optional, Defaulted to ""): Destination folder in the package. It will typically be include for headers, lib for libraries and so on, though you can use any convention you like. Leaving it empty means the root package folder in the local cache.
  - keep_path (Optional, Defaulted to True): Whether to keep the relative path when you copy the files from the src folder to the dst one. Typically headers are packaged with their relative path.
  - symlinks (Optional, Defaulted to None): Set it to True to activate symlink copying, like the typical lib.so -> lib.so.9.
  - excludes (Optional, Defaulted to None): Single pattern or a tuple of patterns to be excluded from the copy. If a file matches both the include and the exclude pattern, it will be excluded.
  - ignore_case (Optional, Defaulted to True): If enabled, it will do a case-insensitive pattern matching.
For example:
self.copy("*.h", "include", "build/include") #keep_path default is True
The final path in the package will be include/mylib/path/header.h, and as include is usually added to the include path, the includes will take the form #include "mylib/path/header.h", which is what is usually desired.
keep_path=False is typically desired for libraries, both static and dynamic, since some compilers, such as MSVC, put them in paths like Debug/x64/MyLib/Mylib.lib. Using this option, we could write:
self.copy("*.lib", "lib", "", keep_path=False)
And it will copy the lib to the package folder lib/Mylib.lib, which can be linked easily.
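Putting these pieces together, a package() method often looks like the following sketch (the src folders are illustrative and depend on your build layout):

def package(self):
    self.copy("license*", dst="licenses", ignore_case=True, keep_path=False)
    self.copy("*.h", dst="include", src="include")  # headers keep their relative path
    self.copy("*.lib", dst="lib", keep_path=False)
    self.copy("*.a", dst="lib", keep_path=False)
    self.copy("*.so*", dst="lib", keep_path=False, symlinks=True)  # keep lib.so -> lib.so.9 links
    self.copy("*.dll", dst="bin", keep_path=False)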
Note
If you are using CMake and you have an install target defined in your CMakeLists.txt, you might be able to reuse it for this package() method. Please check How to reuse cmake install for package() method.
This method copies files from the build/source folder to the package folder depending on two situations:

- Build folder and source folder are the same: Normally during conan create the source folder content is copied to the build folder. In this situation the src parameter of self.copy() will be relative to the build folder in the local cache.
- Build folder is different from source folder: When developing a package recipe and the source and build folders are different (conan package . --source-folder=source --build-folder=build) or when no_copy_source is defined, every self.copy() is internally called twice: one call copies from the source folder (the src parameter of self.copy() will point to the source folder), and the other copies from the build folder (the src parameter of self.copy() will point to the build folder).
package_info()
cpp_info
Each package has to specify certain build information for its consumers. This can be done in the cpp_info attribute within the package_info() method.
The cpp_info attribute has the following properties you can assign/append to:
self.cpp_info.name = "<PKG_NAME>"
self.cpp_info.names["generator_name"] = "<PKG_NAME>"
self.cpp_info.includedirs = ['include'] # Ordered list of include paths
self.cpp_info.libs = [] # The libs to link against
self.cpp_info.system_libs = [] # System libs to link against
self.cpp_info.libdirs = ['lib'] # Directories where libraries can be found
self.cpp_info.resdirs = ['res'] # Directories where resources, data, etc. can be found
self.cpp_info.bindirs = ['bin'] # Directories where executables and shared libs can be found
self.cpp_info.srcdirs = [] # Directories where sources can be found (debugging, reusing sources)
self.cpp_info.build_modules = {} # Build system utility module files
self.cpp_info.defines = [] # preprocessor definitions
self.cpp_info.cflags = [] # pure C flags
self.cpp_info.cxxflags = [] # C++ compilation flags
self.cpp_info.sharedlinkflags = [] # linker flags
self.cpp_info.exelinkflags = [] # linker flags
self.cpp_info.components # Dictionary with the different components a package may have
self.cpp_info.requires = None # List of components from requirements
- name: Alternative name for the package to be used by generators.
- includedirs: List of relative paths (starting from the package root) of directories where headers can be found. By default it is initialized to ['include'], and it is rarely changed.
- libs: Ordered list of libs the client should link against. Empty by default, it is common that different configurations produce different library names. For example:

  def package_info(self):
      if not self.settings.os == "Windows":
          self.cpp_info.libs = ["libzmq-static.a"] if self.options.static else ["libzmq.so"]
      else:
          ...

- libdirs: List of relative paths (starting from the package root) of directories in which to find library object binaries (*.lib, *.a, *.so, *.dylib). By default it is initialized to ['lib'], and it is rarely changed.
- resdirs: List of relative paths (starting from the package root) of directories in which to find resource files (images, xml, etc). By default it is initialized to ['res'], and it is rarely changed.
- bindirs: List of relative paths (starting from the package root) of directories in which to find library runtime binaries (like Windows .dlls). By default it is initialized to ['bin'], and it is rarely changed.
- srcdirs: List of relative paths (starting from the package root) of directories in which to find sources (like .c, .cpp). By default it is empty. It might be used to store sources (for later debugging of packages, or to reuse those sources building them in other packages too).
- build_modules: Dictionary of lists per generator containing relative paths to build system related utility module files created by the package. Used by CMake generators to include .cmake files with functions for consumers, e.g. self.cpp_info.build_modules["cmake_find_package"].append("cmake/myfunctions.cmake"). Those files will be included automatically in cmake/cmake_multi generators when using conan_basic_setup() and will be automatically added in cmake_find_package/cmake_find_package_multi generators when find_package() is used.
- defines: Ordered list of preprocessor directives. It is common that the consumers have to specify some sort of defines in some cases, so that including the library headers matches the binaries.
- system_libs: Ordered list of system libs the consumer should link against. Empty by default.
- cflags, cxxflags, sharedlinkflags, exelinkflags: List of flags that the consumer should activate for proper behavior. Usage of C++11 could be configured here, for example, although it is true that the consumer may want to do some flag processing to check if different dependencies are setting incompatible flags (c++11 after c++14):

  if self.options.static:
      if self.settings.compiler == "Visual Studio":
          self.cpp_info.libs.append("ws2_32")
      self.cpp_info.defines = ["ZMQ_STATIC"]

  if not self.settings.os == "Windows":
      self.cpp_info.cxxflags = ["-pthread"]

  Note that due to the way some build systems, like CMake, manage forward and back slashes, it might be more robust to pass flags for the Visual Studio compiler with a dash instead. Using "/NODEFAULTLIB:MSVCRT", for example, might fail when using CMake targets mode, so the following is preferred and works both in the global and targets mode of CMake:

  def package_info(self):
      self.cpp_info.exelinkflags = ["-NODEFAULTLIB:MSVCRT", "-DEFAULTLIB:LIBCMT"]

- name: Alternative name for the package, so generators can take it into account when generating targets or file names.
- components: [Experimental] Dictionary with names as keys and a component object as value to model the different components a package may have: libraries, executables… Read more about this feature at Using Components.
- requires: [Experimental] List of components from the requirements this package (and its consumers) should link with. It will be used by generators that add support for the components feature (Using Components).
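As an orientation, a package_info() combining several of these attributes might look like this sketch (library, define and flag names are illustrative):

def package_info(self):
    self.cpp_info.libs = ["mylib"]
    if self.settings.os == "Linux":
        self.cpp_info.system_libs = ["pthread", "dl"]
    if not self.options.shared:
        # consumers need this define to match the static binaries
        self.cpp_info.defines = ["MYLIB_STATIC"]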
If your recipe has requirements, you can access the information stored in the cpp_info of your requirements using the deps_cpp_info object:
class OtherConan(ConanFile):
    name = "OtherLib"
    version = "1.0"
    requires = "mylib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_cpp_info["mylib"].libdirs)
Note
Please take into account that defining self.cpp_info.bindirs directories does not have any effect on system paths or the PATH environment variable, nor will those directories be directly accessible by consumers. The self.cpp_info information is translated to build-system information via generators; for example, for CMake it will be a variable in conanbuildinfo.cmake. If you want a package to make its executables accessible to its consumers, you have to specify it with self.env_info as described in env_info.
env_info
Each package can also define some environment variables that are needed to reuse the package. This is especially useful for installer packages, to set the path with the "bin" folder of the packaged application. This can be done in the env_info attribute within the package_info() method.
self.env_info.path.append("ANOTHER VALUE") # Append "ANOTHER VALUE" to the path variable
self.env_info.othervar = "OTHER VALUE" # Assign "OTHER VALUE" to the othervar variable
self.env_info.thirdvar.append("some value") # Every variable can be set or appended a new value
One of the most typical usages of the PATH environment variable would be to add the current binary package directories to the path, so consumers can use those executables easily:
# assuming the binaries are in the "bin" subfolder
self.env_info.PATH.append(os.path.join(self.package_folder, "bin"))
The virtualenv generator will use the self.env_info variables to prepare a script to activate/deactivate a virtual environment. However, this could be directly done using the virtualrunenv generator. They will be automatically applied before calling the consumer conanfile.py methods source(), build(), package() and imports().
If your recipe has requirements, you can access your requirements' env_info as well, using the deps_env_info object:
class OtherConan(ConanFile):
    name = "OtherLib"
    version = "1.0"
    requires = "mylib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_env_info["mylib"].othervar)
user_info
If you need to declare custom variables that are not related to C/C++ (cpp_info) and are not environment variables (env_info), you can use the self.user_info object.
Currently only the cmake, cmake_multi and txt generators support user_info variables.
class MyLibConan(ConanFile):
    name = "mylib"
    version = "1.6.0"
    # ...

    def package_info(self):
        self.user_info.var1 = 2
For the example above, in the cmake and cmake_multi generators, a variable CONAN_USER_MYLIB_var1 will be declared. If your recipe has requirements, you can access your requirements' user_info using the deps_user_info object:
class OtherConan(ConanFile):
    name = "otherlib"
    version = "1.0"
    requires = "mylib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_user_info["mylib"].var1)
Important
Both env_info and user_info objects store information in a "key <-> value" form and the values are always considered strings. This is done for serialization purposes to conanbuildinfo.txt files and to avoid the deserialization of complex structures. It is up to the consumer to convert the string to the expected type:
# In a dependency
self.user_info.jars = "jar1.jar, jar2.jar, jar3.jar"  # Use a string, not a list
...

# In the dependent conanfile
jars = self.deps_user_info["pkg"].jars
jar_list = jars.replace(" ", "").split(",")
set_name(), set_version()
Dynamically define the name and version attributes in the recipe with these methods. The following example defines the package name reading it from a name.txt file, and the version from the branch and commit of the recipe's repository. These functions are executed after assigning the values of name and version if they are provided from the command line.
from conans import ConanFile, tools

class HelloConan(ConanFile):
    def set_name(self):
        # Read the value from 'name.txt' if it is not provided in the command line
        self.name = self.name or tools.load("name.txt")

    def set_version(self):
        git = tools.Git()
        self.version = "%s_%s" % (git.get_branch(), git.get_revision())
The set_name() and set_version() methods should respectively set the self.name and self.version attributes. These methods are only executed when the recipe is in a user folder (export, create and install <path> commands).
The above example uses the current working directory to resolve the relative "name.txt" path and the git repository. That means that "name.txt" should exist in the directory where conan was launched. To define a path relative to the conanfile.py, irrespective of the current working directory, it is necessary to do:
import os
from conans import ConanFile, tools

class HelloConan(ConanFile):
    def set_name(self):
        f = os.path.join(self.recipe_folder, "name.txt")
        self.name = tools.load(f)

    def set_version(self):
        git = tools.Git(folder=self.recipe_folder)
        self.version = "%s_%s" % (git.get_branch(), git.get_revision())
Warning
The set_name() and set_version() methods are alternatives to the name and version attributes. It is not advised or supported to define both a name attribute and a set_name() method. Likewise, it is not advised or supported to define both a version attribute and a set_version() method. If you define both, you may experience unexpected behavior.
See also
See more examples in this howto.
configure(), config_options()
If the package options and settings are related, and you want to configure either, you can do so in the configure() and config_options() methods.
class MyLibConan(ConanFile):
    name = "MyLib"
    version = "2.5"
    settings = "os", "compiler", "build_type", "arch"
    options = {"static": [True, False],
               "header_only": [True, False]}

    def configure(self):
        # If header only, the compiler, etc, does not affect the package!
        if self.options.header_only:
            self.settings.clear()
            self.options.remove("static")
The package has two options: to be compiled as a static (as opposed to shared) library, and to be header-only, not involving any build at all. In the header-only case, the settings that would affect a normal build, and even the other option (static vs shared), do not make sense, so we just clear them. That means that if someone consumes MyLib with the header_only=True option, the package downloaded and used will be the same, irrespective of the OS, compiler or architecture the consumer is building with.
You can also restrict the settings used by deleting any specific one. For example, it is quite common for C libraries to delete compiler.libcxx and compiler.cppstd, as the library does not depend on any C++ standard library:
def configure(self):
    del self.settings.compiler.libcxx
    del self.settings.compiler.cppstd
The most typical usage would be configure(), while config_options() should be used more sparingly. config_options() is used to configure or constrain the available options in a package, before they are given a value, so that trying to assign a value to a removed option raises an error. For example, if a certain library cannot be built as a shared library on Windows, it can be done like this:
def config_options(self):
    if self.settings.os == "Windows":
        del self.options.shared
This will be executed before the actual assignment of options (so such options values cannot be used inside this function), and the command conan install -o pkg:shared=True will raise an exception on Windows saying that shared is not an option for this package.
These methods can also be used to assign values to options, as seen in options. Values assigned in the configure() method cannot be overridden, while values assigned in config_options() can.
Invalid configuration
Conan allows the recipe creator to declare invalid configurations, those that are known not to work with the library being packaged. There is a special kind of exception that can be raised from the configure() method to state this situation: conans.errors.ConanInvalidConfiguration. Here is an example of a recipe for a library that only supports the Windows operating system:
def configure(self):
    if self.settings.os != "Windows":
        raise ConanInvalidConfiguration("Library MyLib is only supported for Windows")
This exception will be propagated and the Conan application will finish with a special return code.
Note
For managing invalid configurations, please check the new experimental validate() method (validate()).
validate()
Warning
This is an experimental feature subject to breaking changes in future releases.
Available since: 1.32.0
The validate() method can be used to mark a binary as "impossible" or invalid for a given configuration. For example, if a given library does not build or work at all on Windows, it can be defined as:
from conans import ConanFile
from conans.errors import ConanInvalidConfiguration

class Pkg(ConanFile):
    settings = "os"

    def validate(self):
        if self.settings.os == "Windows":
            raise ConanInvalidConfiguration("Windows not supported")
If you try to use, consume or build such a package, it will raise an error and return a special exit code:
$ conan create . pkg/0.1@ -s os=Windows
...
Packages
pkg/0.1:INVALID - Invalid
...
> ERROR: There are invalid packages (packages that cannot exist for this configuration):
> pkg/0.1: Invalid ID: Windows not supported
A major difference with configure() is that this information can be queried with the conan info command; for example, this is possible without getting an error:
$ conan export . test/0.1@user/testing
...
> test/0.1@user/testing: Exported revision: ...
$ conan info test/0.1@user/testing
>test/0.1@user/testing
ID: INVALID
BuildID: None
Remote: None
...
Another important difference with the configure() method is that validate() is evaluated after the graph has been computed and the information has been propagated downstream. So the values used in validate() are guaranteed to be final real values, while values at configure() time are not. This might be important, for example, when checking values of options of dependencies:
from conans import ConanFile
from conans.errors import ConanInvalidConfiguration

class Pkg(ConanFile):
    requires = "dep/0.1"

    def validate(self):
        if self.options["dep"].myoption == 2:
            raise ConanInvalidConfiguration("Option 2 of 'dep' not supported")
If a package uses the compatible_packages feature, it should not add configurations that will not be valid to those compatible packages, for example:
from conans import ConanFile
from conans.errors import ConanInvalidConfiguration

class Pkg(ConanFile):
    settings = "os", "build_type"

    def validate(self):
        if self.settings.os == "Windows":
            raise ConanInvalidConfiguration("Windows not supported")

    def package_id(self):
        if self.settings.build_type == "Debug" and self.settings.os != "Windows":
            compatible_pkg = self.info.clone()
            compatible_pkg.settings.build_type = "Release"
            self.compatible_packages.append(compatible_pkg)
Note the self.settings.os != "Windows" check in package_id(). If this is not provided, validate() might still work and raise an error, but in the best case resources will be wasted (compatible packages do more API calls to check them), so it is strongly recommended to properly define the package_id() method to not include incompatible configurations.
requirements()
Besides the requires field, more advanced requirement logic can be defined in the optional requirements() method, using for example values from the package settings or options:
def requirements(self):
    if self.options.myoption:
        self.requires("zlib/1.2@drl/testing")
    else:
        self.requires("opencv/2.2@drl/stable")
This is a powerful mechanism for handling conditional dependencies.
When you are inside the method, each call to self.requires() will add the corresponding requirement to the current list of requirements. It also has optional parameters that allow defining special cases, as shown below:
def requirements(self):
    self.requires("zlib/1.2@drl/testing", private=True, override=False)
self.requires() parameters:

- override (Optional, Defaulted to False): True means that this is not an actual requirement, but something to be passed upstream to override possible existing values.
- private (Optional, Defaulted to False): True means that this requirement will be somewhat embedded and totally hidden. It might be necessary in some extreme cases, like having to use two different versions of the same library (provided that they are totally hidden in a shared library, for example), but it is mostly discouraged otherwise.
Note
To prevent accidental override of transitive dependencies, check the config variable general.error_on_override or the environment variable CONAN_ERROR_ON_OVERRIDE.
build_requirements()
Build requirements are requirements that are only installed and used when the package is built from sources. If there is an existing pre-compiled binary, then the build requirements for this package will not be retrieved.
This method is useful for defining conditional build requirements, for example:
class MyPkg(ConanFile):
    def build_requirements(self):
        if self.settings.os == "Windows":
            self.build_requires("tool_win/0.1@user/stable")
See also
Read more about this in the build requirements section.
system_requirements()
It is possible to install system-wide packages from Conan. Just add a system_requirements() method to your conanfile and specify what you need there. For special use cases you can also use the conans.tools.os_info object to detect the operating system, version and distribution (Linux):
- os_info.is_linux: True if Linux.
- os_info.is_windows: True if Windows.
- os_info.is_macos: True if macOS.
- os_info.is_freebsd: True if FreeBSD.
- os_info.is_solaris: True if SunOS.
- os_info.os_version: OS version.
- os_info.os_version_name: Common name of the OS (Windows 7, Mountain Lion, Wheezy…).
- os_info.linux_distro: Linux distribution name (None if not Linux).
- os_info.bash_path: Returns the absolute path to a bash in the system.
- os_info.uname(options=None): Runs the "uname" command and returns the output. You can pass arguments with the options parameter.
- os_info.detect_windows_subsystem(): Returns "MSYS", "MSYS2", "CYGWIN" or "WSL" if any of these Windows subsystems are detected.
Warning
The values returned by some of these variables (linux_distro, os_version and os_version_name) use the external dependency distro; values returned might be different from one version to another, so please check its changelog for bugfixes and new features.
You can also use the SystemPackageTool class, which will automatically invoke the right system package tool: apt, yum, dnf, pkg, pkgutil, brew or pacman, depending on the system it is running on.
from conans.tools import os_info, SystemPackageTool

def system_requirements(self):
    pack_name = None
    if os_info.linux_distro == "ubuntu":
        if os_info.os_version > "12":
            pack_name = "package_name_in_ubuntu_10"
        else:
            pack_name = "package_name_in_ubuntu_12"
    elif os_info.linux_distro == "fedora" or os_info.linux_distro == "centos":
        pack_name = "package_name_in_fedora_and_centos"
    elif os_info.is_macos:
        pack_name = "package_name_in_macos"
    elif os_info.is_freebsd:
        pack_name = "package_name_in_freebsd"
    elif os_info.is_solaris:
        pack_name = "package_name_in_solaris"
    if pack_name:
        installer = SystemPackageTool()
        # Install the package; it will update the package database if pack_name isn't already installed
        installer.install(pack_name)
On Windows there is no standard package manager, but choco can be invoked as an option:
from conans.tools import os_info, SystemPackageTool, ChocolateyTool

def system_requirements(self):
    if os_info.is_windows:
        pack_name = "package_name_in_windows"
        # Invoke the choco package manager to install the package
        installer = SystemPackageTool(tool=ChocolateyTool())
        installer.install(pack_name)
SystemPackageTool
def SystemPackageTool(runner=None, os_info=None, tool=None, recommends=False, output=None, conanfile=None, default_mode="enabled")
Available tool classes: AptTool, YumTool, DnfTool, BrewTool, PkgTool, PkgUtilTool, ChocolateyTool, PacManTool.
- Methods:
  - add_repository(repository, repo_key=None): Adds the repository address to your current repo list.
  - update(): Updates the system package manager database. It's called automatically from the install() method by default.
  - install(packages, update=True, force=False): Installs the packages (could be a list or a string). If update is True it will execute update() first, if needed. The packages won't be installed if they are already installed, unless the force parameter is set to True. If packages is a list, the first available package will be picked (short-circuit, like a logical or). Note: this list of packages is intended for providing alternative names for the same package, to account for small variations of the name in different distros. To install different packages, one call to install() per package is necessary.
  - install_packages(packages, update=True, force=False, arch_names=None): Installs all packages (could be a list or a string). If update is True it will execute update() first, if needed. The packages won't be installed if they are already installed, unless the force parameter is set to True. If packages has a nested list or tuple, the first available package will be picked (short-circuit, like a logical or).
  - installed(package_name): Verifies if package_name is actually installed. It returns True if it is installed, otherwise False.
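A sketch combining these methods (the repository line, key URL and package names are hypothetical):

from conans.tools import SystemPackageTool

def system_requirements(self):
    installer = SystemPackageTool()
    # hypothetical apt repository and signing key
    installer.add_repository("deb http://example.com/apt stable main",
                             repo_key="http://example.com/key.asc")
    if not installer.installed("libfoo-dev"):
        # alternative names for the same package: the first available one is installed
        installer.install(["libfoo-dev", "libfoo-devel"])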
The use of sudo in the internals of the install() and update() methods is controlled by the CONAN_SYSREQUIRES_SUDO environment variable, so if the users don't need sudo permissions, it is easy to opt in/out.

When the environment variable CONAN_SYSREQUIRES_SUDO is not defined, Conan will try to use sudo if the following conditions are met:

- sudo is available in the PATH.
- The platform name is posix and the UID (user id) is not 0.
Also, when the environment variable CONAN_SYSREQUIRES_MODE is not defined, Conan will work as if its value was enabled, unless you pass the default_mode argument to the constructor of SystemPackageTool. In that case, it will work as if CONAN_SYSREQUIRES_MODE had been defined to that value. If CONAN_SYSREQUIRES_MODE is defined, it takes preference and the default_mode parameter has no effect. This can be useful when a recipe has system requirements, but we don't want to automatically install them if the user has not defined CONAN_SYSREQUIRES_MODE; instead we warn them about the missing requirements and allow them to install the requirements themselves.
Conan will keep track of the execution of this method, so that it is not invoked again and again at every Conan command. The execution is done per package, since some packages of the same library might have different system dependencies. If you are sure that all your binary packages have the same system requirements, just add the following line to your method:
def system_requirements(self):
    self.global_system_requirements = True
    if ...
To install multi-arch packages, it is possible to pass the desired architecture manually, according to your package manager:
name = "foobar"
platforms = {"x86_64": "amd64", "x86": "i386"}
installer = SystemPackageTool(tool=AptTool())
installer.install("%s:%s" % (name, platforms[self.settings.arch]))
However, this requires boilerplate that can be solved automatically by your settings, by passing your ConanFile instance:
installer = SystemPackageTool(conanfile=self)
installer.install(name)
The SystemPackageTool is adapted to support possible prefixes and suffixes, according to the instance of the package manager. It validates whether your current settings are configured for cross-building, and if so, it will update the package name to be installed according to self.settings.arch.
To install more than one package at once:
def system_requirements(self):
    packages = [("vim", "nano", "emacs"), "firefox", "chromium"]
    installer = SystemPackageTool()
    installer.install_packages(packages)
    # e.g. apt-get install -y --no-recommends vim firefox chromium
The install_packages() call will install the first text editor available (only one), following the tuple order, while it will install both web browsers.
imports()
Importing files copies files from the local store to your project. This feature is handy for copying shared libraries (dylib in Mac, dll in Win) to the directory of your executable, so that you don’t have to mess with your PATH to run them. But there are other use cases:
- Copy an executable to your project, so that it can be easily run. A good example is Google's protobuf code generator.
- Copy package data to your project, like configuration, images, sounds… A good example is the OpenCV demo, in which face detection XML pattern files are required.

Importing files is also very convenient in order to redistribute your application, as many times you will just have to bundle your project's bin folder.
A typical imports() method for shared libs could be:
def imports(self):
    self.copy("*.dll", "", "bin")
    self.copy("*.dylib", "", "lib")
The self.copy() method inside imports() supports the following arguments:
def copy(pattern, dst="", src="", root_package=None, folder=False, ignore_case=True, excludes=None, keep_path=True)
- Parameters:
  - pattern (Required): An fnmatch file pattern of the files that should be copied.
  - dst (Optional, Defaulted to ""): Destination local folder, with reference to the current directory, to which the files will be copied.
  - src (Optional, Defaulted to ""): Source folder in which those files will be searched. This folder will be stripped from the dst parameter. E.g., lib/Debug/x86. It accepts symbolic folder names like @bindirs and @libdirs, which will map to the self.cpp_info.bindirs and self.cpp_info.libdirs of the source package, instead of a hardcoded name.
  - root_package (Optional, Defaulted to all packages in deps): An fnmatch pattern of the package name ("OpenCV", "Boost") from which files will be copied.
  - folder (Optional, Defaulted to False): If enabled, it will copy the files from the local cache to a subfolder named as the package containing the files. Useful to avoid conflicting imports of files with the same name (e.g. License).
  - ignore_case (Optional, Defaulted to True): If enabled, it will do a case-insensitive pattern matching.
  - excludes (Optional, Defaulted to None): Allows defining a list of patterns (even a single pattern) to be excluded from the copy, even if they match the main pattern.
  - keep_path (Optional, Defaulted to True): Whether to keep the relative path when you copy the files from the src folder to the dst one. Useful to ignore (keep_path=False) the path of library.dll files in the package they are imported from.
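For example, a sketch combining root_package and excludes to import files from a single dependency (package and file names are illustrative):

def imports(self):
    # only import dlls coming from the "opencv" dependency, skipping debug variants
    self.copy("*.dll", dst="bin", src="bin", root_package="opencv", excludes="*d.dll")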
Example to collect license files from dependencies:
def imports(self):
    self.copy("license*", dst="licenses", folder=True, ignore_case=True)
If you want to be able to customize the output user directory to work with both the cmake and cmake_multi generators, then you can do:
def imports(self):
    dest = os.getenv("CONAN_IMPORT_PATH", "bin")
    self.copy("*.dll", dst=dest, src="bin")
    self.copy("*.dylib*", dst=dest, src="lib")
And then use, for example: conan install . -e CONAN_IMPORT_PATH=Release -g cmake_multi
To import files from packages that have different layouts, for example a package that uses a folder libraries instead of lib, or to import files from packages that could be in editable mode, a symbolic src argument can be provided:
def imports(self):
    self.copy("*", src="@bindirs", dst="bin")
    self.copy("*", src="@libdirs", dst="lib")
This will import all files from all the dependencies' self.cpp_info.bindirs folders to the local "bin" folder, and all files from the dependencies' self.cpp_info.libdirs folders to the local "lib" folder. This includes packages that are in editable mode and declare [libdirs] and [bindirs] in their editable layouts.
When a conanfile recipe has an imports() method and it builds from sources, it will do the following:

- Before running build(), it will execute imports() in the build folder, copying dependencies' artifacts.
- Run the build() method, which could use such imported binaries.
- Remove the copied (imported) artifacts after build() is finished.
You can use the keep_imports attribute to keep the imported artifacts, and maybe repackage them.
package_id()
Creates a unique ID for the package. The default package ID is calculated using the settings, options and requires properties. When a package creator specifies values for any of those properties, it is saying that any change in their values will require a different binary package.
However, sometimes a package creator needs to alter the default behavior, for example, to have only one binary package for several different compiler versions. In that case you can set a custom self.info object implementing this method, and the package ID will be computed with the given information:
def package_id(self):
    v = Version(str(self.settings.compiler.version))
    if self.settings.compiler == "gcc" and (v >= "4.5" and v < "5.0"):
        self.info.settings.compiler.version = "GCC 4 between 4.5 and 5.0"
Please, check the section Defining Package ABI Compatibility to get more details.
self.info
The self.info object stores the information that will be used to compute the package ID. It can be manipulated to reflect the information you want in the computation of the package ID. For example, you can delete any setting or option:
def package_id(self):
    del self.info.settings.compiler
    del self.info.options.shared
self.info.header_only()
The package will always be the same, irrespective of the settings (OS, compiler or architecture), options and dependencies.
def package_id(self):
    self.info.header_only()
self.info.vs_toolset_compatible() / self.info.vs_toolset_incompatible()
By default (vs_toolset_compatible() mode) Conan will generate the same binary package when the compiler is Visual Studio and the compiler.toolset matches the specified compiler.version:

def package_id(self):
    self.info.vs_toolset_compatible()
    # self.info.vs_toolset_incompatible()

For example, if we install some packages specifying the following settings:

compiler="Visual Studio"
compiler.version=14

And then we install again specifying these settings:

compiler="Visual Studio"
compiler.version=15
compiler.toolset=v140

The compiler version is different, but Conan will not install a different package, because the toolchain used in both cases is considered the same. You can deactivate this default behavior by calling self.info.vs_toolset_incompatible().
This is the relation of Visual Studio versions and their compatible toolset:

Visual Studio Version | Compatible toolset
---|---
15 | v141
14 | v140
12 | v120
11 | v110
10 | v100
9 | v90
8 | v80
self.info.discard_build_settings() / self.info.include_build_settings()
By default (discard_build_settings()) Conan will generate the same binary when you change os_build or arch_build, when os and arch are declared respectively. This is because os_build represents the machine running Conan, so, for the consumer, the only setting that matters is where the built software will run, not where the compilation runs. The same applies to arch_build.

With self.info.include_build_settings(), Conan will generate different packages when you change os_build or arch_build.
def package_id(self):
    self.info.discard_build_settings()
    # self.info.include_build_settings()
self.info.default_std_matching() / self.info.default_std_non_matching()
By default (default_std_matching()) Conan will detect the default C++ standard of your compiler so as not to generate different binary packages. For example, you have already built some gcc 6.1 packages, where the default std is gnu14. If you specify a value for the setting compiler.cppstd equal to the default one, gnu14, Conan won't generate new packages, because it was already the default of your compiler.

With self.info.default_std_non_matching(), Conan will generate different packages when you specify compiler.cppstd, even if it matches the default of the compiler being used:
def package_id(self):
    self.info.default_std_non_matching()
    # self.info.default_std_matching()
The same behavior applies if you use the deprecated setting cppstd.
Compatible packages
The package_id() method serves to define the "canonical" binary package ID, the identifier of the binary that corresponds to the input configuration of settings and options. This canonical binary package ID will always be computed, and Conan will check for its existence to be downloaded and installed.
If the binary of that package ID is not found, Conan lets the recipe writer define an ordered list of compatible package IDs, of other configurations that should be binary compatible and can be used as a fallback. The syntax to do this is:
from conans import ConanFile

class Pkg(ConanFile):
    settings = "os", "compiler", "arch", "build_type"

    def package_id(self):
        if self.settings.compiler == "gcc" and self.settings.compiler.version == "4.9":
            compatible_pkg = self.info.clone()
            compatible_pkg.settings.compiler.version = "4.8"
            self.compatible_packages.append(compatible_pkg)
This defines that, if we try to install this package with gcc 4.9 and there isn't a binary available for that configuration, Conan will check if there is one available built with gcc 4.8 and use it. But not the other way round.
See also
For more information about compatible packages, read this.
build_id()
In the general case, there is one build folder for each binary package, with the exact same hash/ID as the package. However, this behavior can be changed; there are a couple of scenarios in which this might be interesting:
You have a build script that generates several different configurations at once, like both debug and release artifacts, but you actually want to package and consume them separately. Same for different architectures or any other setting.
You build just one configuration (like release), but you want to create different binary packages for different consuming cases. For example, if you have created tests for the library in the build step, you might want to create two packages: one just containing the library for general usage, and another one also containing the tests. First package could be used as a reference and the other one as a tool to debug errors.
In both cases, if using different settings, the system will build the same binaries twice (or more times), just to produce a different final binary package. With the build_id() method this logic can be changed. build_id() will create a new package ID/hash for the build folder, and you can define the logic you want in it. For example:
settings = "os", "compiler", "arch", "build_type"
def build_id(self):
self.info_build.settings.build_type = "Any"
So this recipe will generate a different final package for each debug/release configuration. But as build_id() will generate the same ID for any build_type, just one folder and one build will be done. Such a build should build both debug and release artifacts, and then the package() method should package them according to the self.settings.build_type value. Different builds will still be executed if using different compilers or architectures. This method is basically an optimization of build time, avoiding multiple re-builds.
Other information like custom package options can also be changed:
def build_id(self):
    self.info_build.options.myoption = 'MyValue'  # any value possible
    self.info_build.options.fullsource = 'Always'
If the build_id() method does not modify the build_id and produce one different from the package_id, then the standard behavior will be applied. Consider the following:
settings = "os", "compiler", "arch", "build_type"
def build_id(self):
if self.settings.os == "Windows":
self.info_build.settings.build_type = "Any"
This will only produce a different build ID if the package is for Windows. The behavior on any other OS will be the standard one, as if the build_id() method was not defined: the build folder will be wiped at each conan create command and a clean build will be done.
deploy()
This method can be used in a conanfile.py to install artifacts from packages in the system or user folder.
def deploy(self):
    self.copy("*.exe")       # copy from current package
    self.copy_deps("*.dll")  # copy from dependencies
Where:

- self.copy() is the self.copy() method executed inside the package() method.
- self.copy_deps() is the same as the self.copy() method inside the imports() method.
Both methods allow the definition of absolute paths (to install in the system) in the dst argument. By default, the dst destination folder will be the current one.
The deploy() method is designed to work on a package that is installed directly from its reference, as in:
$ conan install pkg/0.1@user/channel
> ...
> pkg/0.1@user/testing deploy(): Copied 1 '.dll' files: mylib.dll
> pkg/0.1@user/testing deploy(): Copied 1 '.exe' files: myexe.exe
All other packages and dependencies, even transitive dependencies of pkg/0.1@user/testing, will not be deployed; it is the responsibility of the installed package to deploy what it needs from its dependencies.
See also
For a different approach to deploy package files in the user space folders, check the deploy generator.
init()
This is an optional method for initializing conanfile values, designed for inheritance from python requires. Assuming we have a base/1.1@user/testing recipe:
class MyConanfileBase(object):
    license = "MyLicense"
    settings = "os",  # tuple!

class PyReq(ConanFile):
    name = "base"
    version = "1.1"
We could reuse and inherit from it with:
class PkgTest(ConanFile):
    license = "MIT"
    settings = "arch",  # tuple!
    python_requires = "base/1.1@user/testing"
    python_requires_extend = "base.MyConanfileBase"

    def init(self):
        base = self.python_requires["base"].module.MyConanfileBase
        self.settings = base.settings + self.settings  # Note: adding 2 tuples = tuple
        self.license = base.license  # License is overwritten
The final PkgTest conanfile will have both os and arch as settings, and MyLicense as license.
This method can also be useful if you need to unconditionally initialize class attributes like license or description, or any other attributes, from datafiles other than conandata.yml. For example, you have a json file containing the information about the license, description and author for the library:
{"license": "MIT", "description": "This is my awesome library.", "author": "Me"}
Then, you can load that information from the init() method:
import os
import json
from conans import ConanFile, load

class Lib(ConanFile):
    exports = "data.json"

    def init(self):
        data = load(os.path.join(self.recipe_folder, "data.json"))
        d = json.loads(data)
        self.license = d["license"]
        self.description = d["description"]
        self.author = d["author"]
export()
Equivalent to the exports attribute, but in method form. It supports self.copy() to do pattern-based copying of files from the local user folder (the folder containing the conanfile.py) to the cache export_folder:
from conans import ConanFile

class Pkg(ConanFile):
    def export(self):
        self.copy("LICENSE.md")
The current folder (os.getcwd()) and the self.export_folder can be used in the method:
import os
from conans import ConanFile
from conans.tools import save, load

class Pkg(ConanFile):
    def export(self):
        # we can load files in the user folder
        content = load(os.path.join(os.getcwd(), "data.txt"))
        # We have access to the cache export_folder
        save(os.path.join(self.export_folder, "myfile.txt"), "some content")
self.copy supports src and dst subfolder arguments. The src is relative to the current folder (the one containing the conanfile.py). The dst is relative to the cache export_folder.
from conans import ConanFile

class Pkg(ConanFile):
    def export(self):
        self.output.info("Executing export() method")
        # will copy all .txt files from the local "mysubfolder" folder to the cache "mydata" one
        self.copy("*.txt", src="mysubfolder", dst="mydata")
export_sources()
Equivalent to the exports_sources attribute, but in method form. It supports self.copy() to do pattern-based copying of files from the local user folder (the folder containing the conanfile.py) to the cache export_sources_folder:
from conans import ConanFile

class Pkg(ConanFile):
    def export_sources(self):
        self.copy("LICENSE.md")
The current folder (os.getcwd()) and the self.export_sources_folder can be used in the method:
import os
from conans import ConanFile
from conans.tools import save, load

class Pkg(ConanFile):
    def export_sources(self):
        content = load(os.path.join(os.getcwd(), "data.txt"))
        save(os.path.join(self.export_sources_folder, "myfile.txt"), content)
self.copy supports src and dst subfolder arguments. The src is relative to the current folder (the one containing the conanfile.py). The dst is relative to the cache export_sources_folder.
from conans import ConanFile

class Pkg(ConanFile):
    def export_sources(self):
        self.output.info("Executing export_sources() method")
        # will copy all .txt files from the local "mysubfolder" folder to the cache "mydata" one
        self.copy("*.txt", src="mysubfolder", dst="mydata")
generate()
Warning
This is an experimental feature subject to breaking changes in future releases.
Available since: 1.32.0
This method runs after the computation and installation of the dependency graph. This means that it will run after a conan install command, or, when a package is being built in the cache, it will run before calling the build() method.
The purpose of generate() is to prepare the build, generating the necessary files. These files would typically be:

- Files containing information to locate the dependencies, such as xxxx-config.cmake CMake config scripts or xxxx.props Visual Studio property files.
- Environment activation scripts, like conanbuildenv.bat or conanbuildenv.sh, that define all the environment variables necessary for the build.
- Toolchain files, like conantoolchain.cmake, that contain a mapping between the current Conan settings and options and the build system specific syntax.
- General purpose build information, such as a conanbuild.conf file that could contain information like the CMake generator or CMake toolchain file to be used in the build() method.
- Specific build system files, like conanvcvars.bat, that contain the necessary Visual Studio vcvars.bat call for certain build systems like Ninja when compiling with the Microsoft compiler.
The idea is that the generate() method implements all the necessary logic, making both manual user builds after a conan install very straightforward, and the build() method logic simpler. The build produced by a user in their local flow should be exactly the same as the build done in the cache with a conan create, without effort.
In many cases, the generate() method might not be necessary, and declaring the generators attribute could be enough:
from conans import ConanFile

class Pkg(ConanFile):
    generators = "CMakeDeps", "CMakeToolchain"
But the generate() method can explicitly instantiate those generators, customize them, or provide completely custom generation. For custom integrations, putting code in a common python_requires would be a good way to avoid repetition in multiple recipes:
from conans import ConanFile
from conan.tools.cmake import CMakeToolchain

class Pkg(ConanFile):
    def generate(self):
        tc = CMakeToolchain(self)
        # customize toolchain "tc"
        tc.generate()
        # Or provide your own custom logic
layout()
Warning
This is an experimental feature subject to breaking changes in future releases.
The layout() feature will be fully functional only in the new build system integrations (in the conan.tools space). If you are using other integrations, they might not fully support this feature.
Available since: 1.37.0
Read about the feature here.
In the layout() method you can adjust self.folders, self.cpp and self.patterns.
self.folders
- self.folders.source (Defaulted to ""): Specifies a subfolder where the sources are. The self.source_folder attribute inside the source(self) and build(self) methods will be set with this subfolder. But the current working directory in the source(self) method will not include this subfolder, because it is intended to describe where the sources are after downloading them (zip, git…), not to force where the sources should be. Likewise, the export_sources, exports and scm sources will be copied to the root source directory, the self.folders.source variable being the way to describe whether the fetched sources are still in a subfolder. It is used in the cache when running conan create (relative to the cache source folder) as well as in a local folder when running conan build (relative to the local current folder).
- self.folders.build (Defaulted to ""): Specifies a subfolder where the files from the build are. The self.build_folder attribute and the current working directory inside the build(self) method will be set with this subfolder. It is used in the cache when running conan create (relative to the cache source folder) as well as in a local folder when running conan build (relative to the local current folder).
- self.folders.generators (Defaulted to ""): Specifies a subfolder where to write the files from the generators and the toolchains. In the cache, when running conan create, this subfolder will be relative to the root build folder, and when running the conan install command it will be relative to the current working directory.
- self.folders.imports (Defaulted to ""): Specifies a subfolder where to write the files copied when using the imports(self) method in a conanfile.py. In the cache, when running conan create, this subfolder will be relative to the root build folder, and when running the conan imports command it will be relative to the current working directory.
- self.folders.package (Defaulted to ""): Specifies a subfolder where to write the package files when running the conan package command. It is relative to the current working directory. This folder will not affect the package layout in the cache.
self.cpp
The layout() method allows declaring cpp_info objects not only for the final package (like the classic approach with self.cpp_info in the package_info(self) method) but also for the self.source_folder and self.build_folder. The fields of the cpp_info objects at self.cpp.build and self.cpp.source are the same described here. Components are also supported.
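As a sketch, assuming the headers live in an include subfolder of the sources and the build outputs libraries into a lib subfolder:

def layout(self):
    self.folders.build = "build"
    # headers are usable straight from the sources
    self.cpp.source.includedirs = ["include"]
    # assuming the build places libraries in build/lib
    self.cpp.build.libdirs = ["lib"]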
self.patterns
You can fill the self.patterns.source and self.patterns.build objects, describing the patterns of the files that are at self.folders.source and self.folders.build, to automate the package(self) method with the LayoutPackager() tool.

The defaults are the following, but you can customize anything based on the configuration (self.settings, self.options…):
self.patterns.source.include = ["*.h", "*.hpp", "*.hxx"]
self.patterns.source.lib = []
self.patterns.source.bin = []
self.patterns.build.include = ["*.h", "*.hpp", "*.hxx"]
self.patterns.build.lib = ["*.so", "*.so.*", "*.a", "*.lib", "*.dylib"]
self.patterns.build.bin = ["*.exe", "*.dll"]
These are all the fields that can be adjusted, both in self.patterns.source and self.patterns.build:
NAME | DESCRIPTION (xxx can be either source or build)
---|---
include | Patterns of the files from the folders: self.cpp.xxx.includedirs
lib | Patterns of the files from the folders: self.cpp.xxx.libdirs
bin | Patterns of the files from the folders: self.cpp.xxx.bindirs
src | Patterns of the files from the folders: self.cpp.xxx.srcdirs
build | Patterns of the files from the folders: self.cpp.xxx.builddirs
res | Patterns of the files from the folders: self.cpp.xxx.resdirs
framework | Patterns of the files from the folders: self.cpp.xxx.frameworkdirs
test()
The test() method is only used in test_package/conanfile.py recipes. It executes immediately after build() has been called, and its goal is to run some executable or tests on binaries to prove the package is correctly created. Note that it is intended as a test of the package: the headers are there, the libraries are there, it is possible to link, etc., but not to run unit, integration or functional tests. It usually takes the form:
def test(self):
    if not tools.cross_building(self):
        self.run(os.path.sep.join([".", "bin", "example"]))
Note the tools.cross_building() check: it is not possible to run executables built for an architecture different from the build machine's. In that case, it would make sense to check the existence of the binary, or inspect it with tools like dumpbin, lipo, etc. to do basic checks about it. The self.run() call might need some environment help, for example if the executable needs to locate shared libraries.