Methods
source()
Method used to retrieve the source code from an external origin, such as a GitHub repository (via $ git clone) or a regular download.
For example, “exporting” the source code files, together with the conanfile.py file, can be handy if the source code is not under version control. But if the source code is available in a repository, you can directly get it from there:
from conans import ConanFile

class HelloConan(ConanFile):
    name = "Hello"
    version = "0.1"
    settings = "os", "compiler", "build_type", "arch"

    def source(self):
        self.run("git clone https://github.com/memsharded/hello.git")
        # You can also change branch, commit or whatever
        # self.run("cd hello && git checkout 2fe5...")
        #
        # Or using the Git class:
        # git = tools.Git(folder="hello")
        # git.clone("https://github.com/memsharded/hello.git", "static_shared")
This will work as long as git is in your current PATH (so on Windows you probably want to run things in msysgit, cmder, etc.). You can also use another VCS or a direct download/unzip. For that purpose, we have provided some helpers, but you can use your own code or origin as well. This is a snippet of the conanfile of the Poco library:
from conans import ConanFile
from conans.tools import download, unzip, check_md5, check_sha1, check_sha256
import os
import shutil

class PocoConan(ConanFile):
    name = "Poco"
    version = "1.6.0"

    def source(self):
        zip_name = "poco-1.6.0-release.zip"
        download("https://github.com/pocoproject/poco/archive/poco-1.6.0-release.zip", zip_name)
        # check_md5(zip_name, "51e11f2c02a36689d6ed655b6fff9ec9")
        # check_sha1(zip_name, "8d87812ce591ced8ce3a022beec1df1c8b2fac87")
        # check_sha256(zip_name, "653f983c30974d292de58444626884bee84a2731989ff5a336b93a0fef168d79")
        unzip(zip_name)
        shutil.move("poco-poco-1.6.0-release", "poco")
        os.unlink(zip_name)
The download and unzip utilities can be imported from conans.tools, but you can also use your own code here to retrieve source code from any origin. You can even create packages for pre-compiled libraries you already have, even if you don't have the source code: download the binaries, skip the build() method and define your package() and package_info() accordingly.

You can also use check_md5(), check_sha1() and check_sha256() from the tools module to verify that a download is correct.
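The checksum helpers exist to catch corrupted or tampered downloads before extraction. A minimal stdlib-only sketch of this verify-then-extract flow (verify_sha256 is an illustrative stand-in, not Conan's actual implementation):

```python
import hashlib
import io
import zipfile

def verify_sha256(data, expected):
    """Raise if the payload's SHA-256 digest does not match the pinned value."""
    digest = hashlib.sha256(data).hexdigest()
    if digest != expected:
        raise ValueError("sha256 mismatch: got %s, expected %s" % (digest, expected))

# A tiny in-memory zip stands in for a downloaded release archive
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("hello/hello.h", "#pragma once\n")
payload = buf.getvalue()

# In a real recipe the expected digest is a hardcoded constant
pinned = hashlib.sha256(payload).hexdigest()
verify_sha256(payload, pinned)  # passes silently

# Only after verification is the archive extracted, as unzip() would do
with zipfile.ZipFile(io.BytesIO(payload)) as zf:
    names = zf.namelist()
print(names)  # ['hello/hello.h']
```

A wrong pin makes verify_sha256 raise, aborting the source() step before any corrupted file reaches the build.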
Note
It is very important to note that the source() method will be executed just once, and the resulting source code will be shared by all the package builds. So it is not a good idea to use settings or options to conditionally apply changes or patches to the source code. Perhaps the only setting that makes sense here is the OS (self.settings.os), when not cross-building, for example to retrieve different sources:
def source(self):
    if platform.system() == "Windows":
        # download some Win source zip
        pass
    else:
        # download sources for *nix systems in a tgz
        pass
If you need to patch the source code or build scripts differently for different variants of your packages, you can do it in the build() method, which uses a different folder and source code copy for each variant.
build()
This method is used to build the source code of the recipe using the desired commands. You can use your command line tools to invoke your build system or any of the build helpers provided with Conan.
def build(self):
    cmake = CMake(self)
    self.run("cmake . %s" % (cmake.command_line))
    self.run("cmake --build . %s" % cmake.build_config)
Build helpers
You can use these classes to prepare your build system’s command invocation:
CMake: Prepares the invocation of cmake command with your settings.
AutoToolsBuildEnvironment: If you are using configure/Makefile to build your project you can use this helper. Read more: Building with Autotools.
MSBuild: If you are using the Visual Studio compiler directly to build your project you can use the MSBuild() helper. For lower-level control, the VisualStudioBuildEnvironment can also be used.
(Unit) Testing your library
We have seen how to run package tests with Conan, but what if we want to run full unit tests on our library before packaging, so that they are run for every build configuration? Nothing special is required here. We can just launch the tests from the last command in our build() method:
def build(self):
    cmake = CMake(self)
    cmake.configure()
    cmake.build()
    # here you can run CTest, launch your binaries, etc
    cmake.test()
package()
The actual creation of the package, once it is built, is done in the package() method. Using the self.copy() method, artifacts are copied from the build folder to the package folder.

The syntax of self.copy() inside package() is as follows:
self.copy(pattern, dst="", src="", keep_path=True, symlinks=None, excludes=None, ignore_case=False)
- Parameters:
  - pattern (Required): A pattern following fnmatch syntax of the files you want to copy, from the build to the package folders. Typically something like *.lib or *.h.
  - src (Optional, Defaulted to ""): The folder in which to search for the files inside the build folder. If you know that your libraries will be in build/lib when you build your package, you will typically use build/lib in this parameter. Leaving it empty means the root build folder in the local cache.
  - dst (Optional, Defaulted to ""): Destination folder in the package. It will typically be include for headers, lib for libraries and so on, though you can use any convention you like. Leaving it empty means the root package folder in the local cache.
  - keep_path (Optional, Defaulted to True): Whether to keep the relative path when you copy the files from the src folder to the dst one. Typically headers are packaged with their relative path.
  - symlinks (Optional, Defaulted to None): Set it to True to activate symlink copying, like the typical lib.so -> lib.so.9.
  - excludes (Optional, Defaulted to None): Single pattern or a tuple of patterns to be excluded from the copy. If a file matches both the include and the exclude pattern, it will be excluded.
  - ignore_case (Optional, Defaulted to False): If enabled, it will do a case-insensitive pattern matching.
For example:
self.copy("*.h", "include", "build/include") #keep_path default is True
The final path in the package will be include/mylib/path/header.h, and as the include folder is usually added to the include path, the includes will take the form #include "mylib/path/header.h", which is usually what is desired.

keep_path=False is typically desired for libraries, both static and dynamic. Some compilers, such as MSVC, put them in paths like Debug/x64/MyLib/MyLib.lib. Using this option, we could write:
self.copy("*.lib", "lib", "", keep_path=False)
And it will copy the lib to the package folder lib/Mylib.lib, which can be linked easily.
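The interplay of pattern and keep_path described above can be sketched with the stdlib fnmatch module (plan_copy is an illustrative helper, not Conan's implementation; paths are relative to the src folder):

```python
import fnmatch
import os

def plan_copy(files, pattern, dst, keep_path=True):
    """Return (source, destination) pairs the way self.copy() lays files out."""
    plan = []
    for path in files:
        if fnmatch.fnmatch(path, pattern):
            # keep_path=True preserves the relative directory structure,
            # keep_path=False flattens everything directly into dst
            rel = path if keep_path else os.path.basename(path)
            plan.append((path, os.path.join(dst, rel)))
    return plan

files = ["mylib/path/header.h", "Debug/x64/MyLib/MyLib.lib"]

headers = plan_copy(files, "*.h", "include")
libs = plan_copy(files, "*.lib", "lib", keep_path=False)

print(headers)  # the header keeps its relative path under include/
print(libs)     # the .lib lands flat at lib/MyLib.lib
```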
Note
If you are using CMake and you have an install target defined in your CMakeLists.txt, you might be able to reuse it for this package() method. Please check How to reuse cmake install for package() method.
This method copies files from the build/source folder to the package folder, depending on two situations:
- Build folder and source folder are the same: Normally, during conan create, the source folder content is copied to the build folder. In this situation, the src parameter of self.copy() will point to the build folder in the local cache.
- Build folder is different from source folder: When developing a package recipe with separate source and build folders (conan package . --source-folder=source --build-folder=build), or when no_copy_source is defined, the package() method is called twice: once copying from the source folder (the src parameter of self.copy() points to the source folder), and once copying from the build folder (the src parameter of self.copy() points to the build folder).
package_info()
cpp_info
Each package has to specify certain build information for its consumers. This can be done in the cpp_info attribute within the package_info() method.

The cpp_info attribute has the following properties you can assign/append to:
self.cpp_info.includedirs = ['include'] # Ordered list of include paths
self.cpp_info.libs = [] # The libs to link against
self.cpp_info.libdirs = ['lib'] # Directories where libraries can be found
self.cpp_info.resdirs = ['res'] # Directories where resources, data, etc can be found
self.cpp_info.bindirs = ['bin'] # Directories where executables and shared libs can be found
self.cpp_info.srcdirs = [] # Directories where sources can be found (debugging, reusing sources)
self.cpp_info.defines = [] # preprocessor definitions
self.cpp_info.cflags = [] # pure C flags
self.cpp_info.cxxflags = [] # C++ compilation flags
self.cpp_info.sharedlinkflags = [] # linker flags
self.cpp_info.exelinkflags = [] # linker flags
- includedirs: List of relative paths (starting from the package root) of directories where headers can be found. By default it is initialized to ['include'], and it is rarely changed.
- libs: Ordered list of libs the client should link against. Empty by default; it is common that different configurations produce different library names. For example:
def package_info(self):
    if not self.settings.os == "Windows":
        self.cpp_info.libs = ["libzmq-static.a"] if self.options.static else ["libzmq.so"]
    else:
        ...
- libdirs: List of relative paths (starting from the package root) of directories in which to find library object binaries (*.lib, *.a, *.so, *.dylib). By default it is initialized to ['lib'], and it is rarely changed.
- resdirs: List of relative paths (starting from the package root) of directories in which to find resource files (images, xml, etc). By default it is initialized to ['res'], and it is rarely changed.
- bindirs: List of relative paths (starting from the package root) of directories in which to find library runtime binaries (like Windows .dlls). By default it is initialized to ['bin'], and it is rarely changed.
- srcdirs: List of relative paths (starting from the package root) of directories in which to find sources (like .c, .cpp). By default it is empty. It might be used to store sources (for later debugging of packages, or to reuse those sources when building other packages).
- defines: Ordered list of preprocessor directives. It is common that consumers have to specify some defines in some cases, so that including the library headers matches the binaries.
- cflags, cxxflags, sharedlinkflags, exelinkflags: Lists of flags that the consumer should activate for proper behavior. Usage of C++11 could be configured here, for example, although the consumer may want to do some flag processing to check if different dependencies are setting incompatible flags (c++11 after c++14).

For example:
if self.options.static:
    if self.settings.compiler == "Visual Studio":
        self.cpp_info.libs.append("ws2_32")
    self.cpp_info.defines = ["ZMQ_STATIC"]

if not self.settings.os == "Windows":
    self.cpp_info.cxxflags = ["-pthread"]
Note that, due to the way some build systems like CMake manage forward and back slashes, it might be more robust to pass flags for the Visual Studio compiler with a dash instead. Using "/NODEFAULTLIB:MSVCRT", for example, might fail when using CMake targets mode, so the following is preferred and works both in the global and targets mode of CMake:
def package_info(self):
    self.cpp_info.exelinkflags = ["-NODEFAULTLIB:MSVCRT",
                                  "-DEFAULTLIB:LIBCMT"]
If your recipe has requirements, you can also access your requirements' cpp_info using the deps_cpp_info object.
class OtherConan(ConanFile):
    name = "OtherLib"
    version = "1.0"
    requires = "MyLib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_cpp_info["MyLib"].libdirs)
Note
Please take into account that defining self.cpp_info.bindirs directories does not have any effect on system paths or the PATH environment variable, nor will they be directly accessible by consumers. The self.cpp_info information is translated to build-system information via generators; for CMake, for example, it will be a variable in conanbuildinfo.cmake. If you want a package to make its executables accessible to its consumers, you have to specify it with self.env_info as described in env_info.
env_info
Each package can also define environment variables that are needed when the package is reused. This is especially useful for installer packages, to set the path with the "bin" folder of the packaged application. This can be done in the env_info attribute within the package_info() method.
self.env_info.path.append("ANOTHER VALUE") # Append "ANOTHER VALUE" to the path variable
self.env_info.othervar = "OTHER VALUE" # Assign "OTHER VALUE" to the othervar variable
self.env_info.thirdvar.append("some value") # Every variable can be set or appended a new value
One of the most typical usages for the PATH environment variable would be to add the current binary package directories to the path, so consumers can use those executables easily:
# assuming the binaries are in the "bin" subfolder
self.env_info.PATH.append(os.path.join(self.package_folder, "bin"))
The virtualenv generator will use the self.env_info variables to prepare a script to activate/deactivate a virtual environment. However, this could be done directly using the virtualrunenv generator.

These variables will be automatically applied before calling the consumer conanfile.py methods source(), build(), package() and imports().
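To illustrate what appending to an env_info variable amounts to for consumers, here is a rough stdlib-only sketch of merging PATH contributions from several dependencies (the dictionary shape and merge rules are invented for illustration; the real work is done by Conan's generators):

```python
import os

# Hypothetical env_info contributions collected from two dependencies
deps_env = {
    "ToolA": {"PATH": ["/cache/toola/1.0/package/abc/bin"]},
    "ToolB": {"PATH": ["/cache/toolb/2.0/package/def/bin"], "OTHERVAR": "OTHER VALUE"},
}

merged = {}
for pkg in deps_env:
    for var, value in deps_env[pkg].items():
        if isinstance(value, list):
            # appended variables accumulate across dependencies
            merged.setdefault(var, []).extend(value)
        else:
            # assigned variables take a single value
            merged[var] = value

# List-valued variables are joined with the platform's path separator and
# prepended to the current value, as an activate script would do
path_line = os.pathsep.join(merged["PATH"] + ["$PATH"])
print(path_line)
print(merged["OTHERVAR"])  # OTHER VALUE
```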
If your recipe has requirements, you can also access your requirements' env_info using the deps_env_info object.
class OtherConan(ConanFile):
    name = "OtherLib"
    version = "1.0"
    requires = "MyLib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_env_info["MyLib"].othervar)
user_info
If you need to declare custom variables that are not related to C/C++ (cpp_info) and are not environment variables (env_info), you can use the self.user_info object.

Currently, only the cmake, cmake_multi and txt generators support user_info variables.
class MyLibConan(ConanFile):
    name = "MyLib"
    version = "1.6.0"
    # ...

    def package_info(self):
        self.user_info.var1 = 2
For the example above, in the cmake and cmake_multi generators, a variable CONAN_USER_MYLIB_var1 will be declared. If your recipe has requirements, you can access your requirements' user_info using the deps_user_info object.
class OtherConan(ConanFile):
    name = "OtherLib"
    version = "1.0"
    requires = "MyLib/1.6.0@conan/stable"

    def build(self):
        self.output.warn(self.deps_user_info["MyLib"].var1)
Important
Both the env_info and user_info objects store information in a "key <-> value" form, and the values are always considered strings. This is done for serialization purposes to conanbuildinfo.txt files and to avoid the deserialization of complex structures. It is up to the consumer to convert the string to the expected type:
# In a dependency
self.user_info.jars = "jar1.jar, jar2.jar, jar3.jar"  # Use a string, not a list
...

# In the dependent conanfile
jars = self.deps_user_info["Pkg"].jars
jar_list = jars.replace(" ", "").split(",")
configure(), config_options()
If the package options and settings are related, and you want to configure either, you can do so in the configure() and config_options() methods.
class MyLibConan(ConanFile):
    name = "MyLib"
    version = "2.5"
    settings = "os", "compiler", "build_type", "arch"
    options = {"static": [True, False],
               "header_only": [True, False]}

    def configure(self):
        # If header only, the compiler, etc, does not affect the package!
        if self.options.header_only:
            self.settings.clear()
            self.options.remove("static")
The package declares two options: one to compile as a static (as opposed to shared) library, and one to not involve any build at all, because the library will be used header-only. In the header-only case, the settings that would affect a normal build, and even the other option (static vs shared), do not make sense, so we just clear them. That means that if someone consumes MyLib with the header_only=True option, the package downloaded and used will be the same, irrespective of the OS, compiler or architecture the consumer is building with.

You can also restrict the settings used by deleting any specific one. For example, it is quite common for C libraries to delete the libcxx subsetting, as such a library does not depend on any C++ standard library:
def configure(self):
    del self.settings.compiler.libcxx
The most typical usage is in configure(), while config_options() should be used more sparingly. config_options() is used to configure or constrain the available options in a package before they are given a value; once an option has been removed, trying to assign a value to it will raise an error. For example, suppose that a certain library cannot be built as a shared library on Windows. That can be expressed as:
def config_options(self):
    if self.settings.os == "Windows":
        del self.options.shared
This will be executed before the actual assignment of options (so such option values cannot be used inside this function). The command conan install -o Pkg:shared=True will then raise an exception on Windows, saying that shared is not an option for this package.
Invalid configuration
Conan allows the recipe creator to declare invalid configurations, those that are known not to work with the library being packaged. There is a special kind of exception that can be raised from the configure() method to state this situation: conans.errors.ConanInvalidConfiguration. Here is an example of a recipe for a library that is only supported on the Windows operating system:
from conans.errors import ConanInvalidConfiguration

def configure(self):
    if self.settings.os != "Windows":
        raise ConanInvalidConfiguration("Library MyLib is only supported for Windows")
This exception will be propagated, and the Conan application will exit with error code 6.
requirements()
Besides the requires field, more advanced requirement logic can be defined in the optional requirements() method, using for example values from the package settings or options:
def requirements(self):
    if self.options.myoption:
        self.requires("zlib/1.2@drl/testing")
    else:
        self.requires("opencv/2.2@drl/stable")
This is a powerful mechanism for handling conditional dependencies.
When you are inside the method, each call to self.requires() will add the corresponding requirement to the current list of requirements. It also has optional parameters that allow defining special cases, as shown below:
def requirements(self):
    self.requires("zlib/1.2@drl/testing", private=True, override=False)
self.requires() parameters:
- override (Optional, Defaulted to False): True means that this is not an actual requirement, but something to be passed upstream to override possible existing values.
- private (Optional, Defaulted to False): True means that this requirement will be somewhat embedded (like a static lib linked into a shared lib), so it is not required to link against it.
build_requirements()
Build requirements are requirements that are only installed and used when the package is built from sources. If there is an existing pre-compiled binary, then the build requirements for this package will not be retrieved.
This method is useful for defining conditional build requirements, for example:
class MyPkg(ConanFile):

    def build_requirements(self):
        if self.settings.os == "Windows":
            self.build_requires("ToolWin/0.1@user/stable")
system_requirements()
It is possible to install system-wide packages from Conan. Just add a system_requirements() method to your conanfile and specify what you need there.

For special use cases, you can also use the conans.tools.os_info object to detect the operating system, version and distribution (Linux):
- os_info.is_linux: True if Linux.
- os_info.is_windows: True if Windows.
- os_info.is_macos: True if macOS.
- os_info.is_freebsd: True if FreeBSD.
- os_info.is_solaris: True if SunOS.
- os_info.os_version: OS version.
- os_info.os_version_name: Common name of the OS (Windows 7, Mountain Lion, Wheezy…).
- os_info.linux_distro: Linux distribution name (None if not Linux).
- os_info.bash_path: Returns the absolute path to a bash in the system.
- os_info.uname(options=None): Runs the "uname" command and returns the output. You can pass arguments with the options parameter.
- os_info.detect_windows_subsystem(): Returns "MSYS", "MSYS2", "CYGWIN" or "WSL" if any of these Windows subsystems are detected.
You can also use the SystemPackageTool class, which will automatically invoke the right system package tool (apt, yum, pkg, pkgutil, brew or pacman) depending on the system it is running on:
from conans.tools import os_info, SystemPackageTool

def system_requirements(self):
    pack_name = None
    if os_info.linux_distro == "ubuntu":
        if os_info.os_version > "12":
            pack_name = "package_name_in_ubuntu_14"
        else:
            pack_name = "package_name_in_ubuntu_12"
    elif os_info.linux_distro == "fedora" or os_info.linux_distro == "centos":
        pack_name = "package_name_in_fedora_and_centos"
    elif os_info.is_macos:
        pack_name = "package_name_in_macos"
    elif os_info.is_freebsd:
        pack_name = "package_name_in_freebsd"
    elif os_info.is_solaris:
        pack_name = "package_name_in_solaris"

    if pack_name:
        installer = SystemPackageTool()
        # Install the package; will update the package database if pack_name isn't already installed
        installer.install(pack_name)
On Windows there is no standard package manager; however, choco can optionally be invoked:
from conans.tools import os_info, SystemPackageTool, ChocolateyTool

def system_requirements(self):
    if os_info.is_windows:
        pack_name = "package_name_in_windows"
        # Invoke the choco package manager to install the package
        installer = SystemPackageTool(tool=ChocolateyTool())
        installer.install(pack_name)
SystemPackageTool
def SystemPackageTool(tool=None)
Available tool classes: AptTool, YumTool, BrewTool, PkgTool, PkgUtilTool, ChocolateyTool, PacManTool.
- Methods:
  - update(): Updates the system package manager database. It's called automatically from the install() method by default.
  - install(packages, update=True, force=False): Installs the packages (could be a list or a string). If update is True, it will execute update() first if needed. The packages won't be installed if they are already present, unless the force parameter is set to True. If packages is a list, the first available package will be picked (short-circuit, like a logical or). Note: this list of packages is intended to provide alternative names for the same package, to account for small variations of the name in different distros. To install different packages, one call to install() per package is necessary.
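The "first available wins" behavior over a list of alternative names can be sketched like this (pick_first_available and the availability set are illustrative stand-ins for the package manager's own check):

```python
def pick_first_available(packages, available_names):
    """Mimic install()'s short-circuit over a list of alternative names:
    return the first candidate the package manager knows about, or None."""
    if isinstance(packages, str):
        packages = [packages]
    for name in packages:
        if name in available_names:
            return name
    return None

# The same library may be called libxml2-devel on Fedora and libxml2-dev on Debian
debian_index = {"libxml2-dev", "zlib1g-dev"}
chosen = pick_first_available(["libxml2-devel", "libxml2-dev"], debian_index)
print(chosen)  # libxml2-dev
```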
The use of sudo in the internals of the install() and update() methods is controlled by the CONAN_SYSREQUIRES_SUDO environment variable, so if the user doesn't need sudo permissions, it is easy to opt in/out.
Conan will keep track of the execution of this method, so that it is not invoked again and again at every Conan command. The execution is done per package, since some packages of the same library might have different system dependencies. If you are sure that all your binary packages have the same system requirements, just add the following line to your method:
def system_requirements(self):
    self.global_system_requirements = True
    if ...
imports()
Importing files copies files from the local store to your project. This feature is handy for copying shared libraries (dylib in Mac, dll in Win) to the directory of your executable, so that you don’t have to mess with your PATH to run them. But there are other use cases:
Copy an executable to your project, so that it can be easily run. A good example is the Google’s protobuf code generator.
Copy package data to your project, like configuration, images, sounds… A good example is the OpenCV demo, in which face detection XML pattern files are required.
Importing files is also very convenient in order to redistribute your application, as many times you will just have to bundle your project’s bin folder.
A typical imports() method for shared libs could be:
def imports(self):
    self.copy("*.dll", "", "bin")
    self.copy("*.dylib", "", "lib")
The self.copy() method inside imports() supports the following arguments:
def copy(pattern, dst="", src="", root_package=None, folder=False, ignore_case=False, excludes=None, keep_path=True)
- Parameters:
  - pattern (Required): An fnmatch file pattern of the files that should be copied.
  - dst (Optional, Defaulted to ""): Destination local folder, relative to the current directory, to which the files will be copied.
  - src (Optional, Defaulted to ""): Source folder in which those files will be searched. This folder will be stripped from the dst parameter. E.g., lib/Debug/x86.
  - root_package (Optional, Defaulted to all packages in deps): An fnmatch pattern of the package name ("OpenCV", "Boost") from which files will be copied.
  - folder (Optional, Defaulted to False): If enabled, it will copy the files from the local cache to a subfolder named after the package containing the files. Useful to avoid conflicting imports of files with the same name (e.g. license files).
  - ignore_case (Optional, Defaulted to False): If enabled, it will do a case-insensitive pattern matching.
  - excludes (Optional, Defaulted to None): Allows defining a list of patterns (even a single pattern) to be excluded from the copy, even if they match the main pattern.
  - keep_path (Optional, Defaulted to True): Whether to keep the relative path when you copy the files from the src folder to the dst one. Useful to ignore (keep_path=False) the path of library.dll files in the package they are imported from.
Example to collect license files from dependencies:
def imports(self):
    self.copy("license*", dst="licenses", folder=True, ignore_case=True)
If you want to be able to customize the output user directory to work with both the cmake and cmake_multi generators, you can do:
def imports(self):
    dest = os.getenv("CONAN_IMPORT_PATH", "bin")
    self.copy("*.dll", dst=dest, src="bin")
    self.copy("*.dylib*", dst=dest, src="lib")
And then use, for example: conan install . -e CONAN_IMPORT_PATH=Release -g cmake_multi
When a conanfile recipe has an imports() method and it builds from sources, it will do the following:
- Before running build(), it will execute imports() in the build folder, copying dependencies' artifacts.
- Run the build() method, which could use such imported binaries.
- Remove the copied (imported) artifacts after build() is finished.
You can use the keep_imports attribute to keep the imported artifacts, and maybe repackage them.
package_id()
Creates a unique ID for the package. The default package ID is calculated from the settings, options and requires properties. When a package creator specifies values for any of those properties, they are declaring that any change of value will require a different binary package.

However, sometimes a package creator needs to alter this default behavior, for example, to have only one binary package for several different compiler versions. In that case, you can manipulate the self.info object in this method, and the package ID will be computed from the resulting information:
def package_id(self):
    v = Version(str(self.settings.compiler.version))
    if self.settings.compiler == "gcc" and (v >= "4.5" and v < "5.0"):
        self.info.settings.compiler.version = "GCC 4 between 4.5 and 5.0"
Please check the section Defining Package ABI Compatibility for more details.
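The version-bucketing idea above can be sketched with plain tuples (parse and bucket_gcc are toy stand-ins; Conan's Version class handles the comparisons in the real recipe):

```python
def parse(version):
    """Toy version parser: '4.8' -> (4, 8)."""
    return tuple(int(p) for p in version.split("."))

def bucket_gcc(version):
    """Collapse every gcc 4.5 <= v < 5.0 into a single label,
    as the package_id() example above does."""
    if (4, 5) <= parse(version) < (5, 0):
        return "GCC 4 between 4.5 and 5.0"
    return version

# gcc 4.8 and 4.9 contribute the same value to the package ID,
# so consumers on either compiler share one binary package
print(bucket_gcc("4.8"))  # GCC 4 between 4.5 and 5.0
print(bucket_gcc("4.9"))  # GCC 4 between 4.5 and 5.0
print(bucket_gcc("5.1"))  # 5.1 (outside the range: keeps its own ID)
```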
self.info
The self.info object stores the information that will be used to compute the package ID. It can be manipulated to reflect the information you want in that computation. For example, you can delete any setting or option:
def package_id(self):
    del self.info.settings.compiler
    del self.info.options.shared
self.info.header_only()
The package will always be the same, irrespective of the OS, compiler or architecture the consumer is building with.
def package_id(self):
    self.info.header_only()
self.info.vs_toolset_compatible() / self.info.vs_toolset_incompatible()
By default (vs_toolset_compatible() mode), Conan will generate the same binary package when the compiler is Visual Studio and compiler.toolset matches the specified compiler.version:

    def package_id(self):
        self.info.vs_toolset_compatible()
        # self.info.vs_toolset_incompatible()

For example, if we install some packages specifying the following settings:

    compiler="Visual Studio"
    compiler.version=14
And then we install again specifying these settings:
compiler="Visual Studio"
compiler.version=15
compiler.toolset=v140
The compiler version is different, but Conan will not install a different package, because the toolchains used in both cases are considered the same. You can deactivate this default behavior by calling self.info.vs_toolset_incompatible().
This is the relation of Visual Studio versions and the compatible toolchain:
Visual Studio Version | Compatible toolset
----------------------|-------------------
15                    | v141
14                    | v140
12                    | v120
11                    | v110
10                    | v100
9                     | v90
8                     | v80
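The table's compatibility rule can be sketched as a lookup: an explicit compiler.toolset wins, otherwise the version's default toolset applies (default values taken from the table above; the helper itself is illustrative):

```python
# Default toolset per Visual Studio version, from the table above
DEFAULT_TOOLSET = {"15": "v141", "14": "v140", "12": "v120",
                   "11": "v110", "10": "v100", "9": "v90", "8": "v80"}

def effective_toolset(vs_version, toolset=None):
    """An explicit toolset wins; otherwise the version's default applies."""
    return toolset or DEFAULT_TOOLSET.get(vs_version)

# VS 15 with toolset=v140 is considered compatible with plain VS 14:
# both resolve to the same effective toolset, hence the same package
print(effective_toolset("15", "v140") == effective_toolset("14"))  # True
```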
self.info.discard_build_settings() / self.info.include_build_settings()
By default (discard_build_settings()), Conan will generate the same binary when you change os_build or arch_build, provided os and arch are declared respectively. This is because os_build represents the machine running Conan, so, for the consumer, the only setting that matters is where the built software will run, not where it was compiled. The same applies to arch_build.
With self.info.include_build_settings(), Conan will generate different packages when you change os_build or arch_build.
def package_id(self):
    self.info.discard_build_settings()
    # self.info.include_build_settings()
self.info.default_std_matching() / self.info.default_std_non_matching()
By default (default_std_matching()), Conan will detect the default C++ standard of your compiler in order not to generate different binary packages.

For example, suppose you already built some gcc > 6.1 packages, where the default std is gnu14. If you introduce the cppstd setting in your recipes and specify the gnu14 value, Conan won't generate new packages, because it was already the default of your compiler.
With self.info.default_std_non_matching(), Conan will generate different packages when you specify the cppstd, even if it matches the default of the compiler being used:
def package_id(self):
    self.info.default_std_non_matching()
    # self.info.default_std_matching()
build_id()
In the general case, there is one build folder for each binary package, with the exact same hash/ID as the package. However, this behavior can be changed; there are a couple of scenarios in which it might be interesting:
You have a build script that generates several different configurations at once, like both debug and release artifacts, but you actually want to package and consume them separately. Same for different architectures or any other setting.
You build just one configuration (like release), but you want to create different binary packages for different consuming cases. For example, if you have created tests for the library in the build step, you might want to create two packages: one just containing the library for general usage, and another one also containing the tests. First package could be used as a reference and the other one as a tool to debug errors.
In both cases, if using different settings, the system would build the same binaries twice (or more times), just to produce different final binary packages. With the build_id() method this logic can be changed: build_id() will create a new package ID/hash for the build folder, and you can define the logic you want in it. For example:
settings = "os", "compiler", "arch", "build_type"

def build_id(self):
    self.info_build.settings.build_type = "Any"
This recipe will still generate a different final package for each debug/release configuration. But as build_id() generates the same ID for any build_type, just one build folder and one build will be used. Such a build should produce both debug and release artifacts, and the package() method should then package them according to the self.settings.build_type value. Different builds will still be executed if using different compilers or architectures. This method is basically a build-time optimization, avoiding multiple re-builds.
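How normalizing build_type collapses two configurations onto one build folder can be sketched with a hash over the settings (the hashing scheme is invented for illustration; Conan computes its IDs differently):

```python
import hashlib

def folder_id(settings):
    """Toy stand-in for the build-folder hash derived from self.info_build."""
    canonical = ",".join("%s=%s" % item for item in sorted(settings.items()))
    return hashlib.sha1(canonical.encode()).hexdigest()[:10]

debug = {"os": "Linux", "compiler": "gcc", "arch": "x86_64", "build_type": "Debug"}
release = dict(debug, build_type="Release")

# Without build_id(): one build folder per configuration
print(folder_id(debug) != folder_id(release))  # True

# With build_id() setting build_type to "Any": both configurations share
# one build folder, while the two final package IDs remain distinct
print(folder_id(dict(debug, build_type="Any")) ==
      folder_id(dict(release, build_type="Any")))  # True
```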
Other information like custom package options can also be changed:
def build_id(self):
    self.info_build.options.myoption = 'MyValue'  # any value possible
    self.info_build.options.fullsource = 'Always'
If the build_id() method does not modify the build_id, producing one different from the package_id, then the standard behavior will be applied. Consider the following:
settings = "os", "compiler", "arch", "build_type"

def build_id(self):
    if self.settings.os == "Windows":
        self.info_build.settings.build_type = "Any"
This will only produce a different build ID if the package is for Windows. The behavior on any other OS will be the standard one, as if the build_id() method were not defined: the build folder will be wiped at each conan create command and a clean build will be done.
deploy()
This method can be used in a conanfile.py to install artifacts from packages into the system or a user folder.
def deploy(self):
    self.copy("*.exe")       # copy from current package
    self.copy_deps("*.dll")  # copy from dependencies
Where:
- self.copy() is the self.copy() method executed inside the package() method.
- self.copy_deps() is the same as the self.copy() method inside the imports() method.
Both methods allow the definition of absolute paths (to install in the system) in the dst argument. By default, the dst destination folder will be the current one.
The deploy() method is designed to work on a package that is installed directly from its reference, as in:
$ conan install Pkg/0.1@user/channel
> ...
> Pkg/0.1@user/testing deploy(): Copied 1 '.dll' files: mylib.dll
> Pkg/0.1@user/testing deploy(): Copied 1 '.exe' files: myexe.exe
All other packages and dependencies, even transitive dependencies of "Pkg/0.1@user/testing", will not be deployed; it is the responsibility of the installed package to deploy what it needs from its dependencies.