
System for managing development buildouts



Buildout is a project designed to solve 2 problems:

  1. Application-centric assembly and deployment

    Assembly runs the gamut from stitching together libraries to create a running program, to production deployment configuration of applications, and associated systems and tools (e.g. run-control scripts, cron jobs, logs, service registration, etc.).

    Buildout might be confused with build tools like make or ant, but it is a little higher level and might invoke systems like make or ant to get its work done.

    Buildout might be confused with systems like puppet or chef, but it is more application focused. Systems like puppet or chef might use buildout to get their work done.

    Buildout is also somewhat Python-centric, even though it can be used to assemble and deploy non-Python applications. It has some special features for assembling Python programs. It’s scripted with Python, unlike, say, puppet or chef, which are scripted with Ruby.

  2. Repeatable assembly of programs from Python software distributions

    Buildout puts great effort toward making program assembly a highly repeatable process, whether in a very open-ended development mode, where dependency versions aren’t locked down, or in a deployment environment where dependency versions are fully specified. You should be able to check buildout into a VCS and later check it out. Two checkouts built at the same time in the same environment should always give the same result, regardless of their history. Among other things, after a buildout, all dependencies should be at the most recent version consistent with any version specifications expressed in the buildout.

    Buildout supports applications consisting of multiple programs, with different programs in an application free to use different versions of Python distributions. This is in contrast with a Python installation (real or virtual), where, for any given distribution, there can only be one installed.

To learn more about buildout, including how to use it, see http://buildout.org/.

Installation

There are a number of ways to install buildout. You can install it as you would any other package, using pip or easy_install. In this case, you’ll get a buildout command that you can use to build projects. To build a project, just use:

buildout

from a project directory.
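
For example, a typical pip-based setup might look like this (the project directory name is just an illustration):

pip install zc.buildout
cd myproject
buildout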

Buildout’s (stubborn) philosophy, however, is that projects should be self-contained, and not require changes to a shared Python installation. To avoid changing a shared Python installation you can download a bootstrap script that, when run, will install buildout locally in your project.

The bootstrap script for buildout version 2 is at:

http://downloads.buildout.org/2/bootstrap.py

So, for example, to install buildout 2 in a project, you might:

wget http://downloads.buildout.org/2/bootstrap.py
python bootstrap.py

Then to build your project, you can just run:

bin/buildout

from the project directory.

The bootstrap script is often checked into version control.

Buildout 2 is somewhat backward-incompatible with version 1. Most projects will probably work fine with either. If you need to keep using version 1, however, specify a version requirement when you use pip or easy_install, or use the version 1 bootstrap script at:

http://downloads.buildout.org/1/bootstrap.py
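
For example, one plausible way to express the version requirement with pip is:

pip install "zc.buildout<2"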

Below, you’ll find doctest-based documentation. It was an experiment in reusing tests as documentation. The experiment didn’t go that well, but there may be details below that aren’t easy to find on buildout.org yet.

doctest-based Documentation

Buildouts

The word “buildout” refers to a description of a set of parts and the software to create and assemble them. It is often used informally to refer to an installed system based on a buildout definition. For example, if we are creating an application named “Foo”, then “the Foo buildout” is the collection of configuration and application-specific software that allows an instance of the application to be created. We may refer to such an instance of the application informally as “a Foo buildout”.

This document describes how to define buildouts using buildout configuration files and recipes. There are three ways to set up the buildout software and create a buildout instance:

  1. Install the zc.buildout egg with easy_install and use the buildout script installed in a Python scripts area.

  2. Use the buildout bootstrap script to create a buildout that includes both the setuptools and zc.buildout eggs. This allows you to use the buildout software without modifying a Python install. The buildout script is installed into your buildout local scripts area.

  3. Use a buildout command from an already installed buildout to bootstrap a new buildout. (See the section on bootstrapping later in this document.)

Often, a software project will be managed in a software repository, such as a subversion repository, that includes some software source directories, buildout configuration files, and a copy of the buildout bootstrap script. To work on the project, one would check out the project from the repository and run the bootstrap script, which installs setuptools and zc.buildout into the checkout; running the resulting buildout script then builds any parts defined.

We have a sample buildout that we created using the bootstrap command of an existing buildout (method 3 above). It has the absolute minimum information. We have bin, develop-eggs, eggs and parts directories, and a configuration file:

>>> ls(sample_buildout)
d  bin
-  buildout.cfg
d  develop-eggs
d  eggs
d  parts

The bin directory contains scripts.

>>> ls(sample_buildout, 'bin')
-  buildout
>>> ls(sample_buildout, 'eggs')
-  setuptools-0.7-py3.3.egg
-  zc.buildout.egg-link

The develop-eggs and parts directories are initially empty:

>>> ls(sample_buildout, 'develop-eggs')
>>> ls(sample_buildout, 'parts')

The develop-eggs directory holds egg links for software being developed in the buildout. We separate develop-eggs and other eggs to allow eggs directories to be shared across multiple buildouts. For example, a common developer technique is to define a common eggs directory in their home directory in which all non-develop eggs are stored. This allows larger buildouts to be set up much more quickly and saves disk space.
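
For example, such a shared eggs directory might be configured in the per-user defaults file described later ($HOME/.buildout/default.cfg); a minimal sketch, with an illustrative path:

[buildout]
eggs-directory = /home/dev/.buildout-eggs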

The parts directory provides an area where recipes can install part data. For example, if we built a custom Python, we would install it in the part directory. Part data is stored in a sub-directory of the parts directory with the same name as the part.

Buildouts are defined using configuration files. These are in the format defined by the Python ConfigParser module, with extensions that we’ll describe later. By default, when a buildout is run, it looks for the file buildout.cfg in the directory where the buildout is run.

The minimal configuration file has a buildout section that defines no parts:

>>> cat(sample_buildout, 'buildout.cfg')
[buildout]
parts =

A part is simply something to be created by a buildout. It can be almost anything, such as a Python package, a program, a directory, or even a configuration file.

Recipes

A part is created by a recipe. Recipes are always installed as Python eggs. They can be downloaded from a package server, such as the Python Package Index, or they can be developed as part of a project using a “develop” egg.

A develop egg is a special kind of egg that gets installed as an “egg link” that contains the name of a source directory. Develop eggs don’t have to be packaged for distribution to be used and can be modified in place, which is especially useful while they are being developed.

Let’s create a recipe as part of the sample project. We’ll create a recipe for creating directories. First, we’ll create a recipes source directory for our local recipes:

>>> mkdir(sample_buildout, 'recipes')

and then we’ll create a source file for our mkdir recipe:

>>> write(sample_buildout, 'recipes', 'mkdir.py',
... """
... import logging, os, zc.buildout
...
... class Mkdir:
...
...     def __init__(self, buildout, name, options):
...         self.name, self.options = name, options
...         options['path'] = os.path.join(
...                               buildout['buildout']['directory'],
...                               options['path'],
...                               )
...         if not os.path.isdir(os.path.dirname(options['path'])):
...             logging.getLogger(self.name).error(
...                 'Cannot create %s. %s is not a directory.',
...                 options['path'], os.path.dirname(options['path']))
...             raise zc.buildout.UserError('Invalid Path')
...
...
...     def install(self):
...         path = self.options['path']
...         logging.getLogger(self.name).info(
...             'Creating directory %s', os.path.basename(path))
...         os.mkdir(path)
...         return path
...
...     def update(self):
...         pass
... """)

Currently, recipes must define 3 methods:

  • a constructor,

  • an install method, and

  • an update method.

The constructor is responsible for updating a part’s options to reflect data read from other sections. The buildout system keeps track of whether a part specification has changed. A part specification has changed if its options, after adjusting for data read from other sections, have changed, or if the recipe has changed. Only the options for the part are considered. If data are read from other sections, then that information has to be reflected in the part’s options. In the Mkdir example, the given path is interpreted relative to the buildout directory, and data from the buildout section is read. The path option is updated to reflect this. If the directory option were changed in the buildout section, we would know to update parts created using the mkdir recipe using relative path names.

When buildout is run, it saves configuration data for installed parts in a file named “.installed.cfg”. In subsequent runs, it compares part-configuration data stored in the .installed.cfg file and the part-configuration data loaded from the configuration files as modified by recipe constructors to decide if the configuration of a part has changed. If the configuration has changed, or if the recipe has changed, then the part is uninstalled and reinstalled. The buildout only looks at the part’s options, so any data used to configure the part needs to be reflected in the part’s options. It is the job of a recipe constructor to make sure that the options include all relevant data.

Of course, parts are also uninstalled if they are no longer used.

The recipe defines a constructor that takes a buildout object, a part name, and an options dictionary. It saves them in instance attributes. If the path is relative, we’ll interpret it as relative to the buildout directory. The buildout object passed in is a mapping from section name to a mapping of options for that section. The buildout directory is available as the directory option of the buildout section. We normalize the path and save it back into the options dictionary.

The install method is responsible for creating the part. In this case, we need the path of the directory to create. We’ll use a path option from our options dictionary. The install method logs what it’s doing using the Python logging call. We return the path that we installed. If the part is uninstalled or reinstalled, then the path returned will be removed by the buildout machinery. A recipe install method is expected to return a string, or an iterable of strings containing paths to be removed if a part is uninstalled. For most recipes, this is all of the uninstall support needed. For more complex uninstallation scenarios use Uninstall recipes.

The update method is responsible for updating an already installed part. An empty method is often provided, as in this example, if parts can’t be updated. An update method can return None, a string, or an iterable of strings. If a string or iterable of strings is returned, then the saved list of paths to be uninstalled is updated with the new information by adding any new files returned by the update method.
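
As an illustrative sketch (not part of the sample project), here is a hypothetical recipe whose update method re-creates a missing file and returns its path, so that the saved list of paths to remove stays up to date:

import logging, os

class Touch:

    def __init__(self, buildout, name, options):
        self.name, self.options = name, options
        # Interpret the path relative to the buildout directory:
        options['path'] = os.path.join(
            buildout['buildout']['directory'], options['path'])

    def install(self):
        open(self.options['path'], 'w').close()
        return [self.options['path']]

    def update(self):
        path = self.options['path']
        if not os.path.exists(path):
            logging.getLogger(self.name).info('Re-creating %s', path)
            open(path, 'w').close()
            # Returning the path adds it to the saved list of files to
            # be removed if the part is uninstalled.
            return [path]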

We need to provide packaging information so that our recipe can be installed as a develop egg. The minimum information we need to specify is a name. For recipes, we also need to define the names of the recipe classes as entry points. Packaging information is provided via a setup.py script:

>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
...
... setup(
...     name = "recipes",
...     entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']},
...     )
... """)

Our setup script defines an entry point. Entry points provide a way for an egg to define the services it provides. Here we’ve said that we define a zc.buildout entry point named mkdir. Recipe classes must be exposed as entry points in the zc.buildout group. We give entry points names within the group.

We also need a README.txt for our recipes to avoid an annoying warning from distutils, on which setuptools and zc.buildout are based:

>>> write(sample_buildout, 'recipes', 'README.txt', " ")

Now let’s update our buildout.cfg:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = mystuff
... """)

Let’s go through the changes one by one:

develop = recipes

This tells the buildout to install a development egg for our recipes. Any number of paths can be listed. The paths can be relative or absolute. If relative, they are treated as relative to the buildout directory. They can be directory or file paths. If a file path is given, it should point to a Python setup script. If a directory path is given, it should point to a directory containing a setup.py file. Development eggs are installed before building any parts, as they may provide locally-defined recipes needed by the parts.

parts = data-dir

Here we’ve named a part to be “built”. We can use any name we want, except that different part names must be unique, and recipes will often use the part name to decide what to do.

[data-dir]
recipe = recipes:mkdir
path = mystuff

When we name a part, we also create a section of the same name that contains part data. In this section, we’ll define the recipe to be used to install the part. In this case, we also specify the path to be created.

Let’s run the buildout. We do so by running the build script in the buildout:

>>> import os
>>> os.chdir(sample_buildout)
>>> buildout = os.path.join(sample_buildout, 'bin', 'buildout')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing data-dir.
data-dir: Creating directory mystuff

We see that the recipe created the directory, as expected:

>>> ls(sample_buildout)
-  .installed.cfg
d  bin
-  buildout.cfg
d  develop-eggs
d  eggs
d  mystuff
d  parts
d  recipes

In addition, .installed.cfg has been created containing information about the part we installed:

>>> cat(sample_buildout, '.installed.cfg')
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = data-dir
<BLANKLINE>
[data-dir]
__buildout_installed__ = /sample-buildout/mystuff
__buildout_signature__ = recipes-c7vHV6ekIDUPy/7fjAaYjg==
path = /sample-buildout/mystuff
recipe = recipes:mkdir

Note that the directory we installed is included in .installed.cfg. In addition, the path option includes the actual destination directory.

If we change the name of the directory in the configuration file, we’ll see that the directory gets removed and recreated:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
data-dir: Creating directory mydata
>>> ls(sample_buildout)
-  .installed.cfg
d  bin
-  buildout.cfg
d  develop-eggs
d  eggs
d  mydata
d  parts
d  recipes

If any of the files or directories created by a recipe are removed, the part will be reinstalled:

>>> rmdir(sample_buildout, 'mydata')
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
data-dir: Creating directory mydata

Error reporting

If a user makes an error, an error needs to be printed and work needs to stop. This is accomplished by logging a detailed error message and then raising a zc.buildout.UserError exception (or an instance of a subclass of it). Raising an error other than a UserError still displays the error, but labels it as a bug in the buildout software or recipe. In the sample above, if someone gives a non-existent directory to create the directory in:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = /xxx/mydata
... """)

We’ll get a user error, not a traceback.

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
data-dir: Cannot create /xxx/mydata. /xxx is not a directory.
While:
  Installing.
  Getting section data-dir.
  Initializing section data-dir.
Error: Invalid Path

Recipe Error Handling

If an error occurs during installation, it is up to the recipe to clean up any system side effects, such as files created. Let’s update the mkdir recipe to support multiple paths:

>>> write(sample_buildout, 'recipes', 'mkdir.py',
... """
... import logging, os, zc.buildout
...
... class Mkdir:
...
...     def __init__(self, buildout, name, options):
...         self.name, self.options = name, options
...
...         # Normalize paths and check that their parent
...         # directories exist:
...         paths = []
...         for path in options['path'].split():
...             path = os.path.join(buildout['buildout']['directory'], path)
...             if not os.path.isdir(os.path.dirname(path)):
...                 logging.getLogger(self.name).error(
...                     'Cannot create %s. %s is not a directory.',
...                     options['path'], os.path.dirname(options['path']))
...                 raise zc.buildout.UserError('Invalid Path')
...             paths.append(path)
...         options['path'] = ' '.join(paths)
...
...     def install(self):
...         paths = self.options['path'].split()
...         for path in paths:
...             logging.getLogger(self.name).info(
...                 'Creating directory %s', os.path.basename(path))
...             os.mkdir(path)
...         return paths
...
...     def update(self):
...         pass
... """)
>>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py')

If there is an error creating a path, the install method will exit and leave previously created paths in place:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = foo bin
... """)
>>> print_(system(buildout)) # doctest: +ELLIPSIS
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
data-dir: Creating directory foo
data-dir: Creating directory bin
While:
  Installing data-dir.
<BLANKLINE>
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
... exists...

We meant to create a directory bins, but typed bin. Now foo has been left behind.

>>> os.path.exists('foo')
True

If we fix the typo:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = foo bins
... """)
>>> print_(system(buildout)) # doctest: +ELLIPSIS
Develop: '/sample-buildout/recipes'
Installing data-dir.
data-dir: Creating directory foo
While:
  Installing data-dir.
<BLANKLINE>
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
... exists...

Now the buildout fails because foo already exists; it was left behind by the previous run.

>>> remove('foo')

Let’s fix the recipe:

>>> write(sample_buildout, 'recipes', 'mkdir.py',
... """
... import logging, os, zc.buildout, sys
...
... class Mkdir:
...
...     def __init__(self, buildout, name, options):
...         self.name, self.options = name, options
...
...         # Normalize paths and check that their parent
...         # directories exist:
...         paths = []
...         for path in options['path'].split():
...             path = os.path.join(buildout['buildout']['directory'], path)
...             if not os.path.isdir(os.path.dirname(path)):
...                 logging.getLogger(self.name).error(
...                     'Cannot create %s. %s is not a directory.',
...                     options['path'], os.path.dirname(options['path']))
...                 raise zc.buildout.UserError('Invalid Path')
...             paths.append(path)
...         options['path'] = ' '.join(paths)
...
...     def install(self):
...         paths = self.options['path'].split()
...         created = []
...         try:
...             for path in paths:
...                 logging.getLogger(self.name).info(
...                     'Creating directory %s', os.path.basename(path))
...                 os.mkdir(path)
...                 created.append(path)
...         except Exception:
...             for d in created:
...                 os.rmdir(d)
...                 assert not os.path.exists(d)
...                 logging.getLogger(self.name).info(
...                     'Removed %s due to error',
...                      os.path.basename(d))
...             sys.stderr.flush()
...             sys.stdout.flush()
...             raise
...
...         return paths
...
...     def update(self):
...         pass
... """)
>>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py')

And put back the typo:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = foo bin
... """)

When we rerun the buildout:

>>> print_(system(buildout)) # doctest: +ELLIPSIS
Develop: '/sample-buildout/recipes'
Installing data-dir.
data-dir: Creating directory foo
data-dir: Creating directory bin
data-dir: Removed foo due to error
While:
  Installing data-dir.
<BLANKLINE>
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
... exists...

we get the same error, but we don’t get the directory left behind:

>>> os.path.exists('foo')
False

It’s critical that recipes clean up partial effects when errors occur. Because recipes most commonly create files and directories, buildout provides a helper API for removing created files when an error occurs. Option objects have a created method that can be called to record files as they are created. If the install or update method returns with an error, then any registered paths are removed automatically. When called without arguments, the created method returns the paths registered so far, so it can also be used to return the files created. Let’s use this API to simplify the recipe:

>>> write(sample_buildout, 'recipes', 'mkdir.py',
... """
... import logging, os, zc.buildout
...
... class Mkdir:
...
...     def __init__(self, buildout, name, options):
...         self.name, self.options = name, options
...
...         # Normalize paths and check that their parent
...         # directories exist:
...         paths = []
...         for path in options['path'].split():
...             path = os.path.join(buildout['buildout']['directory'], path)
...             if not os.path.isdir(os.path.dirname(path)):
...                 logging.getLogger(self.name).error(
...                     'Cannot create %s. %s is not a directory.',
...                     options['path'], os.path.dirname(options['path']))
...                 raise zc.buildout.UserError('Invalid Path')
...             paths.append(path)
...         options['path'] = ' '.join(paths)
...
...     def install(self):
...         paths = self.options['path'].split()
...         for path in paths:
...             logging.getLogger(self.name).info(
...                 'Creating directory %s', os.path.basename(path))
...             os.mkdir(path)
...             self.options.created(path)
...
...         return self.options.created()
...
...     def update(self):
...         pass
... """)
>>> clean_up_pyc(sample_buildout, 'recipes', 'mkdir.py')

We returned by calling created, taking advantage of the fact that it returns the registered paths. We did this for illustrative purposes. It would be simpler just to return the paths as before.

If we rerun the buildout, again, we’ll get the error and no directories will be created:

>>> print_(system(buildout)) # doctest: +ELLIPSIS
Develop: '/sample-buildout/recipes'
Installing data-dir.
data-dir: Creating directory foo
data-dir: Creating directory bin
While:
  Installing data-dir.
<BLANKLINE>
An internal error occurred due to a bug in either zc.buildout or in a
recipe being used:
Traceback (most recent call last):
... exists...
>>> os.path.exists('foo')
False

Now, we’ll fix the typo again and we’ll get the directories we expect:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = foo bins
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing data-dir.
data-dir: Creating directory foo
data-dir: Creating directory bins
>>> os.path.exists('foo')
True
>>> os.path.exists('bins')
True

Configuration file syntax

A buildout configuration file consists of a sequence of sections. A section has a section header followed by 0 or more section options. (Buildout configuration files may be viewed as a variation on INI files.)

A section header consists of a section name enclosed in square brackets. A section name consists of one or more non-whitespace characters other than square brackets (‘[’, ‘]’), curly braces (‘{’, ‘}’), colons (‘:’) or equal signs (‘=’). Whitespace surrounding section names is ignored.

A section header can optionally have a condition expression separated by a colon. See Conditional sections.

Options consist of an option name, followed by optional space or tab characters, an optional plus or minus sign, an equal sign, and a value. An option value may be spread over multiple lines as long as the lines after the first start with a whitespace character. An option name consists of one or more non-whitespace characters other than equal signs, square brackets (“[”, “]”), curly braces (“{”, “}”), plus signs or colons (“:”). The option name ‘<’ is reserved. An option’s data consists of the characters following the equal sign on the start line, plus the continuation lines.

Option values have extra whitespace stripped. How this is done depends on whether the value has non-whitespace characters on the first line. If an option value has non-whitespace characters on the first line, then each line is stripped and blank lines are removed. For example, in:

[foo]
bar = 1
baz = a
      b

      c

The value of bar is '1' and the value of baz is 'a\nb\nc'.

If an option value has no non-whitespace characters on the first line, then the value is dedented (with textwrap.dedent), trailing spaces in lines are removed, and leading and trailing blank lines are removed. For example, in:

[foo]
bar =
baz =

  a
    b

  c

The value of bar is '', and the value of baz is 'a\n  b\n\nc'.

Lines starting with ‘#’ or ‘;’ characters are comments. Comments can also be placed after the closing square bracket (‘]’) in a section header.

Buildout configuration data are Python strings, which are bytes in Python 2 and unicode in Python 3.

Sections and options within sections may be repeated. Multiple occurrences of a section are treated as if they were concatenated. The last option value for a given name in a section overrides previous values.

In addition to the syntactic details above:

  • option names are case sensitive

  • option values can use a substitution syntax, described below, to refer to option values in specific sections.

  • option values can be appended or removed using the - and + operators.

Annotated sections

When used with the annotate command, buildout displays annotated sections. All sections are displayed, sorted alphabetically. For each section, all key-value pairs are displayed, sorted alphabetically, along with the origin of the value (file name or COMPUTED_VALUE, DEFAULT_VALUE, COMMAND_LINE_VALUE).

>>> print_(system(buildout+ ' annotate'), end='')
... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE
<BLANKLINE>
Annotated sections
==================
<BLANKLINE>
[buildout]
allow-hosts= *
    DEFAULT_VALUE
allow-picked-versions= true
    DEFAULT_VALUE
bin-directory= bin
    DEFAULT_VALUE
develop= recipes
    /sample-buildout/buildout.cfg
develop-eggs-directory= develop-eggs
    DEFAULT_VALUE
directory= /sample-buildout
    COMPUTED_VALUE
eggs-directory= eggs
    DEFAULT_VALUE
executable= ...
    DEFAULT_VALUE
find-links=
    DEFAULT_VALUE
install-from-cache= false
    DEFAULT_VALUE
installed= .installed.cfg
    DEFAULT_VALUE
log-format=
    DEFAULT_VALUE
log-level= INFO
    DEFAULT_VALUE
newest= true
    DEFAULT_VALUE
offline= false
    DEFAULT_VALUE
parts= data-dir
    /sample-buildout/buildout.cfg
parts-directory= parts
    DEFAULT_VALUE
prefer-final= true
    DEFAULT_VALUE
python= buildout
    DEFAULT_VALUE
show-picked-versions= false
    DEFAULT_VALUE
socket-timeout=
    DEFAULT_VALUE
update-versions-file=
    DEFAULT_VALUE
use-dependency-links= true
    DEFAULT_VALUE
versions= versions
    DEFAULT_VALUE
<BLANKLINE>
[data-dir]
path= foo bins
    /sample-buildout/buildout.cfg
recipe= recipes:mkdir
    /sample-buildout/buildout.cfg
<BLANKLINE>
[versions]
zc.buildout= >=1.99
    DEFAULT_VALUE
zc.recipe.egg= >=1.99
    DEFAULT_VALUE
<BLANKLINE>

Variable substitutions

Buildout configuration files support variable substitution. To illustrate this, we’ll create a debug recipe to allow us to see interactions with the buildout:

>>> write(sample_buildout, 'recipes', 'debug.py',
... """
... import sys
... class Debug:
...
...     def __init__(self, buildout, name, options):
...         self.buildout = buildout
...         self.name = name
...         self.options = options
...
...     def install(self):
...         for option, value in sorted(self.options.items()):
...             sys.stdout.write('%s %s\\n' % (option, value))
...         return ()
...
...     update = install
... """)

This recipe doesn’t actually create anything. The install method returns an empty tuple because it didn’t create any files or directories.

We also have to update our setup script:

>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
... entry_points = (
... '''
... [zc.buildout]
... mkdir = mkdir:Mkdir
... debug = debug:Debug
... ''')
... setup(name="recipes", entry_points=entry_points)
... """)

We’ve rearranged the script a bit to make the entry points easier to edit. In particular, entry points are now defined as a configuration string, rather than a dictionary.

Let’s update our configuration to provide variable substitution examples:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... File-1 = ${data-dir:path}/file
... File-2 = ${debug:File-1}/log
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)

We used a string-template substitution for File-1 and File-2. This type of substitution uses the string.Template syntax. Names substituted are qualified option names, consisting of a section name and option name joined by a colon.

Now, if we run the buildout, we’ll see the options with the values substituted.

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling data-dir.
Installing data-dir.
data-dir: Creating directory mydata
Installing debug.
File-1 /sample-buildout/mydata/file
File-2 /sample-buildout/mydata/file/log
recipe recipes:debug

Note that the substitution of the data-dir path option reflects the update to the option performed by the mkdir recipe.

It might seem surprising that mydata was created again. This is because we changed our recipes package by adding the debug module. The buildout system didn’t know if this module could affect the mkdir recipe, so it assumed it could and reinstalled mydata. If we rerun the buildout:

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating data-dir.
Updating debug.
File-1 /sample-buildout/mydata/file
File-2 /sample-buildout/mydata/file/log
recipe recipes:debug

We can see that mydata was not recreated.

Note that, in this case, we didn’t specify a log level, so we didn’t get output about what the buildout was doing.

Section and option names in variable substitutions are only allowed to contain alphanumeric characters, hyphens, periods and spaces. This restriction might be relaxed in future releases.

We can omit the section name in a variable substitution to refer to the current section. We can also use the special option, _buildout_section_name_ to get the current section name.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... File-1 = ${data-dir:path}/file
... File-2 = ${:File-1}/log
... my_name = ${:_buildout_section_name_}
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Updating data-dir.
Installing debug.
File-1 /sample-buildout/mydata/file
File-2 /sample-buildout/mydata/file/log
my_name debug
recipe recipes:debug

Automatic part selection and ordering

When a section with a recipe is referred to, either through variable substitution or by an initializing recipe, the section is treated as a part and added to the part list before the referencing part. For example, we can leave data-dir out of the parts list:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... File-1 = ${data-dir:path}/file
... File-2 = ${debug:File-1}/log
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)

It will still be treated as a part:

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Updating data-dir.
Installing debug.
File-1 /sample-buildout/mydata/file
File-2 /sample-buildout/mydata/file/log
recipe recipes:debug
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = data-dir debug
...

Note that the data-dir part is included before the debug part, because the debug part refers to the data-dir part. Even if we list the data-dir part after the debug part, it will be included before:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug data-dir
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
... File-1 = ${data-dir:path}/file
... File-2 = ${debug:File-1}/log
...
... [data-dir]
... recipe = recipes:mkdir
... path = mydata
... """)

It will still be treated as a part:

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating data-dir.
Updating debug.
File-1 /sample-buildout/mydata/file
File-2 /sample-buildout/mydata/file/log
recipe recipes:debug
>>> cat('.installed.cfg') # doctest: +ELLIPSIS
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = data-dir debug
...

Extending sections (macros)

A section (other than the buildout section) can extend one or more other sections using the < option. Options from the referenced sections are copied to the referring section before variable substitution. This, together with the ability to refer to variables of the current section, allows sections to be used as macros.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = myfiles
... log-level = INFO
...
... [debug]
... recipe = recipes:debug
...
... [with_file1]
... <= debug
... file1 = ${:path}/file1
... color = red
...
... [with_file2]
... <= debug
... file2 = ${:path}/file2
... color = blue
...
... [myfiles]
... <= with_file1
...    with_file2
... path = mydata
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Uninstalling data-dir.
Installing myfiles.
color blue
file1 mydata/file1
file2 mydata/file2
path mydata
recipe recipes:debug

In this example, the debug, with_file1 and with_file2 sections act as macros. In particular, the variable substitutions are performed relative to the myfiles section.

Conditional sections

Sometimes, you need different configuration in different environments (different operating systems, or different versions of Python). To make this easier, you can define environment-specific options by providing conditional sections:

[ctl]
suffix =

[ctl:windows]
suffix = .bat

In this tiny example, we’ve defined a ctl:suffix option that’s .bat on Windows and an empty string elsewhere.

A conditional section has a colon and then a Python expression after the name. If the result of the Python expression is true, the options from the section are included. If the value is false, the section is ignored.

Some things to note:

  • If there is no expression, then options from the section are always included.

  • Sections and options can be repeated. If an option is repeated, the last value is used. In the example above, on Windows, the second suffix option overrides the first. If the order of the sections was reversed, the conditional section would have no effect.

In addition to the normal built-ins, the expression has access to global variables that make common cases short and descriptive, as shown above:

sys

the sys module

os

the os module

platform

the platform module

re

The re module

python2

We’re running Python 2

python3

We’re running Python 3

python26

We’re running Python 2.6

python27

We’re running Python 2.7

python32

We’re running Python 3.2

python33

We’re running Python 3.3

sys_version

sys.version.lower()

pypy

We’re running PyPy

jython

We’re running Jython

iron

We’re running Iron Python

cpython

We’re not running PyPy, Jython, or Iron Python

sys_platform

str(sys.platform).lower()

linux

We’re running on linux

windows

We’re running on Windows

cygwin

We’re running on cygwin

solaris

We’re running on solaris

macosx

We’re running on Mac OS X

posix

We’re running on a POSIX-compatible system

bits32

We’re running on a 32-bit system.

bits64

We’re running on a 64-bit system.

little_endian

We’re running on a little-endian system

big_endian

We’re running on a big-endian system

Expressions must not contain either the # or the ; character.
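
For instance, here is a small sketch that combines a few of these globals; the section and option values are hypothetical:

[tools]
compiler = gcc

[tools:macosx]
compiler = clang

[tools:windows or cygwin]
compiler = msvc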

Adding and removing options

We can append values to and remove values from an option by using the + and - operators.

This is illustrated below; first we define a base configuration.

>>> write(sample_buildout, 'base.cfg',
... """
... [buildout]
... parts = part1 part2 part3
...
... [part1]
... recipe =
... option = a1 a2
...
... [part2]
... recipe =
... option = b1 b2 b3 b4
...
... [part3]
... recipe =
... option = c1 c2
...
... [part4]
... recipe =
... option = d2
...     d3
...     d5
...
... """)

Extending this configuration, we can “adjust” the values set in the base configuration file.

>>> write(sample_buildout, 'extension1.cfg',
... """
... [buildout]
... extends = base.cfg
...
... # appending values
... [part1]
... option += a3 a4
...
... # removing values
... [part2]
... option -= b1 b2
...
... # alt. spelling
... [part3]
... option+=c3 c4 c5
...
... # combining both adding and removing
... [part4]
... option += d1
...      d4
... option -= d5
...
... # normal assignment
... [part5]
... option = h1 h2
...
... """)

An additional extension.

>>> write(sample_buildout, 'extension2.cfg',
... """
... [buildout]
... extends = extension1.cfg
...
... # appending values
... [part1]
... option += a5
...
... # removing values
... [part2]
... option -= b1 b2 b3
...
... """)

To verify that the options are adjusted correctly, we’ll set up an extension that prints out the options.

>>> mkdir(sample_buildout, 'demo')
>>> write(sample_buildout, 'demo', 'demo.py',
... """
... import sys
... def ext(buildout):
...     sys.stdout.write(str(
...         [part['option'] for name, part in sorted(buildout.items())
...          if name.startswith('part')])+'\\n')
... """)
>>> write(sample_buildout, 'demo', 'setup.py',
... """
... from setuptools import setup
...
... setup(
...     name="demo",
...     entry_points={'zc.buildout.extension': ['ext = demo:ext']},
...     )
... """)

Set up a buildout configuration for this extension.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = demo
... parts =
... """)
>>> os.chdir(sample_buildout)
>>> print_(system(os.path.join(sample_buildout, 'bin', 'buildout')), end='')
... # doctest: +ELLIPSIS
Develop: '/sample-buildout/demo'...

Verify option values.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = demo
... extensions = demo
... extends = extension2.cfg
... """)
>>> print_(system(os.path.join('bin', 'buildout')), end='')
['a1 a2\na3 a4\na5', 'b1 b2 b3 b4', 'c1 c2\nc3 c4 c5', 'd2\nd3\nd1\nd4', 'h1 h2']
Develop: '/sample-buildout/demo'

Annotated sections output shows which files are responsible for which operations.

>>> print_(system(os.path.join('bin', 'buildout') + ' annotate'), end='')
... # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE
<BLANKLINE>
Annotated sections
==================
...
<BLANKLINE>
[part1]
option= a1 a2
a3 a4
a5
    /sample-buildout/base.cfg
+=  /sample-buildout/extension1.cfg
+=  /sample-buildout/extension2.cfg
recipe=
    /sample-buildout/base.cfg
<BLANKLINE>
[part2]
option= b1 b2 b3 b4
    /sample-buildout/base.cfg
-=  /sample-buildout/extension1.cfg
-=  /sample-buildout/extension2.cfg
recipe=
    /sample-buildout/base.cfg
<BLANKLINE>
[part3]
option= c1 c2
c3 c4 c5
    /sample-buildout/base.cfg
+=  /sample-buildout/extension1.cfg
recipe=
    /sample-buildout/base.cfg
<BLANKLINE>
[part4]
option= d2
d3
d1
d4
    /sample-buildout/base.cfg
+=  /sample-buildout/extension1.cfg
-=  /sample-buildout/extension1.cfg
recipe=
    /sample-buildout/base.cfg
<BLANKLINE>
[part5]
option= h1 h2
    /sample-buildout/extension1.cfg
<BLANKLINE>
[versions]
zc.buildout= >=1.99
    DEFAULT_VALUE
zc.recipe.egg= >=1.99
    DEFAULT_VALUE
<BLANKLINE>

Cleanup.

>>> os.remove(os.path.join(sample_buildout, 'base.cfg'))
>>> os.remove(os.path.join(sample_buildout, 'extension1.cfg'))
>>> os.remove(os.path.join(sample_buildout, 'extension2.cfg'))

Multiple configuration files

A configuration file can “extend” another configuration file. Options are read from the other configuration file if they aren’t already defined by your configuration file.

The configuration files your file extends can extend other configuration files. The same file may be used more than once although, of course, cycles aren’t allowed.

To see how this works, we use an example:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... extends = base.cfg
...
... [debug]
... op = buildout
... """)
>>> write(sample_buildout, 'base.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... op = base
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing debug.
op buildout
recipe recipes:debug

The example is pretty trivial, but the pattern it illustrates is pretty common. In a more practical example, the base buildout might represent a product and the extending buildout might be a customization.

Here is a more elaborate example.

>>> other = tmpdir('other')
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... extends = b1.cfg b2.cfg %(b3)s
...
... [debug]
... op = buildout
... """ % dict(b3=os.path.join(other, 'b3.cfg')))
>>> write(sample_buildout, 'b1.cfg',
... """
... [buildout]
... extends = base.cfg
...
... [debug]
... op1 = b1 1
... op2 = b1 2
... """)
>>> write(sample_buildout, 'b2.cfg',
... """
... [buildout]
... extends = base.cfg
...
... [debug]
... op2 = b2 2
... op3 = b2 3
... """)
>>> write(other, 'b3.cfg',
... """
... [buildout]
... extends = b3base.cfg
...
... [debug]
... op4 = b3 4
... """)
>>> write(other, 'b3base.cfg',
... """
... [debug]
... op5 = b3base 5
... """)
>>> write(sample_buildout, 'base.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... name = base
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op buildout
op1 b1 1
op2 b2 2
op3 b2 3
op4 b3 4
op5 b3base 5
recipe recipes:debug

There are several things to note about this example:

  • We can name multiple files in an extends option.

  • We can reference files recursively.

  • Relative file names in extended options are interpreted relative to the directory containing the referencing configuration file.

Loading Configuration from URLs

Configuration files can be loaded from URLs. To see how this works, we’ll set up a web server with some configuration files.

>>> server_data = tmpdir('server_data')
>>> write(server_data, "r1.cfg",
... """
... [debug]
... op1 = r1 1
... op2 = r1 2
... """)
>>> write(server_data, "r2.cfg",
... """
... [buildout]
... extends = r1.cfg
...
... [debug]
... op2 = r2 2
... op3 = r2 3
... """)
>>> server_url = start_server(server_data)
>>> write('client.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
... extends = %(url)s/r2.cfg
...
... [debug]
... recipe = recipes:debug
... name = base
... """ % dict(url=server_url))
>>> print_(system(buildout+ ' -c client.cfg'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op1 r1 1
op2 r2 2
op3 r2 3
recipe recipes:debug

Here we specified a URL for the file we extended. The file we downloaded itself referred to a file on the server using a relative URL reference. Relative references are interpreted relative to the base URL when they appear in configuration files loaded via URL.

We can also specify a URL as the configuration file to be used by a buildout.

>>> os.remove('client.cfg')
>>> write(server_data, 'remote.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
... extends = r2.cfg
...
... [debug]
... recipe = recipes:debug
... name = remote
... """)
>>> print_(system(buildout + ' -c ' + server_url + '/remote.cfg'), end='')
While:
  Initializing.
Error: Missing option: buildout:directory

Normally, the buildout directory defaults to the directory containing the configuration file. This won’t work for configuration files loaded from URLs. In this case, the buildout directory would normally be defined on the command line:

>>> print_(system(buildout
...              + ' -c ' + server_url + '/remote.cfg'
...              + ' buildout:directory=' + sample_buildout
...              ), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name remote
op1 r1 1
op2 r2 2
op3 r2 3
recipe recipes:debug

User defaults

If the file $HOME/.buildout/default.cfg exists, it is read before reading the configuration file. ($HOME is the value of the HOME environment variable. The ‘/’ is replaced by the operating system file delimiter.)

>>> old_home = os.environ['HOME']
>>> home = tmpdir('home')
>>> mkdir(home, '.buildout')
>>> write(home, '.buildout', 'default.cfg',
... """
... [debug]
... op1 = 1
... op7 = 7
... """)
>>> os.environ['HOME'] = home
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op buildout
op1 b1 1
op2 b2 2
op3 b2 3
op4 b3 4
op5 b3base 5
op7 7
recipe recipes:debug

A buildout command-line argument, -U, can be used to suppress reading user defaults:

>>> print_(system(buildout + ' -U'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op buildout
op1 b1 1
op2 b2 2
op3 b2 3
op4 b3 4
op5 b3base 5
recipe recipes:debug

If the environment variable BUILDOUT_HOME is non-empty, that is used to locate default.cfg instead of looking in ~/.buildout/. Let’s set up a configuration file in an alternate directory and verify that we get the appropriate set of defaults:

>>> alterhome = tmpdir('alterhome')
>>> write(alterhome, 'default.cfg',
... """
... [debug]
... op1 = 1'
... op7 = 7'
... op8 = eight!
... """)
>>> os.environ['BUILDOUT_HOME'] = alterhome
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op buildout
op1 b1 1
op2 b2 2
op3 b2 3
op4 b3 4
op5 b3base 5
op7 7'
op8 eight!
recipe recipes:debug

The -U argument still suppresses reading of the default.cfg file from BUILDOUT_HOME:

>>> print_(system(buildout + ' -U'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
name base
op buildout
op1 b1 1
op2 b2 2
op3 b2 3
op4 b3 4
op5 b3base 5
recipe recipes:debug
>>> os.environ['HOME'] = old_home
>>> del os.environ['BUILDOUT_HOME']

Log level

We can control the level of logging by specifying a log level in our configuration file. For example, to suppress info messages, we can set the logging level to WARNING:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... log-level = WARNING
... extends = b1.cfg b2.cfg
... """)
>>> print_(system(buildout), end='')
name base
op1 b1 1
op2 b2 2
op3 b2 3
recipe recipes:debug

Socket timeout

The timeout of connections to egg and configuration servers can be configured in the buildout section. The value is given in seconds.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... socket-timeout = 5
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... op = timeout
... """)
>>> print_(system(buildout), end='')
Setting socket time out to 5 seconds.
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
op timeout
recipe recipes:debug

If the socket-timeout is not numeric, a warning is issued and the default timeout of the Python socket module is used.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... socket-timeout = 5s
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... op = timeout
... """)
>>> print_(system(buildout), end='')
Default socket timeout is used !
Value in configuration is not numeric: [5s].
<BLANKLINE>
Develop: '/sample-buildout/recipes'
Updating debug.
op timeout
recipe recipes:debug

Uninstall recipes

As we’ve seen, when parts are installed, buildout keeps track of files and directories that they create. When the parts are uninstalled these files and directories are deleted.

Sometimes more clean up is needed. For example, a recipe might add a system service by calling chkconfig --add during installation. Later, during uninstallation, chkconfig --del will need to be called to remove the system service.

In order to deal with these uninstallation issues, you can register uninstall recipes. Uninstall recipes are registered using the ‘zc.buildout.uninstall’ entry point. Parts specify uninstall recipes using the ‘uninstall’ option.

In comparison to regular recipes, uninstall recipes are much simpler. They are simply callable objects that accept the name of the part to be uninstalled and the part’s options dictionary. Uninstall recipes don’t have access to the part itself, since it may not be possible to instantiate it at uninstallation time.

Here’s a recipe that simulates installation of a system service, along with an uninstall recipe that simulates removing the service.

>>> write(sample_buildout, 'recipes', 'service.py',
... """
... import sys
... class Service:
...
...     def __init__(self, buildout, name, options):
...         self.buildout = buildout
...         self.name = name
...         self.options = options
...
...     def install(self):
...         sys.stdout.write("chkconfig --add %s\\n"
...                          % self.options['script'])
...         return ()
...
...     def update(self):
...         pass
...
...
... def uninstall_service(name, options):
...     sys.stdout.write("chkconfig --del %s\\n" % options['script'])
... """)

To use these recipes we must register them using entry points. Make sure to use the same name for the recipe and uninstall recipe. This is required to let buildout know which uninstall recipe goes with which recipe.

>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
... entry_points = (
... '''
... [zc.buildout]
... mkdir = mkdir:Mkdir
... debug = debug:Debug
... service = service:Service
...
... [zc.buildout.uninstall]
... service = service:uninstall_service
... ''')
... setup(name="recipes", entry_points=entry_points)
... """)

Here’s how these recipes could be used in a buildout:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = service
...
... [service]
... recipe = recipes:service
... script = /path/to/script
... """)

When the buildout is run the service will be installed

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing service.
chkconfig --add /path/to/script

The service has been installed. If the buildout is run again with no changes, the service shouldn’t be changed.

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Updating service.

Now we change the service part to trigger uninstallation and re-installation.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = service
...
... [service]
... recipe = recipes:service
... script = /path/to/a/different/script
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling service.
Running uninstall recipe.
chkconfig --del /path/to/script
Installing service.
chkconfig --add /path/to/a/different/script

Now we remove the service part, and add another part.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling service.
Running uninstall recipe.
chkconfig --del /path/to/a/different/script
Installing debug.
recipe recipes:debug

Uninstall recipes don’t have to take care of removing all the files and directories created by the part. This is still done automatically, following the execution of the uninstall recipe. An upshot is that an uninstallation recipe can access files and directories created by a recipe before they are deleted.

For example, here’s an uninstallation recipe that simulates backing up a directory before it is deleted. It is designed to work with the mkdir recipe introduced earlier.

>>> write(sample_buildout, 'recipes', 'backup.py',
... """
... import os, sys
... def backup_directory(name, options):
...     path = options['path']
...     size = len(os.listdir(path))
...     sys.stdout.write("backing up directory %s of size %s\\n"
...                      % (path, size))
... """)

It must be registered with the zc.buildout.uninstall entry point. Notice how it is given the name ‘mkdir’ to associate it with the mkdir recipe.

>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
... entry_points = (
... '''
... [zc.buildout]
... mkdir = mkdir:Mkdir
... debug = debug:Debug
... service = service:Service
...
... [zc.buildout.uninstall]
... uninstall_service = service:uninstall_service
... mkdir = backup:backup_directory
... ''')
... setup(name="recipes", entry_points=entry_points)
... """)

Now we can use it with a mkdir part.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = dir debug
...
... [dir]
... recipe = recipes:mkdir
... path = my_directory
...
... [debug]
... recipe = recipes:debug
... """)

Run the buildout to install the part.

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing dir.
dir: Creating directory my_directory
Installing debug.
recipe recipes:debug

Now we remove the part from the configuration file.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... """)

When the buildout is run the part is removed, and the uninstall recipe is run before the directory is deleted.

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling dir.
Running uninstall recipe.
backing up directory /sample-buildout/my_directory of size 0
Updating debug.
recipe recipes:debug

Now we will return the registration to normal for the benefit of the rest of the examples.

>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
... entry_points = (
... '''
... [zc.buildout]
... mkdir = mkdir:Mkdir
... debug = debug:Debug
... ''')
... setup(name="recipes", entry_points=entry_points)
... """)

Command-line usage

A number of arguments can be given on the buildout command line. The command usage is:

buildout [options and assignments] [command [command arguments]]

The following options are supported:

-h (or --help)

Print basic usage information. If this option is used, then all other options are ignored.

-c filename

The -c option can be used to specify a configuration file, rather than buildout.cfg in the current directory.

-t socket_timeout

Specify the socket timeout in seconds.

-v

Increment the verbosity by 10. The verbosity is used to adjust the logging level. The verbosity is subtracted from the numeric value of the log-level option specified in the configuration file.

-q

Decrement the verbosity by 10.

-U

Don’t read user-default configuration.

-o

Run in off-line mode. This is equivalent to the assignment buildout:offline=true.

-O

Run in non-off-line mode. This is equivalent to the assignment buildout:offline=false. This is the default buildout mode. The -O option would normally be used to override a true offline setting in a configuration file.

-n

Run in newest mode. This is equivalent to the assignment buildout:newest=true. With this setting, which is the default, buildout will try to find the newest versions of distributions available that satisfy its requirements.

-N

Run in non-newest mode. This is equivalent to the assignment buildout:newest=false. With this setting, buildout will not seek new distributions if installed distributions satisfy its requirements.

Assignments are of the form:

section_name:option_name=value

Or:

option_name=value

which is equivalent to:

buildout:option_name=value

Options and assignments can be given in any order.
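
For instance, a hypothetical command line (the deploy.cfg file and the debug part are made up for illustration) combining options with an assignment might look like:

buildout -N -c deploy.cfg debug:op1=foo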

Here’s an example:

>>> write(sample_buildout, 'other.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
... installed = .other.cfg
... log-level = WARNING
...
... [debug]
... name = other
... recipe = recipes:debug
... """)

Note that we used the installed buildout option to specify an alternate file to store information about installed parts.

>>> print_(system(buildout+' -c other.cfg debug:op1=foo -v'), end='')
Develop: '/sample-buildout/recipes'
Installing debug.
name other
op1 foo
recipe recipes:debug

Here we used the -c option to specify an alternate configuration file, and the -v option to increase the level of logging from the default, WARNING.

Options can also be combined in the usual Unix way, as in:

>>> print_(system(buildout+' -vcother.cfg debug:op1=foo'), end='')
Develop: '/sample-buildout/recipes'
Updating debug.
name other
op1 foo
recipe recipes:debug

Here we combined the -v and -c options with the configuration file name. Note that the -c option has to be last, because it takes an argument.

>>> os.remove(os.path.join(sample_buildout, 'other.cfg'))
>>> os.remove(os.path.join(sample_buildout, '.other.cfg'))

The most commonly used command is ‘install’, and it takes a list of parts to install. If any parts are specified, only those parts are installed. To illustrate this, we’ll update our configuration and run the buildout in the usual way:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug d1 d2 d3
...
... [d1]
... recipe = recipes:mkdir
... path = d1
...
... [d2]
... recipe = recipes:mkdir
... path = d2
...
... [d3]
... recipe = recipes:mkdir
... path = d3
...
... [debug]
... recipe = recipes:debug
... """)
>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling debug.
Installing debug.
recipe recipes:debug
Installing d1.
d1: Creating directory d1
Installing d2.
d2: Creating directory d2
Installing d3.
d3: Creating directory d3
>>> ls(sample_buildout)
-  .installed.cfg
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  d1
d  d2
d  d3
d  demo
d  develop-eggs
d  eggs
d  parts
d  recipes
>>> cat(sample_buildout, '.installed.cfg')
... # doctest: +NORMALIZE_WHITESPACE
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = debug d1 d2 d3
<BLANKLINE>
[debug]
__buildout_installed__ =
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
recipe = recipes:debug
<BLANKLINE>
[d1]
__buildout_installed__ = /sample-buildout/d1
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/d1
recipe = recipes:mkdir
<BLANKLINE>
[d2]
__buildout_installed__ = /sample-buildout/d2
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/d2
recipe = recipes:mkdir
<BLANKLINE>
[d3]
__buildout_installed__ = /sample-buildout/d3
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/d3
recipe = recipes:mkdir

Now we’ll update our configuration file:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug d2 d3 d4
...
... [d2]
... recipe = recipes:mkdir
... path = data2
...
... [d3]
... recipe = recipes:mkdir
... path = data3
...
... [d4]
... recipe = recipes:mkdir
... path = ${d2:path}-extra
...
... [debug]
... recipe = recipes:debug
... x = 1
... """)

and run the buildout specifying just d3 and d4:

>>> print_(system(buildout+' install d3 d4'), end='')
Develop: '/sample-buildout/recipes'
Uninstalling d3.
Installing d3.
d3: Creating directory data3
Installing d4.
d4: Creating directory data2-extra
>>> ls(sample_buildout)
-  .installed.cfg
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  d1
d  d2
d  data2-extra
d  data3
d  demo
d  develop-eggs
d  eggs
d  parts
d  recipes

Only the d3 and d4 recipes ran. d3 was removed and data3 and data2-extra were created.

The .installed.cfg is only updated for the recipes that ran:

>>> cat(sample_buildout, '.installed.cfg')
... # doctest: +NORMALIZE_WHITESPACE
[buildout]
installed_develop_eggs = /sample-buildout/develop-eggs/recipes.egg-link
parts = debug d1 d2 d3 d4
<BLANKLINE>
[debug]
__buildout_installed__ =
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
recipe = recipes:debug
<BLANKLINE>
[d1]
__buildout_installed__ = /sample-buildout/d1
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/d1
recipe = recipes:mkdir
<BLANKLINE>
[d2]
__buildout_installed__ = /sample-buildout/d2
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/d2
recipe = recipes:mkdir
<BLANKLINE>
[d3]
__buildout_installed__ = /sample-buildout/data3
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/data3
recipe = recipes:mkdir
<BLANKLINE>
[d4]
__buildout_installed__ = /sample-buildout/data2-extra
__buildout_signature__ = recipes-PiIFiO8ny5yNZ1S3JfT0xg==
path = /sample-buildout/data2-extra
recipe = recipes:mkdir

Note that the installed data for debug, d1, and d2 haven’t changed, because we didn’t install those parts, and that the d1 and d2 directories are still there.

Now, if we run the buildout without the install command:

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Uninstalling d2.
Uninstalling d1.
Uninstalling debug.
Installing debug.
recipe recipes:debug
x 1
Installing d2.
d2: Creating directory data2
Updating d3.
Updating d4.

We see the output of the debug recipe and that data2 was created. We also see that d1 and d2 have gone away:

>>> ls(sample_buildout)
-  .installed.cfg
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  data2
d  data2-extra
d  data3
d  demo
d  develop-eggs
d  eggs
d  parts
d  recipes

Alternate directory and file locations

The buildout normally puts the bin, eggs, and parts directories in the directory containing the configuration file. You can provide alternate locations, and even names, for these directories.

>>> alt = tmpdir('sample-alt')
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts =
... develop-eggs-directory = %(developbasket)s
... eggs-directory = %(basket)s
... bin-directory = %(scripts)s
... parts-directory = %(work)s
... """ % dict(
...    developbasket = os.path.join(alt, 'developbasket'),
...    basket = os.path.join(alt, 'basket'),
...    scripts = os.path.join(alt, 'scripts'),
...    work = os.path.join(alt, 'work'),
... ))
>>> print_(system(buildout), end='')
Creating directory '/sample-alt/scripts'.
Creating directory '/sample-alt/work'.
Creating directory '/sample-alt/basket'.
Creating directory '/sample-alt/developbasket'.
Develop: '/sample-buildout/recipes'
Uninstalling d4.
Uninstalling d3.
Uninstalling d2.
Uninstalling debug.
>>> ls(alt)
d  basket
d  developbasket
d  scripts
d  work
>>> ls(alt, 'developbasket')
-  recipes.egg-link

You can also specify an alternate buildout directory:

>>> rmdir(alt)
>>> alt = tmpdir('sample-alt')
>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... directory = %(alt)s
... develop = %(recipes)s
... parts =
... """ % dict(
...    alt=alt,
...    recipes=os.path.join(sample_buildout, 'recipes'),
...    ))
>>> print_(system(buildout), end='')
Creating directory '/sample-alt/bin'.
Creating directory '/sample-alt/parts'.
Creating directory '/sample-alt/eggs'.
Creating directory '/sample-alt/develop-eggs'.
Develop: '/sample-buildout/recipes'
>>> ls(alt)
-  .installed.cfg
d  bin
d  develop-eggs
d  eggs
d  parts
>>> ls(alt, 'develop-eggs')
-  recipes.egg-link

Logging control

Three buildout options are used to control logging:

log-level

specifies the log level

verbosity

adjusts the log level

log-format

allows an alternate logging format to be specified

We’ve already seen the log level and verbosity. Let’s look at an example of changing the format:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts =
... log-level = 25
... verbosity = 5
... log-format = %(levelname)s %(message)s
... """)

Here, we’ve changed the format to include the log-level name, rather than the logger name.

We’ve also illustrated, with a contrived example, that the log level can be a numeric value and that the verbosity can be specified in the configuration file. Because the verbosity is subtracted from the log level, we get a final log level of 20, which is the INFO level.

>>> print_(system(buildout), end='')
INFO Develop: '/sample-buildout/recipes'

Predefined buildout options

Buildouts have a number of predefined options that recipes can use and that users can override in their configuration files. To see these, we’ll run a minimal buildout configuration with a debug logging level. One of the features of debug logging is that the configuration database is shown.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts =
... """)
>>> print_(system(buildout+' -vv'), end='') # doctest: +NORMALIZE_WHITESPACE
Installing 'zc.buildout', 'setuptools'.
We have a develop egg: zc.buildout 1.0.0.
We have the best distribution that satisfies 'setuptools'.
Picked: setuptools = 0.7
<BLANKLINE>
Configuration data:
[buildout]
allow-hosts = *
allow-picked-versions = true
bin-directory = /sample-buildout/bin
develop-eggs-directory = /sample-buildout/develop-eggs
directory = /sample-buildout
eggs-directory = /sample-buildout/eggs
executable = python
find-links =
install-from-cache = false
installed = /sample-buildout/.installed.cfg
log-format =
log-level = INFO
newest = true
offline = false
parts =
parts-directory = /sample-buildout/parts
prefer-final = true
python = buildout
show-picked-versions = false
socket-timeout =
update-versions-file =
use-dependency-links = true
verbosity = 20
versions = versions
[versions]
zc.buildout = >=1.99
zc.recipe.egg = >=1.99
<BLANKLINE>

All of these options can be overridden by configuration files or by command-line assignments. We’ve discussed most of these options already, but let’s review them and touch on some we haven’t discussed:

allow-hosts

In some environments, the links visited by zc.buildout can be forbidden by paranoid firewalls. These URLs might be in the chain of links visited by zc.buildout as defined by buildout’s find-links option, or as defined by various eggs in their url, download_url, or dependency_links metadata.

The fact that package_index works like a spider and might visit links and go to other locations makes this even harder.

The allow-hosts option provides a way to prevent this, and works exactly like the one provided in easy_install.

You can provide a list of allowed hosts, together with wildcards:

[buildout]
...

allow-hosts =
    *.python.org
    example.com

URLs that do not match these hosts will not be visited.

allow-picked-versions

By default, the buildout will choose the best match for a given requirement if the requirement is not specified precisely (for instance, via the “versions” option). This behavior corresponds to the “allow-picked-versions” option being set to its default value, “true”. If “allow-picked-versions” is “false”, then instead of picking the best match, buildout will raise an error. This helps enforce repeatability.
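
For example, to make a buildout fail rather than pick versions on its own, one might add (a sketch):

[buildout]
...
allow-picked-versions = false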

bin-directory

The directory path where scripts are written. This can be a relative path, which is interpreted relative to the directory option.

develop-eggs-directory

The directory path where development egg links are created for software being created in the local project. This can be a relative path, which is interpreted relative to the directory option.

directory

The buildout directory. This is the base for other buildout file and directory locations, when relative locations are used.

eggs-directory

The directory path where downloaded eggs are put. It is common to share this directory across buildouts. Eggs in this directory should never be modified. This can be a relative path, which is interpreted relative to the directory option.
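
For example, a sketch of a buildout that shares eggs across projects (the ../shared-eggs path and the scripts name are made up); relative paths are resolved against the directory option:

[buildout]
...
eggs-directory = ../shared-eggs
bin-directory = scripts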

find-links

You can specify more locations to search for distributions using the find-links option. All locations specified will be searched for distributions along with the package index as described before.

Locations can be URLs:

[buildout]
...
find-links = http://download.zope.org/distribution/

They can also be directories on disk:

[buildout]
...
find-links = /some/path

Finally, they can also be direct paths to distributions:

[buildout]
...
find-links = /some/path/someegg-1.0.0-py2.3.egg

Any number of locations can be specified in the find-links option:

[buildout]
...
find-links =
    http://download.zope.org/distribution/
    /some/otherpath
    /some/path/someegg-1.0.0-py2.3.egg
install-from-cache

A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a buildout with a download cache and tell the buildout to install from the download cache only, without making network accesses. The buildout install-from-cache option can be used to signal that packages should be installed only from the download cache.
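
A sketch of such a configuration (the cache path is made up) might look like:

[buildout]
...
download-cache = /path/to/download-cache
install-from-cache = true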

installed

The file path where information about the results of the previous buildout run is written. This can be a relative path, which is interpreted relative to the directory option. This file provides an inventory of installed parts with the information needed to decide which parts, if any, need to be uninstalled.

log-format

The format used for logging messages.

log-level

The log level before verbosity adjustment.

newest

By default, buildout and recipes will try to find the newest versions of distributions needed to satisfy requirements. This can be very time consuming, especially when incrementally working on setting up a buildout or working on a recipe. The buildout “newest” option can be used to suppress this. If the “newest” option is set to false, then new distributions won’t be sought if an installed distribution meets requirements. The “newest” option can also be set to false using the -N command-line option. See also the “offline” option.

offline

The “offline” option goes a bit further than the “newest” option. If the buildout “offline” option is given a value of “true”, the buildout and recipes that are aware of the option will avoid network access. This is handy when running the buildout while not connected to the internet. It also makes buildouts run much faster. This option is typically set using the buildout -o option.
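
For example, a configuration sketch that avoids upgrade checks and network access entirely (equivalent to passing -N and -o on the command line):

[buildout]
...
newest = false
offline = true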

parts

A whitespace-separated list of parts to be installed.

parts-directory

A working directory that parts can use to store data.
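
For instance, here is a minimal sketch (the class and the way the location is derived are illustrative, not a fixed API) of a recipe claiming a working directory under parts-directory:

import os

class SketchRecipe(object):

    def __init__(self, buildout, name, options):
        # Reserve a working directory for this part under parts-directory.
        options['location'] = os.path.join(
            buildout['buildout']['parts-directory'], name)
        self.options = options

    def install(self):
        os.makedirs(self.options['location'])
        # Returning the created path lets buildout remove it when the
        # part is uninstalled.
        return [self.options['location']]

    def update(self):
        pass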

prefer-final

Currently, when searching for new releases, the newest available release is used. This isn’t usually ideal, as you may get a development release or alpha release not ready to be widely used. You can request that final releases be preferred using the prefer-final option in the buildout section:

[buildout]
...
prefer-final = true

When the prefer-final option is set to true, then when searching for new releases, final releases are preferred: if there are final releases that satisfy distribution requirements, those releases are used even if newer non-final releases are available.

In buildout version 2, final releases are preferred by default. You will then need to set prefer-final to false to get the newest releases, final or not.

use-dependency-links

By default buildout will obey the setuptools dependency_links metadata when it looks for dependencies. This behavior can be controlled with the use-dependency-links buildout option:

[buildout]
...
use-dependency-links = false

The option defaults to true. If you set it to false, then dependency links are only looked for in the locations specified by find-links.

verbosity

A log-level adjustment. Typically, this is set via the -q and -v command-line options.

Creating new buildouts and bootstrapping

If zc.buildout is installed, you can use it to create a new buildout with its own local copies of zc.buildout and setuptools and with local buildout scripts.

>>> sample_bootstrapped = tmpdir('sample-bootstrapped')
>>> print_(system(buildout
...              +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg')
...              +' init'), end='')
Creating '/sample-bootstrapped/setup.cfg'.
Creating directory '/sample-bootstrapped/bin'.
Creating directory '/sample-bootstrapped/parts'.
Creating directory '/sample-bootstrapped/eggs'.
Creating directory '/sample-bootstrapped/develop-eggs'.
Generated script '/sample-bootstrapped/bin/buildout'.

Note that a basic setup.cfg was created for us. This is because we provided an ‘init’ argument. By default, the generated setup.cfg is as minimal as it could be:

>>> cat(sample_bootstrapped, 'setup.cfg')
[buildout]
parts =

We also get other buildout artifacts:

>>> ls(sample_bootstrapped)
d  bin
d  develop-eggs
d  eggs
d  parts
-  setup.cfg
>>> ls(sample_bootstrapped, 'bin')
-  buildout
>>> _ = (ls(sample_bootstrapped, 'eggs'),
...      ls(sample_bootstrapped, 'develop-eggs'))
-  setuptools-0.7-py2.3.egg
-  zc.buildout-1.0-py2.3.egg

(We list both the eggs and develop-eggs directories because the buildout or setuptools egg could be installed in the develop-eggs directory if the original buildout had develop eggs for either buildout or setuptools.)

Note that the buildout script was installed but not run. To run the buildout, we’d have to run the installed buildout script.

If we have an existing buildout that already has a buildout.cfg, we’ll normally use the bootstrap command instead of init. It will complain if there isn’t a configuration file:

>>> sample_bootstrapped2 = tmpdir('sample-bootstrapped2')
>>> print_(system(buildout
...              +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg')
...              +' bootstrap'), end='')
While:
  Initializing.
Error: Couldn't open /sample-bootstrapped2/setup.cfg
>>> write(sample_bootstrapped2, 'setup.cfg',
... """
... [buildout]
... parts =
... """)
>>> print_(system(buildout
...              +' -c'+os.path.join(sample_bootstrapped2, 'setup.cfg')
...              +' bootstrap'), end='')
Creating directory '/sample-bootstrapped2/bin'.
Creating directory '/sample-bootstrapped2/parts'.
Creating directory '/sample-bootstrapped2/eggs'.
Creating directory '/sample-bootstrapped2/develop-eggs'.
Generated script '/sample-bootstrapped2/bin/buildout'.

Similarly, if there is a configuration file and we use the init command, we’ll get an error that the configuration file already exists:

>>> print_(system(buildout
...              +' -c'+os.path.join(sample_bootstrapped, 'setup.cfg')
...              +' init'), end='')
While:
  Initializing.
Error: '/sample-bootstrapped/setup.cfg' already exists.

Initial eggs

When using the init command, you can specify distribution requirements or paths to use:

>>> cd(sample_bootstrapped)
>>> remove('setup.cfg')
>>> print_(system(buildout + ' -csetup.cfg init demo other ./src'), end='')
Creating '/sample-bootstrapped/setup.cfg'.
Getting distribution for 'zc.recipe.egg>=2.0.0a3'.
Got zc.recipe.egg
Installing py.
Getting distribution for 'demo'.
Got demo 0.3.
Getting distribution for 'other'.
Got other 1.0.
Getting distribution for 'demoneeded'.
Got demoneeded 1.1.
Generated script '/sample-bootstrapped/bin/demo'.
Generated script '/sample-bootstrapped/bin/distutilsscript'.
Generated interpreter '/sample-bootstrapped/bin/py'.

This causes a py part to be included that sets up a custom python interpreter with the given requirements or paths:

>>> cat('setup.cfg')
[buildout]
parts = py
<BLANKLINE>
[py]
recipe = zc.recipe.egg
interpreter = py
eggs =
  demo
  other
extra-paths =
  ./src

Passing requirements or paths causes the buildout to be run as part of initialization. In the example above, we got a number of distributions installed and two scripts generated. The first, demo, was defined by the demo project. The second, py, was defined by the generated configuration. It’s a “custom interpreter” that behaves like a standard Python interpreter, except that it includes the specified eggs and extra paths in its Python path.

We specified a source directory that didn’t exist. Buildout created it for us:

>>> ls('.')
-  .installed.cfg
d  bin
d  develop-eggs
d  eggs
d  parts
-  setup.cfg
d  src
>>> uncd()

Finding distributions

By default, buildout searches the Python Package Index when looking for distributions. You can, instead, specify your own index to search using the index option:

[buildout]
...
index = http://index.example.com/

This index, or the default of http://pypi.python.org/simple/ if no index is specified, will always be searched for distributions unless running buildout with options that prevent searching for distributions. The latest version of the distribution that meets the requirements of the buildout will always be used.

You can also specify more locations to search for distributions using the find-links option. See its description above.

Controlling the installation database

The buildout installed option is used to specify the file used to save information on installed parts. This option is initialized to “.installed.cfg”, but it can be overridden in the configuration file or on the command line:

>>> write('buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = debug
...
... [debug]
... recipe = recipes:debug
... """)
>>> print_(system(buildout+' buildout:installed=inst.cfg'), end='')
Develop: '/sample-buildout/recipes'
Installing debug.
recipe recipes:debug
>>> ls(sample_buildout)
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  demo
d  develop-eggs
d  eggs
-  inst.cfg
d  parts
d  recipes

The installation database can be disabled by supplying an empty buildout installed option:

>>> os.remove('inst.cfg')
>>> print_(system(buildout+' buildout:installed='), end='')
Develop: '/sample-buildout/recipes'
Installing debug.
recipe recipes:debug
>>> ls(sample_buildout)
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  demo
d  develop-eggs
d  eggs
d  parts
d  recipes

Note that there will be no installation database if there are no parts:

>>> write('buildout.cfg',
... """
... [buildout]
... parts =
... """)
>>> print_(system(buildout+' buildout:installed=inst.cfg'), end='')
>>> ls(sample_buildout)
-  b1.cfg
-  b2.cfg
-  base.cfg
d  bin
-  buildout.cfg
d  demo
d  develop-eggs
d  eggs
d  parts
d  recipes

Extensions

An extension mechanism allows code to be loaded and run after configuration files have been read but before the buildout has begun any processing. The intent is to allow special plugins, such as urllib2 request handlers, to be loaded.

To load an extension, we use the extensions option and list one or more distribution requirements, on separate lines. The distributions named will be loaded and any zc.buildout.extension entry points found will be called with the buildout as an argument. When buildout finishes processing, any zc.buildout.unloadextension entry points found will be called with the buildout as an argument.

Let’s create a sample extension in our sample buildout created in the previous section:

>>> mkdir(sample_bootstrapped, 'demo')
>>> write(sample_bootstrapped, 'demo', 'demo.py',
... """
... import sys
... def ext(buildout):
...     sys.stdout.write('%s %s\\n' % ('ext', sorted(buildout)))
... def unload(buildout):
...     sys.stdout.write('%s %s\\n' % ('unload', sorted(buildout)))
... """)
>>> write(sample_bootstrapped, 'demo', 'setup.py',
... """
... from setuptools import setup
...
... setup(
...     name = "demo",
...     entry_points = {
...        'zc.buildout.extension': ['ext = demo:ext'],
...        'zc.buildout.unloadextension': ['ext = demo:unload'],
...        },
...     )
... """)

Our extension just prints out ‘ext’ (and ‘unload’, when unloading), and lists the sections found in the buildout passed to it.

We’ll update our buildout.cfg to list the demo directory as a develop egg to be built:

>>> write(sample_bootstrapped, 'buildout.cfg',
... """
... [buildout]
... develop = demo
... parts =
... """)
>>> os.chdir(sample_bootstrapped)
>>> print_(system(os.path.join(sample_bootstrapped, 'bin', 'buildout')),
...        end='')
Develop: '/sample-bootstrapped/demo'

Now we can add the extensions option. We were a bit tricky and ran the buildout once with the demo develop egg defined but without the extensions option. This is because extensions are loaded before the buildout creates develop eggs. We needed to use a separate buildout run to create the develop egg. Normally, when eggs are loaded from the network, we wouldn’t need to do anything special.

>>> write(sample_bootstrapped, 'buildout.cfg',
... """
... [buildout]
... develop = demo
... extensions = demo
... parts =
... """)

We see that our extension is loaded and executed:

>>> print_(system(os.path.join(sample_bootstrapped, 'bin', 'buildout')),
...        end='')
ext ['buildout', 'versions']
Develop: '/sample-bootstrapped/demo'
unload ['buildout', 'versions']

Repeatable buildouts: controlling eggs used

One of the goals of zc.buildout is to provide enough control to make buildouts repeatable. It should be possible to check the buildout configuration files for a project into a version control system and later use the checked in files to get the same buildout, subject to changes in the environment outside the buildout.

An advantage of using Python eggs is that dependencies of eggs used are automatically determined and used. The automatic inclusion of dependent distributions is at odds with the goal of repeatable buildouts.

To support repeatable buildouts, a versions section can be created with an option for each distribution name whose version is to be fixed. The section can then be specified via the buildout versions option.

To see how this works, we’ll create two versions of a recipe egg:

>>> mkdir('recipe')
>>> write('recipe', 'recipe.py',
... '''
... import sys
... print_ = lambda *a: sys.stdout.write(' '.join(map(str, a))+'\\n')
... class Recipe:
...     def __init__(*a): pass
...     def install(self):
...         print_('recipe v1')
...         return ()
...     update = install
... ''')
>>> write('recipe', 'setup.py',
... '''
... from setuptools import setup
... setup(name='spam', version='1', py_modules=['recipe'],
...       entry_points={'zc.buildout': ['default = recipe:Recipe']},
...       )
... ''')
>>> write('recipe', 'README', '')
>>> print_(system(buildout+' setup recipe bdist_egg')) # doctest: +ELLIPSIS
Running setup script 'recipe/setup.py'.
...
>>> rmdir('recipe', 'build')
>>> write('recipe', 'recipe.py',
... '''
... import sys
... print_ = lambda *a: sys.stdout.write(' '.join(map(str, a))+'\\n')
... class Recipe:
...     def __init__(*a): pass
...     def install(self):
...         print_('recipe v2')
...         return ()
...     update = install
... ''')
>>> write('recipe', 'setup.py',
... '''
... from setuptools import setup
... setup(name='spam', version='2', py_modules=['recipe'],
...       entry_points={'zc.buildout': ['default = recipe:Recipe']},
...       )
... ''')
>>> print_(system(buildout+' setup recipe bdist_egg')) # doctest: +ELLIPSIS
Running setup script 'recipe/setup.py'.
...

and we’ll configure a buildout to use it:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))

If we run the buildout, it will use version 2:

>>> print_(system(buildout), end='')
Getting distribution for 'spam'.
Got spam 2.
Installing foo.
recipe v2

We can specify a versions section that lists a version for our recipe:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
...
... [versions]
... spam = 1
... eggs = 2.2
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))

Here we created a versions section listing version 1 for the spam distribution. Because the section uses the default name, versions, buildout picks it up without any extra configuration.

Now, if we run the buildout, we’ll use version 1 of the spam recipe:

>>> print_(system(buildout), end='')
Getting distribution for 'spam==1'.
Got spam 1.
Uninstalling foo.
Installing foo.
recipe v1

Running the buildout in verbose mode will help us get information about versions used. If we run the buildout in verbose mode without specifying a versions section:

>>> print_(system(buildout+' buildout:versions= -v'), end='')
Installing 'zc.buildout', 'setuptools'.
We have a develop egg: zc.buildout 1.0.0.
We have the best distribution that satisfies 'setuptools'.
Picked: setuptools = 0.6
Installing 'spam'.
We have the best distribution that satisfies 'spam'.
Picked: spam = 2.
Uninstalling foo.
Installing foo.
recipe v2

We’ll get output that includes lines telling us what versions buildout chose for us, like:

Picked: spam = 2

This allows us to discover versions that are picked dynamically, so that we can fix them in a versions section.

If we run the buildout with the versions section:

>>> print_(system(buildout+' -v'), end='')
Installing 'zc.buildout', 'setuptools'.
We have a develop egg: zc.buildout 1.0.0.
We have the best distribution that satisfies 'setuptools'.
Picked: setuptools = 0.6
Installing 'spam'.
We have the distribution that satisfies 'spam==1'.
Uninstalling foo.
Installing foo.
recipe v1

We won’t get picked-version output for the spam distribution, because its version is pinned, but we will get output for setuptools, for which we didn’t specify a version.

You can request buildout to generate an error if it picks any versions:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... allow-picked-versions = false
...
... [versions]
... spam = 1
... eggs = 2.2
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
While:
  Installing.
  Checking for upgrades.
  Getting distribution for 'setuptools'.
Error: Picked: setuptools = 0.6.30

We can name a version something else, if we wish, using the versions option:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... versions = release1
...
... [release1]
... spam = 1
... eggs = 2.2
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v1

We can also disable checking versions:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... versions =
...
... [versions]
... spam = 1
... eggs = 2.2
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Uninstalling foo.
Installing foo.
recipe v2

Easier reporting and managing of versions (new in buildout 2.0)

Since buildout 2.0, the functionality of the buildout-versions extension is part of buildout itself. This makes reporting and managing versions easier.

Buildout picks a version for setuptools, and for the tests we need to grab that version number:

>>> import pkg_resources
>>> req = pkg_resources.Requirement.parse('setuptools')
>>> setuptools_version = pkg_resources.working_set.find(req).version

If you set the show-picked-versions option, buildout will print versions it picked at the end of its run:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... show-picked-versions = true
...
... [versions]
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2
Versions had to be automatically picked.
The following part definition lists the versions picked:
[versions]
setuptools = 0.6.99
spam = 2

When everything is pinned, no output is generated:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... show-picked-versions = true
...
... [versions]
... setuptools = %s
... spam = 2
...
... [foo]
... recipe = spam
... ''' % (join('recipe', 'dist'), setuptools_version))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2

The Python package index is case-insensitive: both http://pypi.python.org/simple/Django/ and http://pypi.python.org/simple/dJaNgO/ work. Distributions don’t always name themselves consistently case-wise either, so all names in the versions section are normalized and case differences won’t affect the pinning:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... find-links = %s
... show-picked-versions = true
...
... [versions]
... setuptools = %s
... Spam = 2
...
... [foo]
... recipe = spam
... ''' % (join('recipe', 'dist'), setuptools_version))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2

Sometimes it is handy to have a separate file with versions. This is a regular buildout file with a single [versions] section. You include it by extending from that versions file:

>>> write('my_versions.cfg',
... '''
... [versions]
... setuptools = %s
... spam = 2
... ''' % setuptools_version)
>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... extends = my_versions.cfg
... find-links = %s
... show-picked-versions = true
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2

If not everything is pinned and buildout has to pick versions, you can tell buildout to append the picked versions to your versions file; they are simply appended at the end.

>>> write('my_versions.cfg',
... '''
... [versions]
... setuptools = %s
... ''' % setuptools_version)
>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... extends = my_versions.cfg
... update-versions-file = my_versions.cfg
... find-links = %s
... show-picked-versions = true
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2
Versions had to be automatically picked.
The following part definition lists the versions picked:
[versions]
spam = 2
Picked versions have been written to my_versions.cfg

The versions file now contains the extra pin:

>>> print_(open('my_versions.cfg').read()) # doctest: +ELLIPSIS
<BLANKLINE>
...
<BLANKLINE>
# Added by buildout at YYYY-MM-DD hh:mm:ss.dddddd
spam = 2
<BLANKLINE>

And re-running buildout doesn’t report any picked versions anymore:

>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2

If you’ve enabled update-versions-file but not show-picked-versions, buildout will append the versions to your versions file anyway (without printing them to the console):

>>> write('my_versions.cfg',
... '''
... [versions]
... setuptools = %s
... ''' % setuptools_version)
>>> write('buildout.cfg',
... '''
... [buildout]
... parts = foo
... extends = my_versions.cfg
... update-versions-file = my_versions.cfg
... find-links = %s
... show-picked-versions = false
...
... [foo]
... recipe = spam
... ''' % join('recipe', 'dist'))
>>> print_(system(buildout), end='') # doctest: +ELLIPSIS
Updating foo.
recipe v2
Picked versions have been written to my_versions.cfg

The versions file contains the extra pin:

>>> print_(open('my_versions.cfg').read()) # doctest: +ELLIPSIS
<BLANKLINE>
...
<BLANKLINE>
# Added by buildout at YYYY-MM-DD hh:mm:ss.dddddd
spam = 2
<BLANKLINE>

Because buildout now includes the functionality of the buildout-versions extension (and part of the older buildout.dumppickedversions extension), it raises an error if these extensions are still configured.

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... parts = foo
... extensions = buildout-versions
...
... [foo]
... recipe = spam
... """)
>>> print_(system(buildout), end='') # doctest: +NORMALIZE_WHITESPACE
While:
  Installing.
  Loading extensions.
  Error: Buildout now includes 'buildout-versions'
  (and part of the older 'buildout.dumppickedversions').
  Remove the extension from your configuration and look at the
  'show-picked-versions' option in buildout's documentation.

Using the download utility

The zc.buildout.download module provides a download utility that handles the details of downloading files needed for a buildout run from the internet. It downloads files to the local file system, using the download cache if desired and optionally checking the downloaded files’ MD5 checksum.

We set up an HTTP server that provides a file we want to download:

>>> server_data = tmpdir('sample_files')
>>> write(server_data, 'foo.txt', 'This is a foo text.')
>>> server_url = start_server(server_data)

We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end:

>>> import tempfile
>>> old_tempdir = tempfile.tempdir
>>> tempfile.tempdir = tmpdir('tmp')

Downloading without using the cache

If no download cache should be used, the download utility is instantiated without any arguments:

>>> from zc.buildout.download import Download
>>> download = Download()
>>> print_(download.cache_dir)
None

Downloading a file is achieved by calling the utility with the URL as an argument. A tuple is returned that consists of the path to the downloaded copy of the file and a boolean value indicating whether this is a temporary file meant to be cleaned up during the same buildout run:

>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/.../buildout-...
>>> cat(path)
This is a foo text.

As we aren’t using the download cache and haven’t specified a target path either, the download has ended up in a temporary file:

>>> is_temp
True
>>> import tempfile
>>> path.startswith(tempfile.gettempdir())
True

We are responsible for cleaning up temporary files behind us:

>>> remove(path)

When trying to access a file that doesn’t exist, we’ll get an exception:

>>> try: download(server_url+'not-there') # doctest: +ELLIPSIS
... except: print_('download error')
... else: print_('woops')
download error

Downloading a local file doesn’t produce a temporary file but simply returns the local file itself:

>>> download(join(server_data, 'foo.txt'))
('/sample_files/foo.txt', False)

We can also have the downloaded file’s MD5 sum checked:

>>> try: from hashlib import md5
... except ImportError: from md5 import new as md5
>>> path, is_temp = download(server_url+'foo.txt',
...                          md5('This is a foo text.'.encode()).hexdigest())
>>> is_temp
True
>>> remove(path)
>>> download(server_url+'foo.txt',
...          md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt'

The error message in the event of an MD5 checksum mismatch for a local file reads somewhat differently:

>>> download(join(server_data, 'foo.txt'),
...               md5('This is a foo text.'.encode()).hexdigest())
('/sample_files/foo.txt', False)
>>> download(join(server_data, 'foo.txt'),
...          md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch for local resource at '/sample_files/foo.txt'.

Finally, we can download the file to a specified place in the file system:

>>> target_dir = tmpdir('download-target')
>>> path, is_temp = download(server_url+'foo.txt',
...                          path=join(target_dir, 'downloaded.txt'))
>>> print_(path)
/download-target/downloaded.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False

Trying to download a file in offline mode will result in an error:

>>> download = Download(cache=None, offline=True)
>>> download(server_url+'foo.txt')
Traceback (most recent call last):
UserError: Couldn't download 'http://localhost/foo.txt' in offline mode.

As an exception to this rule, file system paths and URLs in the file scheme will still work:

>>> cat(download(join(server_data, 'foo.txt'))[0])
This is a foo text.
>>> cat(download('file:' + join(server_data, 'foo.txt'))[0])
This is a foo text.
>>> remove(path)

Downloading using the download cache

In order to make use of the download cache, we need to configure the download utility differently. To do this, we pass a directory path as the cache keyword parameter upon instantiation:

>>> cache = tmpdir('download-cache')
>>> download = Download(cache=cache)
>>> print_(download.cache_dir)
/download-cache/

Simple usage

When using the cache, a file will be stored in the cache directory when it is first downloaded. The file system path returned by the download utility points to the cached copy:

>>> ls(cache)
>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/download-cache/foo.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False

Whenever the file is downloaded again, the cached copy is used. Let’s change the file on the server to see this:

>>> write(server_data, 'foo.txt', 'The wrong text.')
>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/download-cache/foo.txt
>>> cat(path)
This is a foo text.

If we specify an MD5 checksum for a file that is already in the cache, the cached copy’s checksum will be verified:

>>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch for cached download
               from 'http://localhost/foo.txt' at '/download-cache/foo.txt'

Trying to access another file at a different URL which has the same base name will result in the cached copy being used:

>>> mkdir(server_data, 'other')
>>> write(server_data, 'other', 'foo.txt', 'The wrong text.')
>>> path, is_temp = download(server_url+'other/foo.txt')
>>> print_(path)
/download-cache/foo.txt
>>> cat(path)
This is a foo text.

Given a target path for the download, the utility will provide a copy of the file at that location both when first downloading the file and when using a cached copy:

>>> remove(cache, 'foo.txt')
>>> ls(cache)
>>> write(server_data, 'foo.txt', 'This is a foo text.')
>>> path, is_temp = download(server_url+'foo.txt',
...                          path=join(target_dir, 'downloaded.txt'))
>>> print_(path)
/download-target/downloaded.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False
>>> ls(cache)
- foo.txt
>>> remove(path)
>>> write(server_data, 'foo.txt', 'The wrong text.')
>>> path, is_temp = download(server_url+'foo.txt',
...                          path=join(target_dir, 'downloaded.txt'))
>>> print_(path)
/download-target/downloaded.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False

In offline mode, downloads from any URL will be successful if the file is found in the cache:

>>> download = Download(cache=cache, offline=True)
>>> cat(download(server_url+'foo.txt')[0])
This is a foo text.

Local resources will be cached just like any others since download caches are sometimes used to create source distributions:

>>> remove(cache, 'foo.txt')
>>> ls(cache)
>>> write(server_data, 'foo.txt', 'This is a foo text.')
>>> download = Download(cache=cache)
>>> cat(download('file:' + join(server_data, 'foo.txt'), path=path)[0])
This is a foo text.
>>> ls(cache)
- foo.txt
>>> remove(cache, 'foo.txt')
>>> cat(download(join(server_data, 'foo.txt'), path=path)[0])
This is a foo text.
>>> ls(cache)
- foo.txt
>>> remove(cache, 'foo.txt')

However, resources with checksum mismatches will not be copied to the cache:

>>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt'
>>> ls(cache)
>>> remove(path)

If the file is completely missing it should notify the user of the error:

>>> download(server_url+'bar.txt') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
Traceback (most recent call last):
...
UserError: Error downloading extends for URL http://localhost/bar.txt:
...404...
>>> ls(cache)

Finally, let’s see what happens if the download cache to be used doesn’t exist as a directory in the file system yet:

>>> Download(cache=join(cache, 'non-existent'))(server_url+'foo.txt')
Traceback (most recent call last):
UserError: The directory:
'/download-cache/non-existent'
to be used as a download cache doesn't exist.

Using namespace sub-directories of the download cache

It is common to store cached copies of downloaded files within sub-directories of the download cache to keep some degree of order. For example, zc.buildout stores downloaded distributions in a sub-directory named “dist”. Those sub-directories are also known as namespaces. So far, we haven’t specified any namespaces to use, so the download utility stored files directly inside the download cache. Let’s use a namespace “test” instead:

>>> download = Download(cache=cache, namespace='test')
>>> print_(download.cache_dir)
/download-cache/test

The namespace sub-directory hasn’t been created yet:

>>> ls(cache)

Downloading a file now creates the namespace sub-directory and places a copy of the file inside it:

>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/download-cache/test/foo.txt
>>> ls(cache)
d test
>>> ls(cache, 'test')
- foo.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False

The next time we want to download that file, the copy from inside the cache namespace is used. To see this clearly, we put a file with the same name but different content both on the server and in the cache’s root directory:

>>> write(server_data, 'foo.txt', 'The wrong text.')
>>> write(cache, 'foo.txt', 'The wrong text.')
>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/download-cache/test/foo.txt
>>> cat(path)
This is a foo text.
>>> rmdir(cache, 'test')
>>> remove(cache, 'foo.txt')
>>> write(server_data, 'foo.txt', 'This is a foo text.')

Using a hash of the URL as the filename in the cache

So far, the base name of the downloaded file read from the URL has been used for the name of the cached copy of the file. This may not be desirable in some cases, for example when downloading files from different locations that have the same base name due to some naming convention, or if the file content depends on URL parameters. In such cases, an MD5 hash of the complete URL may be used as the filename in the cache:

>>> download = Download(cache=cache, hash_name=True)
>>> path, is_temp = download(server_url+'foo.txt')
>>> print_(path)
/download-cache/09f5793fcdc1716727f72d49519c688d
>>> cat(path)
This is a foo text.
>>> ls(cache)
- 09f5793fcdc1716727f72d49519c688d

The path was printed just to illustrate matters; we cannot know the real hash in advance since we don’t know which port the server happens to listen at when the test is run, so we don’t actually know the full URL of the file. Let’s check that the hash actually belongs to the particular URL used:

>>> (path.lower() ==
...  join(cache, md5((server_url+'foo.txt').encode()).hexdigest()).lower())
True

The cached copy is used when downloading the file again:

>>> write(server_data, 'foo.txt', 'The wrong text.')
>>> (path, is_temp) == download(server_url+'foo.txt')
True
>>> cat(path)
This is a foo text.
>>> ls(cache)
- 09f5793fcdc1716727f72d49519c688d

If we change the URL, even in such a way that it keeps the base name of the file the same, the file will be downloaded again this time and put in the cache under a different name:

>>> path2, is_temp = download(server_url+'other/foo.txt')
>>> print_(path2)
/download-cache/537b6d73267f8f4447586989af8c470e
>>> path == path2
False
>>> (path2.lower() ==
...  join(cache, md5((server_url+'other/foo.txt').encode()).hexdigest()
...       ).lower())
True
>>> cat(path)
This is a foo text.
>>> cat(path2)
The wrong text.
>>> ls(cache)
- 09f5793fcdc1716727f72d49519c688d
- 537b6d73267f8f4447586989af8c470e
>>> remove(path)
>>> remove(path2)
>>> write(server_data, 'foo.txt', 'This is a foo text.')

Using the cache purely as a fall-back

Sometimes it is desirable to try downloading a file from the net if at all possible, and use the cache purely as a fall-back option when a server is down or if we are in offline mode. This mode is only in effect if a download cache is configured in the first place:

>>> download = Download(cache=cache, fallback=True)
>>> print_(download.cache_dir)
/download-cache/

A downloaded file will be cached:

>>> ls(cache)
>>> path, is_temp = download(server_url+'foo.txt')
>>> ls(cache)
- foo.txt
>>> cat(cache, 'foo.txt')
This is a foo text.
>>> is_temp
False

If the file cannot be served, the cached copy will be used:

>>> remove(server_data, 'foo.txt')
>>> try: Download()(server_url+'foo.txt') # doctest: +ELLIPSIS
... except: print_('download error')
... else: print_('woops')
download error
>>> path, is_temp = download(server_url+'foo.txt')
>>> cat(path)
This is a foo text.
>>> is_temp
False

Similarly, if the file is served but we’re in offline mode, we’ll fall back to using the cache:

>>> write(server_data, 'foo.txt', 'The wrong text.')
>>> get(server_url+'foo.txt')
'The wrong text.'
>>> offline_download = Download(cache=cache, offline=True, fallback=True)
>>> path, is_temp = offline_download(server_url+'foo.txt')
>>> print_(path)
/download-cache/foo.txt
>>> cat(path)
This is a foo text.
>>> is_temp
False

However, when downloading the file normally with the cache being used in fall-back mode, the file will be downloaded from the net and the cached copy will be replaced with the new content:

>>> cat(download(server_url+'foo.txt')[0])
The wrong text.
>>> cat(cache, 'foo.txt')
The wrong text.

When trying to download a resource whose checksum does not match, the cached copy will neither be used nor overwritten:

>>> write(server_data, 'foo.txt', 'This is a foo text.')
>>> download(server_url+'foo.txt', md5('The wrong text.'.encode()).hexdigest())
Traceback (most recent call last):
ChecksumError: MD5 checksum mismatch downloading 'http://localhost/foo.txt'
>>> cat(cache, 'foo.txt')
The wrong text.

Configuring the download utility from buildout options

The configuration options explained so far derive from the build logic implemented by the calling code. Other options configure the download utility for use in a particular project or buildout run; they are read from the buildout configuration section, which can be passed directly as the first argument to the download utility’s constructor.

The location of the download cache is specified by the download-cache option:

>>> download = Download({'download-cache': cache}, namespace='cmmi')
>>> print_(download.cache_dir)
/download-cache/cmmi

If the download-cache option specifies a relative path, it is understood relative to the current working directory, or to the buildout directory if that is given:

>>> download = Download({'download-cache': 'relative-cache'})
>>> print_(download.cache_dir)
/sample-buildout/relative-cache/
>>> download = Download({'directory': join(sample_buildout, 'root'),
...                      'download-cache': 'relative-cache'})
>>> print_(download.cache_dir)
/sample-buildout/root/relative-cache/

Keyword parameters take precedence over the corresponding options:

>>> download = Download({'download-cache': cache}, cache=None)
>>> print_(download.cache_dir)
None

Whether to assume offline mode can be inferred from either the offline or the install-from-cache option. As usual with zc.buildout, these options must assume one of the values ‘true’ and ‘false’:

>>> download = Download({'offline': 'true'})
>>> download.offline
True
>>> download = Download({'offline': 'false'})
>>> download.offline
False
>>> download = Download({'install-from-cache': 'true'})
>>> download.offline
True
>>> download = Download({'install-from-cache': 'false'})
>>> download.offline
False

These two options are combined using logical ‘or’:

>>> download = Download({'offline': 'true', 'install-from-cache': 'false'})
>>> download.offline
True
>>> download = Download({'offline': 'false', 'install-from-cache': 'true'})
>>> download.offline
True

The offline keyword parameter takes precedence over both the offline and install-from-cache options:

>>> download = Download({'offline': 'true'}, offline=False)
>>> download.offline
False
>>> download = Download({'install-from-cache': 'false'}, offline=True)
>>> download.offline
True
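
Putting this together, here is a minimal sketch of how a recipe might use the download utility, configured from the buildout section so that download-cache, offline, and install-from-cache are honoured (the recipe class, its url and md5sum options, and the namespace name are hypothetical):

import os
from zc.buildout.download import Download

class FetchRecipe(object):

    def __init__(self, buildout, name, options):
        self.buildout, self.name, self.options = buildout, name, options

    def install(self):
        # Configure the utility from the buildout section; the namespace
        # keeps cached files in a sub-directory of the download cache.
        download = Download(self.buildout['buildout'], namespace='fetch')
        # The second argument is an optional MD5 checksum to verify.
        path, is_temp = download(self.options['url'],
                                 self.options.get('md5sum'))
        try:
            pass  # process the downloaded file here
        finally:
            # We are responsible for cleaning up temporary downloads.
            if is_temp:
                os.remove(path)
        return ()

    update = install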

Regressions

MD5 checksum calculation needs to be reliable on all supported systems, which requires text files to be treated as binary to avoid implicit line-ending conversions:

>>> text = 'First line of text.\r\nSecond line.\r\n'
>>> f = open(join(server_data, 'foo.txt'), 'wb')
>>> _ = f.write(text.encode())
>>> f.close()
>>> path, is_temp = Download()(server_url+'foo.txt',
...                            md5(text.encode()).hexdigest())
>>> remove(path)

When “downloading” a directory given by file-system path or file: URL and using a download cache at the same time, the cached directory wasn’t handled correctly. Consequently, the cache was defeated and an attempt to cache the directory a second time broke. This is how it should work:

>>> download = Download(cache=cache)
>>> dirpath = join(server_data, 'some_directory')
>>> mkdir(dirpath)
>>> dest, _ = download(dirpath)

If we now modify the source tree, the second download will produce the original one from the cache:

>>> mkdir(join(dirpath, 'foo'))
>>> ls(dirpath)
d foo
>>> dest, _ = download(dirpath)
>>> ls(dest)

Clean up

We should have cleaned up all temporary files created by downloading things:

>>> ls(tempfile.tempdir)

Reset the global temporary directory:

>>> tempfile.tempdir = old_tempdir

Using a download cache

Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. This can be useful to reduce network access and to create source distributions of an entire buildout.

The buildout download-cache option can be used to specify a directory to be used as a download cache.

In this example, we’ll create a directory to hold the cache:

>>> cache = tmpdir('cache')

And set up a buildout that downloads some eggs:

>>> write('buildout.cfg',
... '''
... [buildout]
... parts = eggs
... download-cache = %(cache)s
... find-links = %(link_server)s
...
... [eggs]
... recipe = zc.recipe.egg
... eggs = demo ==0.2
... ''' % globals())

We specified a link server that has some distributions available for download:

>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4c1-py2.4.egg">demo-0.4c1-py2.4.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2c1.zip">demoneeded-1.2c1.zip</a><br>
<a href="du_zipped-1.0-pyN.N.egg">du_zipped-1.0-pyN.N.egg</a><br>
<a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
<a href="index/">index/</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
</body></html>

We’ll enable logging on the link server so we can see what’s going on:

>>> _ = get(link_server+'enable_server_logging')
GET 200 /enable_server_logging

We also specified a download cache.

If we run the buildout, we’ll see the eggs installed from the link server as usual:

>>> print_(system(buildout), end='')
GET 200 /
GET 200 /demo-0.2-py2.4.egg
GET 200 /demoneeded-1.1.zip
Installing eggs.
Getting distribution for 'demo==0.2'.
Got demo 0.2.
Getting distribution for 'demoneeded'.
Got demoneeded 1.1.
Generated script '/sample-buildout/bin/demo'.

We’ll also get the download cache populated. The buildout doesn’t put files in the cache directly. It creates an intermediate directory, dist:

>>> ls(cache)
d  dist
>>> ls(cache, 'dist')
-  demo-0.2-py2.4.egg
-  demoneeded-1.1.zip

If we remove the installed eggs from the eggs directory and re-run the buildout:

>>> import os
>>> for f in os.listdir('eggs'):
...     if f.startswith('demo'):
...         remove('eggs', f)
>>> print_(system(buildout), end='')
GET 200 /
Updating eggs.
Getting distribution for 'demo==0.2'.
Got demo 0.2.
Getting distribution for 'demoneeded'.
Got demoneeded 1.1.

We see that the distributions aren’t downloaded from the link server this time; they are taken from the download cache instead.

Installing solely from a download cache

A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a buildout with download cache and tell the buildout to install from the download cache only, without making network accesses. The buildout install-from-cache option can be used to signal that packages should be installed only from the download cache.

Let’s remove our installed eggs and run the buildout with the install-from-cache option set to true:

>>> for f in os.listdir('eggs'):
...     if f.startswith('demo'):
...         remove('eggs', f)
>>> write('buildout.cfg',
... '''
... [buildout]
... parts = eggs
... download-cache = %(cache)s
... install-from-cache = true
... find-links = %(link_server)s
...
... [eggs]
... recipe = zc.recipe.egg
... eggs = demo
... ''' % globals())
>>> print_(system(buildout), end='')
Uninstalling eggs.
Installing eggs.
Getting distribution for 'demo'.
Got demo 0.2.
Getting distribution for 'demoneeded'.
Got demoneeded 1.1.
Generated script '/sample-buildout/bin/demo'.

Caching extended configuration

As mentioned in the general buildout documentation, configuration files can extend each other, including the ability to download configuration being extended from a URL. If desired, zc.buildout caches downloaded configuration in order to be able to use it when run offline.

As we’re going to talk about downloading things, let’s start an HTTP server. Also, all of the following will take place inside the sample buildout.

>>> server_data = tmpdir('server_data')
>>> server_url = start_server(server_data)
>>> cd(sample_buildout)

We also use a fresh directory for temporary files in order to make sure that all temporary files have been cleaned up in the end:

>>> import tempfile
>>> old_tempdir = tempfile.tempdir
>>> tempfile.tempdir = tmpdir('tmp')

Basic use of the extends cache

We put some base configuration on a server and reference it from a sample buildout:

>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... foo = bar
... """)
>>> write('buildout.cfg', """\
... [buildout]
... extends = %sbase.cfg
... """ % server_url)

When trying to run this buildout offline, we’ll find that we cannot read all of the required configuration:

>>> print_(system(buildout + ' -o'))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.

Trying the same online, we can:

>>> print_(system(buildout))
Unused options for buildout: 'foo'.

As long as we haven’t said anything about caching downloaded configuration, nothing gets cached. Offline mode will still cause the buildout to fail:

>>> print_(system(buildout + ' -o'))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.

Let’s now specify a cache for base configuration files. This cache is different from the download cache used by recipes for caching distributions and other files; one might, however, use a namespace subdirectory of the download cache for it. The configuration cache we specify will be created when running buildout and the base.cfg file will be put in it (with the file name being a hash of the complete URL):

>>> mkdir('cache')
>>> write('buildout.cfg', """\
... [buildout]
... extends = %sbase.cfg
... extends-cache = cache
... """ % server_url)
>>> print_(system(buildout))
Unused options for buildout: 'foo'.
>>> cache = join(sample_buildout, 'cache')
>>> ls(cache)
-  5aedc98d7e769290a29d654a591a3a45
>>> import os
>>> cat(cache, os.listdir(cache)[0])
[buildout]
parts =
foo = bar

We can now run buildout offline as it will read base.cfg from the cache:

>>> print_(system(buildout + ' -o'))
Unused options for buildout: 'foo'.

The cache is being used purely as a fall-back in case we are offline or don’t have access to a configuration file to be downloaded. As long as we are online, buildout attempts to download a fresh copy of each file even if a cached copy of the file exists. To see this, we put different configuration in the same place on the server and run buildout in offline mode so it takes base.cfg from the cache:

>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... bar = baz
... """)
>>> print_(system(buildout + ' -o'))
Unused options for buildout: 'foo'.

In online mode, buildout will download and use the modified version:

>>> print_(system(buildout))
Unused options for buildout: 'bar'.

Trying offline mode again, the new version will be used as it has been put in the cache now:

>>> print_(system(buildout + ' -o'))
Unused options for buildout: 'bar'.

Clean up:

>>> rmdir(cache)

Specifying extends cache and offline mode

Normally, the values of buildout options such as the location of a download cache or whether to use offline mode are determined by first reading the user’s default configuration, updating it with the project’s configuration and finally applying command-line options. User and project configuration are assembled by reading a file such as ~/.buildout/default.cfg, buildout.cfg or a URL given on the command line, recursively (depth-first) downloading any base configuration specified by the buildout:extends option read from each of those config files, and finally evaluating each config file to provide default values for options not yet read.

This works fine for all options that do not influence how configuration is downloaded in the first place. The extends-cache and offline options, however, are treated differently from the procedure described in order to make it simple and obvious to see where a particular configuration file came from under any particular circumstances.

  • Offline and extends-cache settings are read from the two root config files exclusively. Otherwise one could construct configuration files that, when read, imply that they should have been read from a different source than they have. Also, specifying the extends cache within a file that might have to be taken from the cache before being read wouldn’t make a lot of sense.

  • Offline and extends-cache settings given by the user’s defaults apply to the process of assembling the project’s configuration. If no extends cache has been specified by the user’s default configuration, the project’s root config file must be available, be it from disk or from the net.

  • Offline mode turned on by the -o command line option is honored from the beginning even though command line options are applied to the configuration last. If offline mode is not requested by the command line, it may be switched on by either the user’s or the project’s config root.
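
To make these rules concrete, here is an illustrative sketch (paths and values are placeholders) of the only places where the two options are honored. In the user’s ~/.buildout/default.cfg:

[buildout]
extends-cache = /home/me/.buildout/extends-cache

Or in the project’s buildout.cfg:

[buildout]
extends-cache = cache
offline = true

Offline mode can also be requested on the command line with the -o option, as the examples below do.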

Extends cache

Let’s see the above rules in action. We create a new home directory for our user and write user and project configuration that recursively extends online bases, using different caches:

>>> mkdir('home')
>>> mkdir('home', '.buildout')
>>> mkdir('cache')
>>> mkdir('user-cache')
>>> os.environ['HOME'] = join(sample_buildout, 'home')
>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... extends-cache = user-cache
... """)
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... """ % server_url)
>>> write(server_data, 'base_default.cfg', """\
... [buildout]
... foo = bar
... offline = false
... """)
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... extends-cache = cache
... """)
>>> write('fancy.cfg', """\
... [buildout]
... extends = %sbase.cfg
... """ % server_url)
>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... offline = false
... """)

Buildout will now assemble its configuration from all of these 6 files, defaults first. The online resources end up in the respective extends caches:

>>> print_(system(buildout))
Unused options for buildout: 'foo'.
>>> ls('user-cache')
-  10e772cf422123ef6c64ae770f555740
>>> cat('user-cache', os.listdir('user-cache')[0])
[buildout]
foo = bar
offline = false
>>> ls('cache')
-  c72213127e6eb2208a3e1fc1dba771a7
>>> cat('cache', os.listdir('cache')[0])
[buildout]
parts =
offline = false

If, on the other hand, the extends caches are specified in files that get extended themselves, they won’t be used for assembling the configuration they belong to (user’s or project’s, resp.). The extends cache specified by the user’s defaults does, however, apply to downloading project configuration. Let’s rewrite the config files, clean out the caches and re-run buildout:

>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... """)
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... extends-cache = user-cache
... """ % server_url)
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... """)
>>> write('fancy.cfg', """\
... [buildout]
... extends = %sbase.cfg
... extends-cache = cache
... """ % server_url)
>>> remove('user-cache', os.listdir('user-cache')[0])
>>> remove('cache', os.listdir('cache')[0])
>>> print_(system(buildout))
Unused options for buildout: 'foo'.
>>> ls('user-cache')
-  0548bad6002359532de37385bb532e26
>>> cat('user-cache', os.listdir('user-cache')[0])
[buildout]
parts =
offline = false
>>> ls('cache')

Clean up:

>>> rmdir('user-cache')
>>> rmdir('cache')

Offline mode and installation from cache

If we run buildout in offline mode now, it will fail because it cannot get at the remote configuration file needed by the user’s defaults:

>>> print_(system(buildout + ' -o'))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode.

Let’s now successively turn on offline mode by different parts of the configuration and see when buildout applies this setting in each case:

>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... offline = true
... """)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode.
>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... """)
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... offline = true
... """ % server_url)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... """ % server_url)
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... offline = true
... """)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... """)
>>> write('fancy.cfg', """\
... [buildout]
... extends = %sbase.cfg
... offline = true
... """ % server_url)
>>> print_(system(buildout))
Unused options for buildout: 'foo'.

The install-from-cache option is treated accordingly:

>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... install-from-cache = true
... """)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base_default.cfg' in offline mode.
>>> write('home', '.buildout', 'default.cfg', """\
... [buildout]
... extends = fancy_default.cfg
... """)
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... install-from-cache = true
... """ % server_url)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.
>>> write('home', '.buildout', 'fancy_default.cfg', """\
... [buildout]
... extends = %sbase_default.cfg
... """ % server_url)
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... install-from-cache = true
... """)
>>> print_(system(buildout))
While:
  Initializing.
Error: Couldn't download 'http://localhost/base.cfg' in offline mode.
>>> write('buildout.cfg', """\
... [buildout]
... extends = fancy.cfg
... """)
>>> write('fancy.cfg', """\
... [buildout]
... extends = %sbase.cfg
... install-from-cache = true
... """ % server_url)
>>> print_(system(buildout))
While:
  Installing.
  Checking for upgrades.
An internal error occurred ...
ValueError: install_from_cache set to true with no download cache
>>> rmdir('home', '.buildout')

Newest and non-newest behavior for extends cache

While offline mode forbids network access completely, ‘newest’ mode determines whether to look for updated versions of a resource even if some version of it is already present locally. If we run buildout in newest mode (newest = true), the configuration files are updated with each run:

>>> mkdir("cache")
>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... """)
>>> write('buildout.cfg', """\
... [buildout]
... extends-cache = cache
... extends = %sbase.cfg
... """ % server_url)
>>> print_(system(buildout))
>>> ls('cache')
-  5aedc98d7e769290a29d654a591a3a45
>>> cat('cache', os.listdir(cache)[0])
[buildout]
parts =

A change to base.cfg is picked up on the next buildout run:

>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... foo = bar
... """)
>>> print_(system(buildout + " -n"))
Unused options for buildout: 'foo'.
>>> cat('cache', os.listdir(cache)[0])
[buildout]
parts =
foo = bar

In contrast, when not using newest mode (newest = false), the files already present in the extends cache will not be updated:

>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... """)
>>> print_(system(buildout + " -N"))
Unused options for buildout: 'foo'.
>>> cat('cache', os.listdir(cache)[0])
[buildout]
parts =
foo = bar

Even when updating base configuration files with a buildout run, any given configuration file will be downloaded only once during that particular run. If some base configuration file is extended more than once, its cached copy is used:

>>> write(server_data, 'baseA.cfg', """\
... [buildout]
... extends = %sbase.cfg
... foo = bar
... """ % server_url)
>>> write(server_data, 'baseB.cfg', """\
... [buildout]
... extends-cache = cache
... extends = %sbase.cfg
... bar = foo
... """ % server_url)
>>> write('buildout.cfg', """\
... [buildout]
... extends-cache = cache
... newest = true
... extends = %sbaseA.cfg %sbaseB.cfg
... """ % (server_url, server_url))
>>> print_(system(buildout + " -n"))
Unused options for buildout: 'bar' 'foo'.

(XXX We patch the download utility’s API to produce readable output for the test; a better solution would re-use the logging already done by the utility.)

>>> import zc.buildout
>>> old_download = zc.buildout.download.Download.download
>>> def wrapper_download(self, url, md5sum=None, path=None):
...   print_("The URL %s was downloaded." % url)
...   return old_download(self, url, md5sum, path)
>>> zc.buildout.download.Download.download = wrapper_download
>>> zc.buildout.buildout.main([])
The URL http://localhost/baseA.cfg was downloaded.
The URL http://localhost/base.cfg was downloaded.
The URL http://localhost/baseB.cfg was downloaded.
Not upgrading because not running a local buildout command.
Unused options for buildout: 'bar' 'foo'.
>>> zc.buildout.download.Download.download = old_download

The deprecated extended-by option

The buildout section used to recognize an option named extended-by that was deprecated at some point and removed in the 1.5 line. Since ignoring this option silently was considered harmful as a matter of principle, a UserError is raised if that option is encountered now:

>>> write(server_data, 'base.cfg', """\
... [buildout]
... parts =
... extended-by = foo.cfg
... """)
>>> print_(system(buildout))
While:
  Initializing.
Error: No-longer supported "extended-by" option found in http://localhost/base.cfg.

Clean up

We should have cleaned up all temporary files created by downloading things:

>>> ls(tempfile.tempdir)

Reset the global temporary directory:

>>> tempfile.tempdir = old_tempdir

Using zc.buildout to run setup scripts

zc.buildout has a convenience command for running setup scripts. Why? There are two reasons. First, if a setup script doesn’t import setuptools, you can’t use any setuptools-provided commands, like bdist_egg. When buildout runs a setup script, it arranges to import setuptools before running the script so that setuptools-provided commands are available.

Second, if you use a squeaky-clean Python to do your development, a setup script that imports setuptools will fail because setuptools isn’t in the path. Because buildout requires setuptools and knows where it has installed a setuptools egg, it adds that egg to the Python path before running the script.

To run a setup script, use the buildout setup command, passing the path of a setup script (or of a directory containing one) and the arguments to the script. Let’s look at an example:

>>> mkdir('test')
>>> cd('test')
>>> write('setup.py',
... '''
... from distutils.core import setup
... setup(name='sample')
... ''')

We’ve created a super simple (stupid) setup script. Note that it doesn’t import setuptools. Let’s try running it to create an egg. We’ll use the buildout script from our sample buildout:

>>> print_(system(buildout+' setup'), end='')
... # doctest: +NORMALIZE_WHITESPACE
Error: The setup command requires the path to a setup script or
directory containing a setup script, and its arguments.

Oops, we forgot to give the name of the setup script:

>>> print_(system(buildout+' setup setup.py bdist_egg'))
... # doctest: +ELLIPSIS
Running setup script 'setup.py'.
...
>>> ls('dist')
-  sample-0.0.0-py2.5.egg

Note that we can specify a directory name. This is often shorter and preferred by the lazy :)

>>> print_(system(buildout+' setup . bdist_egg')) # doctest: +ELLIPSIS
Running setup script './setup.py'.
...

Automatic Buildout Updates

When a buildout is run, one of the first steps performed is to check for updates to either zc.buildout or setuptools. To demonstrate this, we’ve created some “new releases” of buildout and setuptools in a new_releases folder:

>>> ls(new_releases)
d  setuptools
-  setuptools-99.99-py2.4.egg
d  zc.buildout
-  zc.buildout-99.99-py2.4.egg

Let’s update the sample buildout.cfg to look in this area:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... find-links = %(new_releases)s
... index = %(new_releases)s
... parts = show-versions
... develop = showversions
...
... [show-versions]
... recipe = showversions
... """ % dict(new_releases=new_releases))

We’ll also include a recipe that echoes the versions of setuptools and zc.buildout used:

>>> mkdir(sample_buildout, 'showversions')
>>> write(sample_buildout, 'showversions', 'showversions.py',
... """
... import pkg_resources
... import sys
... print_ = lambda *a: sys.stdout.write(' '.join(map(str, a))+'\\n')
...
... class Recipe:
...
...     def __init__(self, buildout, name, options):
...         pass
...
...     def install(self):
...         for project in 'zc.buildout', 'setuptools':
...             req = pkg_resources.Requirement.parse(project)
...             print_(project, pkg_resources.working_set.find(req).version)
...         return ()
...     update = install
... """)
>>> write(sample_buildout, 'showversions', 'setup.py',
... """
... from setuptools import setup
...
... setup(
...     name = "showversions",
...     entry_points = {'zc.buildout': ['default = showversions:Recipe']},
...     )
... """)

Now if we run the buildout, the buildout will upgrade itself to the new versions found in the new_releases directory:

>>> print_(system(buildout), end='')
Getting distribution for 'zc.buildout>=1.99'.
Got zc.buildout 99.99.
Getting distribution for 'setuptools'.
Got setuptools 99.99.
Upgraded:
  zc.buildout version 99.99,
  setuptools version 99.99;
restarting.
Generated script '/sample-buildout/bin/buildout'.
Develop: '/sample-buildout/showversions'
Installing show-versions.
zc.buildout 99.99
setuptools 99.99

Our buildout script has been updated to use the new eggs:

>>> cat(sample_buildout, 'bin', 'buildout')
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
sys.path[0:0] = [
  '/sample-buildout/eggs/zc.buildout-99.99-py2.4.egg',
  '/sample-buildout/eggs/setuptools-99.99-py2.4.egg',
  ]
<BLANKLINE>
import zc.buildout.buildout
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(zc.buildout.buildout.main())

Now, let’s recreate the sample buildout. If we specify constraints on the versions of zc.buildout and setuptools to use, running the buildout will install earlier versions of these packages:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... find-links = %(new_releases)s
... index = %(new_releases)s
... parts = show-versions
... develop = showversions
...
... [versions]
... zc.buildout = < 99
... setuptools = < 99
...
... [show-versions]
... recipe = showversions
... """ % dict(new_releases=new_releases))

Now we can see that we actually “upgrade” to an earlier version.

>>> print_(system(buildout), end='')
Upgraded:
  zc.buildout version 1.4.4;
  setuptools version 0.6;
restarting.
Generated script '/sample-buildout/bin/buildout'.
Develop: '/sample-buildout/showversions'
Updating show-versions.
zc.buildout 1.0.0
setuptools 0.6

There are a number of cases, described below, in which the updates don’t happen.

We won’t upgrade in offline mode:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... find-links = %(new_releases)s
... index = %(new_releases)s
... parts = show-versions
... develop = showversions
...
... [show-versions]
... recipe = showversions
... """ % dict(new_releases=new_releases))
>>> print_(system(buildout+' -o'), end='')
Develop: '/sample-buildout/showversions'
Updating show-versions.
zc.buildout 1.0.0
setuptools 0.6

Or in non-newest mode:

>>> print_(system(buildout+' -N'), end='')
Develop: '/sample-buildout/showversions'
Updating show-versions.
zc.buildout 1.0.0
setuptools 0.6

We also won’t upgrade if the buildout script being run isn’t in the buildout’s bin directory. To see this, we’ll create a new buildout directory:

>>> sample_buildout2 = tmpdir('sample_buildout2')
>>> write(sample_buildout2, 'buildout.cfg',
... """
... [buildout]
... find-links = %(new_releases)s
... index = %(new_releases)s
... parts =
... """ % dict(new_releases=new_releases))
>>> cd(sample_buildout2)
>>> print_(system(buildout), end='')
Creating directory '/sample_buildout2/bin'.
Creating directory '/sample_buildout2/parts'.
Creating directory '/sample_buildout2/eggs'.
Creating directory '/sample_buildout2/develop-eggs'.
Getting distribution for 'zc.buildout>=1.99'.
Got zc.buildout 99.99.
Getting distribution for 'setuptools'.
Got setuptools 99.99.
Not upgrading because not running a local buildout command.
>>> ls('bin')

When buildout restarts and the restarted buildout exits with an error code, the original buildout that called the second buildout also exits with that error code. Otherwise build scripts can erroneously detect a successful buildout run even if it failed.
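
For example (an illustrative sketch, not part of the tests in this document), a deployment wrapper can simply rely on that exit code:

# Illustrative wrapper: stop if bin/buildout (including any restarted
# buildout) exited with a non-zero status.
import subprocess
import sys

returncode = subprocess.call(['bin/buildout'])
if returncode != 0:
    sys.exit('buildout failed with exit code %s; aborting' % returncode)
print('buildout succeeded; continuing')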

Make a recipe that fails:

>>> mkdir(sample_buildout, 'failrecipe')
>>> write(sample_buildout, 'failrecipe', 'failrecipe.py',
... """
... import pkg_resources
... import sys
... print_ = lambda *a: sys.stdout.write(' '.join(map(str, a))+'\\n')
...
... class Recipe:
...
...     def __init__(self, buildout, name, options):
...         sys.exit('recipe sys-exits')
...
...     def install(self):
...         pass
...
...     update = install
... """)
>>> write(sample_buildout, 'failrecipe', 'setup.py',
... """
... from setuptools import setup
...
... setup(
...     name = "failrecipe",
...     entry_points = {'zc.buildout': ['default = failrecipe:Recipe']},
...     )
... """)

Let’s downgrade again, triggering a restart, and use the failing recipe that gives us a sys.exit:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... find-links = %(new_releases)s
... index = %(new_releases)s
... parts = fail
... develop = failrecipe
...
... [versions]
... zc.buildout = < 99
... setuptools = < 99
...
... [fail]
... recipe = failrecipe
... """ % dict(new_releases=new_releases))

Run the buildout:

>>> print_(system(buildout, with_exit_code=True), end='')
Upgraded:
  zc.buildout version 1.4.4;
  setuptools version 0.6;
restarting.
Generated script '/sample-buildout/bin/buildout'.
Develop: '/sample-buildout/failrecipe'
recipe sys-exits
EXIT CODE: 1

Debugging buildouts

Buildouts can be pretty complex. When things go wrong, it isn’t always obvious why. Errors can occur due to problems in user input or due to bugs in zc.buildout or recipes. When an error occurs, Python’s post-mortem debugger can be used to inspect the state of the buildout or recipe code where the error occurred. To enable this, use the -D option to buildout. Let’s create a recipe that has a bug:

>>> mkdir(sample_buildout, 'recipes')
>>> write(sample_buildout, 'recipes', 'mkdir.py',
... """
... import os, zc.buildout
...
... class Mkdir:
...
...     def __init__(self, buildout, name, options):
...         self.name, self.options = name, options
...         options['path'] = os.path.join(
...                               buildout['buildout']['directory'],
...                               options['path'],
...                               )
...
...     def install(self):
...         directory = self.options['directory']
...         os.mkdir(directory)
...         return directory
...
...     def update(self):
...         pass
... """)
>>> write(sample_buildout, 'recipes', 'setup.py',
... """
... from setuptools import setup
...
... setup(name = "recipes",
...       entry_points = {'zc.buildout': ['mkdir = mkdir:Mkdir']},
...       )
... """)

And create a buildout that uses it:

>>> write(sample_buildout, 'buildout.cfg',
... """
... [buildout]
... develop = recipes
... parts = data-dir
...
... [data-dir]
... recipe = recipes:mkdir
... path = mystuff
... """)

If we run the buildout, we’ll get an error:

>>> print_(system(buildout), end='')
Develop: '/sample-buildout/recipes'
Installing data-dir.
While:
  Installing data-dir.
Error: Missing option: data-dir:directory

If we want to debug the error, we can add the -D option. Here we’ll supply some input:

>>> print_(system(buildout+" -D", """\
... up
... p sorted(self.options.keys())
... q
... """), end='')
Develop: '/sample-buildout/recipes'
Installing data-dir.
> /zc/buildout/buildout.py(925)__getitem__()
-> raise MissingOption("Missing option: %s:%s" % (self.name, key))
(Pdb) > /sample-buildout/recipes/mkdir.py(14)install()
-> directory = self.options['directory']
(Pdb) ['path', 'recipe']
(Pdb) While:
  Installing data-dir.
Traceback (most recent call last):
  File "/zc/buildout/buildout.py", line 1352, in main
    getattr(buildout, command)(args)
  File "/zc/buildout/buildout.py", line 383, in install
    installed_files = self[part]._call(recipe.install)
  File "/zc/buildout/buildout.py", line 961, in _call
    return f()
  File "/sample-buildout/recipes/mkdir.py", line 14, in install
    directory = self.options['directory']
  File "/zc/buildout/buildout.py", line 925, in __getitem__
    raise MissingOption("Missing option: %s:%s" % (self.name, key))
MissingOption: Missing option: data-dir:directory
<BLANKLINE>
Starting pdb:

Meta-recipe support

Buildout recipes provide reusable Python modules for common configuration tasks. The most widely used recipes tend to provide low-level functions, like installing eggs or software distributions, creating configuration files, and so on. The normal recipe framework is fairly well suited to building these general components.

Full-blown applications may require many, often tens, of parts. Defining the many parts that make up an application can be tedious and often entails a lot of repetition. Buildout provides a number of mechanisms to avoid repetition, including merging of configuration files and macros, but these, while useful to an extent, don’t scale very well. Buildout isn’t and shouldn’t be a programming language.

Meta-recipes allow us to bring Python to bear to provide higher-level abstractions for buildouts.

A meta-recipe is a regular Python recipe that primarily operates by creating parts. A meta recipe isn’t merely a high-level recipe. It’s a recipe that defers most or all of its work to lower-level recipes by manipulating the buildout database.

A presentation at PyCon 2011 described early work with meta recipes.

A simple meta-recipe example

Let’s look at a fairly simple meta-recipe example. First, consider a buildout configuration that builds a database deployment:

[buildout]
parts = ctl pack

[deployment]
recipe = zc.recipe.deployment
name = ample
user = zope

[ctl]
recipe = zc.recipe.rhrc
deployment = deployment
chkconfig = 345 99 10
parts = main

[main]
recipe = zc.zodbrecipes:server
deployment = deployment
address = 8100
path = /var/databases/ample/main.fs
zeo.conf =
   <zeo>
      address ${:address}
   </zeo>
   %import zc.zlibstorage
   <zlibstorage>
     <filestorage>
        path ${:path}
     </filestorage>
   </zlibstorage>

[pack]
recipe = zc.recipe.deployment:crontab
deployment = deployment
times = 1 2 * * 6
command = ${buildout:bin-directory}/zeopack -d3 -t00 ${main:address}

This buildout doesn’t build software. Rather it builds configuration for deploying a database configuration using already-deployed software. For the purpose of this document, however, the details are totally unimportant.

Rather than crafting the configuration above every time, we can write a meta-recipe that crafts it for us. We’ll use our meta-recipe as follows:

[buildout]
parts = ample

[ample]
recipe = com.example.ample:db
path = /var/databases/ample/main.fs

The idea here is that the meta recipe allows us to specify the minimal information necessary. A meta-recipe often automates policies and assumptions that are application and organization dependent. The example above assumes, for example, that we want to pack to 3 days in the past on Saturdays.

So now, let’s see the meta recipe that automates this:

class Recipe:

    def __init__(self, buildout, name, options):

        buildout.parse('''
            [deployment]
            recipe = zc.recipe.deployment
            name = %s
            user = zope
            ''' % name)

        buildout['main'] = dict(
            recipe = 'zc.zodbrecipes:server',
            deployment = 'deployment',
            address = 8100,
            path = options['path'],
            **{
              'zeo.conf': '''
                <zeo>
                  address ${:address}
                </zeo>

                %import zc.zlibstorage

                <zlibstorage>
                  <filestorage>
                    path ${:path}
                  </filestorage>
                </zlibstorage>
                '''}
            )

        buildout.parse('''
            [pack]
            recipe = zc.recipe.deployment:crontab
            deployment = deployment
            times = 1 2 * * 6
            command =
              ${buildout:bin-directory}/zeopack -d3 -t00 ${main:address}

            [ctl]
            recipe = zc.recipe.rhrc
            deployment = deployment
            chkconfig = 345 99 10
            parts = main
            ''')

    def install(self):
        pass

    update = install

The meta recipe just adds parts to the buildout. It does this by setting items and calling the parse method. The parse method takes a string in buildout configuration syntax; it’s useful when we want to add static, or nearly static, part data. The item-setting syntax is useful when we have non-trivial computation for part data.
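
Schematically, and condensed from the example above (no additional API is involved), the two styles look like this inside a meta-recipe’s __init__:

def __init__(self, buildout, name, options):
    # Computed part data: assign an options dictionary to a new section name.
    buildout['main'] = dict(
        recipe='zc.zodbrecipes:server',
        deployment='deployment',
        address=8100,
        path=options['path'],
        )
    # Nearly static part data: hand buildout configuration text to parse().
    buildout.parse('''
        [pack]
        recipe = zc.recipe.deployment:crontab
        deployment = deployment
        times = 1 2 * * 6
        command = ${buildout:bin-directory}/zeopack -d3 -t00 ${main:address}
        ''')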

The order that we add parts is important. When adding a part, any string substitutions and other dependencies are evaluated, so the referenced parts must be defined first. This is why, for example, the pack part is added after the main part.

Note that the meta recipe supplied an integer for one of the options. In addition to strings, it’s legal to supply integer values.

There are a few things to note about this example:

  • The install and update methods are empty.

    While not required, this is a very common pattern for meta recipes. Most meta recipes simply invoke other recipes.

  • Setting a buildout item or calling parse adds any sections with recipes as parts.

  • An exception will be raised if a section already exists.

Testing

Now, let’s test our meta recipe. We’ll test it without actually running buildout. Rather, we’ll use a specialized buildout provided by the zc.buildout.testing module.

>>> import zc.buildout.testing
>>> buildout = zc.buildout.testing.Buildout()

The testing buildout is intended to be passed to recipes being tested:

>>> _ = Recipe(buildout, 'ample', dict(path='/var/databases/ample/main.fs'))

After running the recipe, we should see the buildout database populated by the recipe:

>>> buildout.print_options()
[ctl]
chkconfig = 345 99 10
deployment = deployment
parts = main
recipe = zc.recipe.rhrc
[deployment]
name = ample
recipe = zc.recipe.deployment
user = zope
[main]
address = 8100
deployment = deployment
path = /var/databases/ample/main.fs
recipe = zc.zodbrecipes:server
zeo.conf =
<BLANKLINE>
                  <zeo>
                    address 8100
                  </zeo>
<BLANKLINE>
                  %import zc.zlibstorage
<BLANKLINE>
                  <zlibstorage>
                    <filestorage>
                      path /var/databases/ample/main.fs
                    </filestorage>
                  </zlibstorage>
<BLANKLINE>
[pack]
command = /sample-buildout/bin/zeopack -d3 -t00 8100
deployment = deployment
recipe = zc.recipe.deployment:crontab
times = 1 2 * * 6

Testing Support

The zc.buildout.testing module provides an API that can be used when writing recipe tests. This API is documented below. Many examples of using this API can be found in the zc.buildout, zc.recipe.egg, and zc.recipe.testrunner tests.

zc.buildout.testing.buildoutSetUp(test)

The buildoutSetup function can be used as a doctest setup function. It creates a sample buildout that can be used by tests, changing the current working directory to the sample_buildout. It also adds a number of names to the test namespace:

sample_buildout

This is the name of a buildout with a basic configuration.

buildout

This is the path of the buildout script in the sample buildout.

ls(*path)

List the contents of a directory. The directory path is provided as one or more strings, to be joined with os.path.join.

cat(*path)

Display the contents of a file. The file path is provided as one or more strings, to be joined with os.path.join.

On Windows, if the file doesn’t exist, the function will try adding a ‘-script.py’ suffix. This helps to work around a difference in script generation on Windows.

mkdir(*path)

Create a directory. The directory path is provided as one or more strings, to be joined with os.path.join.

rmdir(*path)

Remove a directory. The directory path is provided as one or more strings, to be joined with os.path.join.

remove(*path)

Remove a directory or file. The path is provided as one or more strings, to be joined with os.path.join.

tmpdir(name)

Create a temporary directory with the given name. The directory will be automatically removed at the end of the test. The path of the created directory is returned.

Further, if the normalize_path normalizing substitution (see below) is used, then any paths starting with this path will be normalized to:

/name/restofpath

No two temporary directories can be created with the same name. A directory created with tmpdir can be removed with rmdir and recreated.

Note that the sample_buildout directory is created by calling this function.

write(*path_and_contents)

Create a file. The file path is provided as one or more strings, to be joined with os.path.join. The last argument is the file contents.

system(command, input='')

Execute a system command with the given input passed to the command’s standard input. The output (error and regular output) from the command is returned.

get(url)

Get a web page.

cd(*path)

Change to the given directory. The directory path is provided as one or more strings, to be joined with os.path.join.

The directory will be reset at the end of the test.

uncd()

Change to the directory that was current prior to the previous call to cd. You can call cd multiple times and then uncd the same number of times to return to the same location.

join(*path)

A convenient reference to os.path.join.

register_teardown(func)

Register a tear-down function. The function will be called with no arguments at the end of the test.

start_server(path)

Start a web server on the given path. The server will be shut down at the end of the test. The server URL is returned.

You can cause the server to start and stop logging its output using:

>>> get(server_url+'enable_server_logging')

and:

>>> get(server_url+'disable_server_logging')

This can be useful to see how buildout is interacting with a server.

sdist(setup, dest)

Create a source distribution by running the given setup file and placing the result in the given destination directory. If the setup argument is a directory, the setup.py file in that directory is used.

bdist_egg(setup, dest)

Create an egg by running the given setup file and placing the result in the given destination directory. If the setup argument is a directory, then the setup.py file in that directory is used.

zc.buildout.testing.buildoutTearDown(test)

Tear down everything set up by zc.buildout.testing.buildoutSetUp. Any functions passed to register_teardown are called as well.

install(project, destination)

Install eggs for a given project into a destination. If the destination is a test object, then the eggs directory of the sample buildout (sample_buildout) defined by the test will be used. Tests will use this to install the distributions for the packages being tested (and their dependencies) into a sample buildout. The egg to be used should already be loaded, by importing one of the modules provided, before calling this function.

install_develop(project, destination)

Like install, but a develop egg is installed even if the current egg is not a develop egg.

Output normalization

Recipe tests often generate output that is dependent on temporary file locations, operating system conventions or Python versions. To deal with these dependencies, we often use zope.testing.renormalizing.RENormalizing to normalize test output. zope.testing.renormalizing.RENormalizing takes pairs of regular expressions and substitutions. The zc.buildout.testing module provides a few helpful variables that define regular-expression/substitution pairs that you can pass to zope.testing.renormalizing.RENormalizing.

normalize_path

Converts test paths, based on directories created with tmpdir(), to simple paths.

normalize_script

On Unix-like systems, scripts are implemented in single files without suffixes. On Windows, scripts are implemented with 2 files, a -script.py file and a .exe file. This normalization converts directory listings of Windows scripts to the form generated on Unix-like systems.

normalize_egg_py

Normalize Python version and platform indicators, if specified, in egg names.
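
To tie these helpers together, here is a minimal sketch (not taken from zc.buildout itself) of how a recipe package might wire them into a doctest suite; the project name my.recipe and the file name README.txt are placeholders:

import doctest

import zc.buildout.testing
from zope.testing import renormalizing


def setUp(test):
    # Create the sample buildout and add the helper names described above
    # to the test namespace.
    zc.buildout.testing.buildoutSetUp(test)
    # Make the distribution under test available as a develop egg.
    zc.buildout.testing.install_develop('my.recipe', test)


def test_suite():
    return doctest.DocFileSuite(
        'README.txt',  # the doctest file exercising the recipe
        setUp=setUp,
        tearDown=zc.buildout.testing.buildoutTearDown,
        checker=renormalizing.RENormalizing([
            zc.buildout.testing.normalize_path,
            zc.buildout.testing.normalize_script,
            zc.buildout.testing.normalize_egg_py,
            ]),
        )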

Python API for egg and script installation

The easy_install module provides functions for egg and script installation. It provides functionality at the Python level that is similar to easy_install, with a few exceptions:

  • By default, we look for new packages and the packages that they depend on. This is somewhat like (and uses) the --upgrade option of easy_install, except that we also upgrade required packages.

  • If the highest-revision package satisfying a specification is already present, then we don’t try to get another one. This saves a lot of search time in the common case that packages are pegged to specific versions.

  • If there is a develop egg that satisfies a requirement, we don’t look for additional distributions. We always give preference to develop eggs.

  • Distutils options for building extensions can be passed.

Distribution installation

The easy_install module provides a function, install, for installing one or more packages and their dependencies. The install function takes 2 positional arguments:

  • An iterable of setuptools requirement strings for the distributions to be installed, and

  • A destination directory to install to and to satisfy requirements from. The destination directory can be None, in which case, no new distributions are downloaded and there will be an error if the needed distributions can’t be found among those already installed.

It supports a number of optional keyword arguments:

links

A sequence of URLs, file names, or directories to look for links to distributions.

index

The URL of an index server, or almost any other valid URL. :)

If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples here, we’ll just point to an empty directory on our link server. This will make our examples run a little bit faster.

path

A list of additional directories to search for locally-installed distributions.

working_set

An existing working set to be augmented with additional distributions, if necessary to satisfy requirements. This allows you to call install multiple times, if necessary, to gather multiple sets of requirements.

newest

A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true, the default, and when the destination directory is not None, then the install function will search for the newest distributions that satisfy the requirements.

versions

A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements.

use_dependency_links

A flag indicating whether to search for dependencies using the setup dependency_links metadata or not. If true, links are searched for using dependency_links in preference to other locations. Defaults to true.

relative_paths

Adjust egg paths so they are relative to the script path. This allows scripts to work when scripts and eggs are moved, as long as they are both moved in the same way.

The install method returns a working set containing the distributions needed to meet the given requirements.
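
Pulling several of these keyword arguments together, a call might look like the following sketch (dest and link_server are the names used in the examples below; the pin on demoneeded is purely illustrative):

import zc.buildout.easy_install

ws = zc.buildout.easy_install.install(
    ['demo'],                        # requirement strings
    dest,                            # destination directory
    links=[link_server],             # where to look for distribution links
    index=link_server + 'index/',    # package index to consult
    versions={'demoneeded': '1.1'},  # pin versions independently of requirements
    newest=False,                    # keep already-installed matches
    )

# A later call can extend the same working set:
ws = zc.buildout.easy_install.install(
    ['other'], dest, links=[link_server], index=link_server + 'index/',
    working_set=ws)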

We have a link server that has a number of eggs:

>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4c1-py2.4.egg">demo-0.4c1-py2.4.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2c1.zip">demoneeded-1.2c1.zip</a><br>
<a href="du_zipped-1.0-pyN.N.egg">du_zipped-1.0-pyN.N.egg</a><br>
<a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
<a href="index/">index/</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
</body></html>

Let’s make a directory and install the demo egg to it, using the demo:

>>> dest = tmpdir('sample-install')
>>> import zc.buildout.easy_install
>>> ws = zc.buildout.easy_install.install(
...     ['demo==0.2'], dest,
...     links=[link_server], index=link_server+'index/')

We requested version 0.2 of the demo distribution to be installed into the destination directory. We specified that we should search for links on the link server and that we should use the (empty) link server index directory as a package index.

The working set contains the distributions we retrieved.

>>> for dist in ws:
...     print_(dist)
demo 0.2
demoneeded 1.1

We got demoneeded because it was a dependency of demo.

And the actual eggs were added to the eggs directory.

>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demoneeded-1.1-py2.4.egg

If we remove the version restriction on demo, but specify a false value for newest, no new distributions will be installed:

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     newest=False)
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demoneeded-1.1-py2.4.egg

If we leave off the newest option, we’ll get an update for demo:

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/')
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demoneeded-1.1-py2.4.egg

Note that we didn’t get the newest versions available. There were release candidates for newer versions of both packages. By default, final releases are preferred. We can change this behavior using the prefer_final function:

>>> zc.buildout.easy_install.prefer_final(False)
True

The old setting is returned.

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/')
>>> for dist in ws:
...     print_(dist)
demo 0.4c1
demoneeded 1.2c1
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demo-0.4c1-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  demoneeded-1.2c1-py2.4.egg

Let’s put the setting back to the default.

>>> zc.buildout.easy_install.prefer_final(True)
False

We can supply additional distributions. We can also supply specifications for distributions that would normally be found via dependencies. We might do this to specify a specific version.

>>> ws = zc.buildout.easy_install.install(
...     ['demo', 'other', 'demoneeded==1.0'], dest,
...     links=[link_server], index=link_server+'index/')
>>> for dist in ws:
...     print_(dist)
demo 0.3
other 1.0
demoneeded 1.0
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demo-0.4c1-py2.4.egg
d  demoneeded-1.0-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  demoneeded-1.2c1-py2.4.egg
d  other-1.0-py2.4.egg
>>> rmdir(dest)

Specifying version information independent of requirements

Sometimes it’s useful to specify version information independent of normal requirements specifications. For example, a buildout may need to lock down a set of versions, without having to put version numbers in setup files or part definitions. If a dictionary mapping project names to version numbers is passed to the install function, then those version numbers will be used.

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     versions = dict(demo='0.2', demoneeded='1.0'))
>>> [d.version for d in ws]
['0.2', '1.0']

In this example, we specified a version for demoneeded, even though we didn’t define a requirement for it. The versions specified apply to dependencies as well as the specified requirements.

If we specify a version that’s incompatible with a requirement, then we’ll get an error:

>>> from zope.testing.loggingsupport import InstalledHandler
>>> handler = InstalledHandler('zc.buildout.easy_install')
>>> import logging
>>> logging.getLogger('zc.buildout.easy_install').propagate = False
>>> ws = zc.buildout.easy_install.install(
...     ['demo >0.2'], dest, links=[link_server],
...     index=link_server+'index/',
...     versions = dict(demo='0.2', demoneeded='1.0'))
Traceback (most recent call last):
...
IncompatibleConstraintError: Bad constraint 0.2 demo>0.2
>>> print_(handler)
zc.buildout.easy_install DEBUG
  Installing 'demo >0.2'.
zc.buildout.easy_install ERROR
  The constraint, 0.2, is not consistent with the requirement, 'demo>0.2'.
>>> handler.clear()

If no versions are specified, a debugging message will be output reporting that a version was picked automatically:

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     )
>>> print_(handler) # doctest: +ELLIPSIS
zc.buildout.easy_install DEBUG
  Installing 'demo'.
zc.buildout.easy_install INFO
  Getting distribution for 'demo'.
zc.buildout.easy_install INFO
  Got demo 0.3.
zc.buildout.easy_install DEBUG
  Picked: demo = 0.3
zc.buildout.easy_install DEBUG
  Getting required 'demoneeded'
zc.buildout.easy_install DEBUG
    required by demo 0.3.
zc.buildout.easy_install INFO
  Getting distribution for 'demoneeded'.
zc.buildout.easy_install DEBUG
  Running easy_install:...
zc.buildout.easy_install INFO
  Got demoneeded 1.1.
zc.buildout.easy_install DEBUG
  Picked: demoneeded = 1.1
zc.buildout.easy_install DEBUG
  Installing 'demo'.
zc.buildout.easy_install DEBUG
  We have the best distribution that satisfies 'demo'.
zc.buildout.easy_install DEBUG
  Picked: demo = 0.3
zc.buildout.easy_install DEBUG
  Getting required 'demoneeded'
zc.buildout.easy_install DEBUG
    required by demo 0.3.
zc.buildout.easy_install DEBUG
  We have the best distribution that satisfies 'demoneeded'.
zc.buildout.easy_install DEBUG
  Picked: demoneeded = 1.1

>>> handler.uninstall()
>>> logging.getLogger('zc.buildout.easy_install').propagate = True

We can request that we get an error if versions are picked:

>>> zc.buildout.easy_install.allow_picked_versions(False)
True

(The old setting is returned.)

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     )
Traceback (most recent call last):
...
UserError: Picked: demo = 0.3
>>> zc.buildout.easy_install.allow_picked_versions(True)
False

The function default_versions can be used to get and set default version information to be used when no version information is passed. If called with an argument, it sets the default versions:

>>> zc.buildout.easy_install.default_versions(dict(demoneeded='1'))
... # doctest: +ELLIPSIS
{...}

It always returns the previous default versions. If called without an argument, it simply returns the default versions without changing them:

>>> zc.buildout.easy_install.default_versions()
{'demoneeded': '1'}

So with the default versions set, we’ll get the requested version even if the versions option isn’t used:

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     )
>>> [d.version for d in ws]
['0.3', '1.0']

Of course, we can unset the default versions by passing an empty dictionary:

>>> zc.buildout.easy_install.default_versions({})
{'demoneeded': '1'}
>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest, links=[link_server], index=link_server+'index/',
...     )
>>> [d.version for d in ws]
['0.3', '1.1']

Script generation

The easy_install module provides support for creating scripts from eggs. It provides a function similar to the script generation provided by setuptools, except that it bakes a script’s path into the script. This has two advantages:

  • The eggs to be used by a script are not chosen at run time, making startup faster and, more importantly, deterministic.

  • The script doesn’t have to import pkg_resources because the logic that pkg_resources would execute at run time is executed at script-creation time.

The scripts method can be used to generate scripts. Let’s create a destination directory for it to place them in:

>>> import tempfile
>>> bin = tmpdir('bin')

Now, we’ll use the scripts method to generate scripts in this directory from the demo egg:

>>> import sys
>>> scripts = zc.buildout.easy_install.scripts(
...     ['demo'], ws, sys.executable, bin)

The three arguments we passed were:

  1. A sequence of distribution requirements. These are of the same form as setuptools requirements. Here we passed a single requirement, for the demo distribution.

  2. A working set,

  3. The destination directory.

The bin directory now contains a generated script:

>>> ls(bin)
-  demo

The return value is a list of the scripts generated:

>>> import os, sys
>>> if sys.platform == 'win32':
...     scripts == [os.path.join(bin, 'demo.exe'),
...                 os.path.join(bin, 'demo-script.py')]
... else:
...     scripts == [os.path.join(bin, 'demo')]
True

Note that on Windows, 2 files are generated for each script: a script file, ending in ‘-script.py’, and an exe file that allows the script to be invoked directly without having to specify the Python interpreter and without having to provide a ‘.py’ suffix.

The demo script runs the entry point defined in the demo egg:

>>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
sys.path[0:0] = [
  '/sample-install/demo-0.3-py2.4.egg',
  '/sample-install/demoneeded-1.1-py2.4.egg',
  ]
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main())

Some things to note:

  • The demo and demoneeded eggs are added to the beginning of sys.path.

  • The module for the script entry point is imported and the entry point, in this case, ‘main’, is run.

Rather than requirement strings, you can pass tuples containing 3 strings:

  • A script name,

  • A module,

  • An attribute expression for an entry point within the module.

For example, we could have passed entry point information directly rather than passing a requirement:

>>> scripts = zc.buildout.easy_install.scripts(
...     [('demo', 'eggrecipedemo', 'main')], ws,
...     sys.executable, bin)
>>> cat(bin, 'demo') # doctest: +NORMALIZE_WHITESPACE
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
sys.path[0:0] = [
  '/sample-install/demo-0.3-py2.4.egg',
  '/sample-install/demoneeded-1.1-py2.4.egg',
  ]
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main())

Passing entry-point information directly is handy when using eggs (or distributions) that don’t declare their entry points, such as distributions that aren’t based on setuptools.

The interpreter keyword argument can be used to generate a script that can be used to invoke the Python interactive interpreter with the path set based on the working set. This generated script can also be used to run other scripts with the path set on the working set:

>>> scripts = zc.buildout.easy_install.scripts(
...     ['demo'], ws, sys.executable, bin, interpreter='py')
>>> ls(bin)
-  demo
-  py
>>> if sys.platform == 'win32':
...     scripts == [os.path.join(bin, 'demo.exe'),
...                 os.path.join(bin, 'demo-script.py'),
...                 os.path.join(bin, 'py.exe'),
...                 os.path.join(bin, 'py-script.py')]
... else:
...     scripts == [os.path.join(bin, 'demo'),
...                 os.path.join(bin, 'py')]
True

The py script simply runs the Python interactive interpreter with the path set:

>>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE +REPORT_NDIFF
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
<BLANKLINE>
sys.path[0:0] = [
  '/sample-install/demo-0.3-pyN.N.egg',
  '/sample-install/demoneeded-1.1-pyN.N.egg',
  ]
<BLANKLINE>
_interactive = True
if len(sys.argv) > 1:
    _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:')
    _interactive = False
    for (_opt, _val) in _options:
        if _opt == '-i':
            _interactive = True
        elif _opt == '-c':
            exec(_val)
        elif _opt == '-m':
            sys.argv[1:] = _args
            _args = []
            __import__("runpy").run_module(
                 _val, {}, "__main__", alter_sys=True)
<BLANKLINE>
    if _args:
        sys.argv[:] = _args
        __file__ = _args[0]
        del _options, _args
        with open(__file__, 'U') as __file__f:
            exec(compile(__file__f.read(), __file__, "exec"))
<BLANKLINE>
if _interactive:
    del _interactive
    __import__("code").interact(banner="", local=globals())

If invoked with a script name and arguments, it will run that script, instead.

>>> write('ascript', r'''
... "demo doc"
... import sys
... print_ = lambda *a: sys.stdout.write(' '.join(map(str, a))+'\n')
... print_(sys.argv)
... print_((__name__, __file__, __doc__))
... ''')
>>> print_(system(join(bin, 'py')+' ascript a b c'), end='')
['ascript', 'a', 'b', 'c']
('__main__', 'ascript', 'demo doc')

For Python 2.5 and higher, you can also use the -m option to run a module:

>>> if sys.version_info < (2, 5):
...    print ('usage: pdb.py blah blah blah')
... else:
...    print_(system(join(bin, 'py')+' -m pdb'), end='')
... # doctest: +ELLIPSIS
usage: pdb.py ...
>>> print_(system(join(bin, 'py')+' -m pdb what'), end='')
Error: what does not exist

An interpreter can also be generated without other eggs:

>>> scripts = zc.buildout.easy_install.scripts(
...     [], [], sys.executable, bin, interpreter='py')
>>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
<BLANKLINE>
sys.path[0:0] = [
<BLANKLINE>
  ]
...

An additional argument can be passed to define which scripts to install and to provide script names. The argument is a dictionary mapping original script names to new script names.

>>> bin = tmpdir('bin2')
>>> scripts = zc.buildout.easy_install.scripts(
...     ['demo'], ws, sys.executable, bin, dict(demo='run'))
>>> if sys.platform == 'win32':
...     scripts == [os.path.join(bin, 'run.exe'),
...                 os.path.join(bin, 'run-script.py')]
... else:
...     scripts == [os.path.join(bin, 'run')]
True
>>> ls(bin)
-  run
>>> print_(system(os.path.join(bin, 'run')), end='')
3 1

The scripts that are generated are made executable:

>>> if sys.platform == 'win32':
...     os.access(os.path.join(bin, 'run.exe'), os.X_OK)
... else:
...     os.access(os.path.join(bin, 'run'), os.X_OK)
True

Including extra paths in scripts

We can pass a keyword argument, extra_paths, to cause additional paths to be included in a generated script:

>>> foo = tmpdir('foo')
>>> scripts = zc.buildout.easy_install.scripts(
...    ['demo'], ws, sys.executable, bin, dict(demo='run'),
...    extra_paths=[foo])
>>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
sys.path[0:0] = [
  '/sample-install/demo-0.3-py2.4.egg',
  '/sample-install/demoneeded-1.1-py2.4.egg',
  '/foo',
  ]
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main())

Providing script arguments

An “arguments” keyword argument can be used to pass arguments to an entry point. The value passed is a source string to be placed between the parentheses in the call:

>>> scripts = zc.buildout.easy_install.scripts(
...    ['demo'], ws, sys.executable, bin, dict(demo='run'),
...    arguments='1, 2')
>>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE
#!/usr/local/bin/python2.7
import sys
sys.path[0:0] = [
  '/sample-install/demo-0.3-py2.4.egg',
  '/sample-install/demoneeded-1.1-py2.4.egg',
  ]
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main(1, 2))

Passing initialization code

You can also pass script initialization code:

>>> scripts = zc.buildout.easy_install.scripts(
...    ['demo'], ws, sys.executable, bin, dict(demo='run'),
...    arguments='1, 2',
...    initialization='import os\nos.chdir("foo")',
...    interpreter='py')
>>> cat(bin, 'run') # doctest: +NORMALIZE_WHITESPACE
#!/usr/local/bin/python2.7
import sys
sys.path[0:0] = [
  '/sample-install/demo-0.3-py2.4.egg',
  '/sample-install/demoneeded-1.1-py2.4.egg',
  ]
<BLANKLINE>
import os
os.chdir("foo")
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main(1, 2))

It will be included in interpreters too:

>>> cat(bin, 'py') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS
#!/usr/local/bin/python2.7
<BLANKLINE>
import sys
<BLANKLINE>
sys.path[0:0] = [
  '/sample-install/demo-0.3-py3.3.egg',
  '/sample-install/demoneeded-1.1-py3.3.egg',
  ]
<BLANKLINE>
import os
os.chdir("foo")
<BLANKLINE>
<BLANKLINE>
_interactive = True
...

Relative paths

Sometimes, you want to be able to move a buildout directory around and have scripts still work without having to rebuild them. We can control this using the relative_paths option to the scripts function. You need to pass a common base directory of the scripts and eggs:

>>> bo = tmpdir('bo')
>>> ba = tmpdir('ba')
>>> mkdir(bo, 'eggs')
>>> mkdir(bo, 'bin')
>>> mkdir(bo, 'other')
>>> ws = zc.buildout.easy_install.install(
...     ['demo'], join(bo, 'eggs'), links=[link_server],
...     index=link_server+'index/')
>>> scripts = zc.buildout.easy_install.scripts(
...    ['demo'], ws, sys.executable, join(bo, 'bin'), dict(demo='run'),
...    extra_paths=[ba, join(bo, 'bar')],
...    interpreter='py',
...    relative_paths=bo)
>>> cat(bo, 'bin', 'run')
#!/usr/local/bin/python2.7
<BLANKLINE>
import os
<BLANKLINE>
join = os.path.join
base = os.path.dirname(os.path.abspath(os.path.realpath(__file__)))
base = os.path.dirname(base)
<BLANKLINE>
import sys
sys.path[0:0] = [
  join(base, 'eggs/demo-0.3-pyN.N.egg'),
  join(base, 'eggs/demoneeded-1.1-pyN.N.egg'),
  '/ba',
  join(base, 'bar'),
  ]
<BLANKLINE>
import eggrecipedemo
<BLANKLINE>
if __name__ == '__main__':
    sys.exit(eggrecipedemo.main())

Note that the extra path we specified that was outside the directory passed as relative_paths wasn’t converted to a relative path.

Of course, running the script works:

>>> print_(system(join(bo, 'bin', 'run')), end='')
3 1

We specified an interpreter and its paths are adjusted too:

>>> cat(bo, 'bin', 'py') # doctest: +NORMALIZE_WHITESPACE +REPORT_NDIFF
#!/usr/local/bin/python2.7
<BLANKLINE>
import os
<BLANKLINE>
join = os.path.join
base = os.path.dirname(os.path.abspath(os.path.realpath(__file__)))
base = os.path.dirname(base)
<BLANKLINE>
import sys
<BLANKLINE>
sys.path[0:0] = [
  join(base, 'eggs/demo-0.3-pyN.N.egg'),
  join(base, 'eggs/demoneeded-1.1-pyN.N.egg'),
  '/ba',
  join(base, 'bar'),
  ]
<BLANKLINE>
_interactive = True
if len(sys.argv) > 1:
    _options, _args = __import__("getopt").getopt(sys.argv[1:], 'ic:m:')
    _interactive = False
    for (_opt, _val) in _options:
        if _opt == '-i':
            _interactive = True
        elif _opt == '-c':
            exec(_val)
        elif _opt == '-m':
            sys.argv[1:] = _args
            _args = []
            __import__("runpy").run_module(
                 _val, {}, "__main__", alter_sys=True)
<BLANKLINE>
    if _args:
        sys.argv[:] = _args
        __file__ = _args[0]
        del _options, _args
        with open(__file__, 'U') as __file__f:
            exec(compile(__file__f.read(), __file__, "exec"))
<BLANKLINE>
if _interactive:
    del _interactive
    __import__("code").interact(banner="", local=globals())

Installing distutils-style scripts

Most Python libraries use the console_scripts entry point nowadays, but several still have a scripts=['bin/something'] argument in their setup() call. Buildout also installs those:
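
For reference, a distribution that ships such a script might use a setup.py along these lines. This is only a sketch; the name, version, and script path are illustrative, not taken from the test distributions used below:

from distutils.core import setup

setup(
    name='somepackage',        # hypothetical distribution name
    version='1.0',
    # A distutils-style script: the named file is installed (nearly)
    # verbatim instead of being generated from an entry point.
    scripts=['bin/somescript'],
)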

>>> distdir = tmpdir('distutilsscriptdir')
>>> distbin = tmpdir('distutilsscriptbin')
>>> ws = zc.buildout.easy_install.install(
...     ['other'], distdir,
...     links=[link_server], index=link_server+'index/')
>>> scripts = zc.buildout.easy_install.scripts(
...     ['other'], ws, sys.executable, distbin)
>>> ls(distbin)
-  distutilsscript

As with console_scripts, the output is a list of the scripts generated. Likewise, on Windows two files are generated: an .exe file and a script with -script.py appended:

>>> import os, sys
>>> if sys.platform == 'win32':
...     scripts == [os.path.join(distbin, 'distutilsscript.exe'),
...                 os.path.join(distbin, 'distutilsscript-script.py')]
... else:
...     scripts == [os.path.join(distbin, 'distutilsscript')]
True

It also works for zipped eggs:

>>> distdir2 = tmpdir('distutilsscriptdir2')
>>> distbin2 = tmpdir('distutilsscriptbin2')
>>> ws = zc.buildout.easy_install.install(
...     ['du_zipped'], distdir2,
...     links=[link_server], index=link_server+'index/')
>>> scripts = zc.buildout.easy_install.scripts(
...     ['du_zipped'], ws, sys.executable, distbin2)
>>> ls(distbin2)
-  distutilsscript

Distutils copies the script files verbatim, apart from a line at the top that looks like #!/usr/bin/python, which gets replaced by the actual Python interpreter. Buildout does the same, but additionally adds the sys.path setup, just as it does for console_scripts.

>>> cat(distbin, 'distutilsscript')
#!/usr/local/bin/python2.7
# -*- coding: utf-8 -*-
"""Module docstring."""
from __future__ import print_statement
<BLANKLINE>
<BLANKLINE>
import sys
sys.path[0:0] = [
  '/distutilsscriptdir/other-1.0-pyN.N.egg',
  ]
<BLANKLINE>
<BLANKLINE>
import os
import sys; sys.stdout.write("distutils!\n")

Note that there are several items that need to come first in such a script before buildout’s sys.path statements: a source encoding hint, a module docstring and __future__ imports. Buildout retains them in their proper place by looking at the first non-future import and placing its sys.path statement before that.

Due to the nature of distutils scripts, buildout cannot pass arguments as there’s no specific method to pass them to.

In some cases, a Python 3 __pycache__ directory can end up in an internal EGG-INFO metadata directory, next to the script information we’re looking for. Buildout doesn’t crash on that:

>>> eggname = [name for name in os.listdir(distdir2)
...            if name.endswith('egg')][0]
>>> scripts_metadata_dir = os.path.join(
...     distdir2, eggname, 'EGG-INFO', 'scripts')
>>> os.mkdir(os.path.join(scripts_metadata_dir, '__dummy__'))
>>> scripts = zc.buildout.easy_install.scripts(
...     ['du_zipped'], ws, sys.executable, distbin2)
>>> ls(distbin2)
-  distutilsscript

Handling custom build options for extensions provided in source distributions

Sometimes, we need to control how extension modules are built. The build function provides this level of control. It takes a single package specification, downloads a source distribution, and builds it with specified custom build options.

The build function takes 3 positional arguments:

spec

A package specification for a source distribution

dest

A destination directory

build_ext

A dictionary of options to be passed to the distutils build_ext command when building extensions.

It supports a number of optional keyword arguments:

links

a sequence of URLs, file names, or directories to look for links to distributions,

index

The URL of an index server, or almost any other valid URL. :)

If not specified, the Python Package Index, http://pypi.python.org/simple/, is used. You can specify an alternate index with this option. If you use the links option and the links point to the needed distributions, then the index can be anything and will be largely ignored. In the examples here, we’ll just point to an empty directory on our link server. This will make our examples run a little bit faster.

path

A list of additional directories to search for locally-installed distributions.

newest

A boolean value indicating whether to search for new distributions when already-installed distributions meet the requirement. When this is true (the default) and the destination directory is not None, the function will search for the newest distributions that satisfy the requirements.

versions

A dictionary mapping project names to version numbers to be used when selecting distributions. This can be used to specify a set of distribution versions independent of other requirements.
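
Putting the signature together, a call might look roughly like the sketch below. The paths, URLs, and option values are illustrative; the worked examples that follow use the real test fixtures:

import zc.buildout.easy_install

# A sketch of the call described above; every value here is illustrative
# rather than taken from the worked examples that follow.
eggs = zc.buildout.easy_install.build(
    'extdemo',                                # spec: source-distribution requirement
    '/path/to/eggs',                          # dest: directory to place the built egg in
    {'include-dirs': '/path/to/include'},     # build_ext: distutils build_ext options
    links=['http://example.com/dists/'],      # optional: extra places to find distributions
    index='http://example.com/dists/index/',  # optional: index URL
    newest=False,                             # optional: don't look for newer releases
    versions={'extdemo': '1.4'},              # optional: pin distribution versions
)
# The return value is a list of the eggs created.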

Our link server included a source distribution that includes a simple extension, extdemo.c:

#include <Python.h>
#include <extdemo.h>

static PyMethodDef methods[] = {};

PyMODINIT_FUNC
initextdemo(void)
{
    PyObject *m;
    m = Py_InitModule3("extdemo", methods, "");
#ifdef TWO
    PyModule_AddObject(m, "val", PyInt_FromLong(2));
#else
    PyModule_AddObject(m, "val", PyInt_FromLong(EXTDEMO));
#endif
}

The extension depends on a system-dependent include file, extdemo.h, that defines a constant, EXTDEMO, that is exposed by the extension.

We’ll add an include directory to our sample buildout and add the needed include file to it:

>>> mkdir('include')
>>> write('include', 'extdemo.h',
... """
... #define EXTDEMO 42
... """)

Now, we can use the build function to create an egg from the source distribution:

>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/')
['/sample-install/extdemo-1.4-py2.4-unix-i686.egg']

The function returns a list of the eggs created.

Now if we look in our destination directory, we see we have an extdemo egg:

>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demoneeded-1.0-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  extdemo-1.4-py2.4-unix-i686.egg

Let’s update our link server with a new version of extdemo:

>>> update_extdemo()
>>> print_(get(link_server), end='')
<html><body>
<a href="bigdemo-0.1-py2.4.egg">bigdemo-0.1-py2.4.egg</a><br>
<a href="demo-0.1-py2.4.egg">demo-0.1-py2.4.egg</a><br>
<a href="demo-0.2-py2.4.egg">demo-0.2-py2.4.egg</a><br>
<a href="demo-0.3-py2.4.egg">demo-0.3-py2.4.egg</a><br>
<a href="demo-0.4c1-py2.4.egg">demo-0.4c1-py2.4.egg</a><br>
<a href="demoneeded-1.0.zip">demoneeded-1.0.zip</a><br>
<a href="demoneeded-1.1.zip">demoneeded-1.1.zip</a><br>
<a href="demoneeded-1.2c1.zip">demoneeded-1.2c1.zip</a><br>
<a href="du_zipped-1.0-pyN.N.egg">du_zipped-1.0-pyN.N.egg</a><br>
<a href="extdemo-1.4.zip">extdemo-1.4.zip</a><br>
<a href="extdemo-1.5.zip">extdemo-1.5.zip</a><br>
<a href="index/">index/</a><br>
<a href="other-1.0-py2.4.egg">other-1.0-py2.4.egg</a><br>
</body></html>

The easy_install module caches information about servers to reduce network access. To see the update, we have to call the clear_index_cache function to clear the index cache:

>>> zc.buildout.easy_install.clear_index_cache()

If we run build with newest set to False, we won’t get an update:

>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/',
...   newest=False)
['/sample-install/extdemo-1.4-py2.4-linux-i686.egg']
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demoneeded-1.0-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  extdemo-1.4-py2.4-unix-i686.egg

But if we run it with the default True setting for newest, then we’ll get an updated egg:

>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/')
['/sample-install/extdemo-1.5-py2.4-unix-i686.egg']
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demo-0.3-py2.4.egg
d  demoneeded-1.0-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  extdemo-1.4-py2.4-unix-i686.egg
d  extdemo-1.5-py2.4-unix-i686.egg

The versions option also influences the versions used. For example, if we specify a version for extdemo, then that will be used, even though it isn’t the newest. Let’s clean out the destination directory first:

>>> import os
>>> for name in os.listdir(dest):
...     remove(dest, name)
>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/',
...   versions=dict(extdemo='1.4'))
['/sample-install/extdemo-1.4-py2.4-unix-i686.egg']
>>> ls(dest)
d  extdemo-1.4-py2.4-unix-i686.egg

Handling custom build options for extensions in develop eggs

The develop function is similar to the build function, except that, rather than building an egg from a source distribution, it builds a develop egg from a source directory containing a setup.py script.

The develop function takes 2 positional arguments:

setup

The path to a setup script, typically named “setup.py”, or a directory containing a setup.py script.

dest

The directory to install the egg link to

It supports an optional keyword argument:

build_ext

A dictionary of options to be passed to the distutils build_ext command when building extensions.

We have a local directory containing the extdemo source:

>>> ls(extdemo)
-  MANIFEST
-  MANIFEST.in
-  README
-  extdemo.c
-  setup.py

Now, we can use the develop function to create a develop egg from the source distribution:

>>> zc.buildout.easy_install.develop(
...   extdemo, dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')})
'/sample-install/extdemo.egg-link'

The name of the egg link created is returned.

Now if we look in our destination directory, we see we have an extdemo egg link:

>>> ls(dest)
d  extdemo-1.4-py2.4-unix-i686.egg
-  extdemo.egg-link

And that the source directory contains the compiled extension:

>>> contents = os.listdir(extdemo)
>>> bool([f for f in contents if f.endswith('.so') or f.endswith('.pyd')])
True

Download cache

Normally, when distributions are installed, if any processing is needed, they are downloaded from the internet to a temporary directory and then installed from there. A download cache can be used to avoid the download step. This can be useful to reduce network access and to create source distributions of an entire buildout.

A download cache is specified by calling the download_cache function. The function always returns the previous setting. If no argument is passed, then the setting is unchanged. If an argument is passed, the download cache is set to the given path, which must point to an existing directory. Passing None clears the cache setting.
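
Based on that description, typical use looks roughly like the following sketch (the cache path is illustrative):

import zc.buildout.easy_install

# The cache path is illustrative.
previous = zc.buildout.easy_install.download_cache('/path/to/cache')  # set the cache; the previous setting is returned
current = zc.buildout.easy_install.download_cache()                   # query the current setting without changing it
zc.buildout.easy_install.download_cache(None)                          # clear the cache setting
zc.buildout.easy_install.download_cache(previous)                      # restore whatever was set before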

To see this work, we’ll create a directory and set it as the cache directory:

>>> cache = tmpdir('cache')
>>> zc.buildout.easy_install.download_cache(cache)

We’ll recreate our destination directory:

>>> remove(dest)
>>> dest = tmpdir('sample-install')

We’d like to see what is being fetched from the server, so we’ll enable server logging:

>>> _ = get(link_server+'enable_server_logging')
GET 200 /enable_server_logging

Now, if we install demo, and extdemo:

>>> ws = zc.buildout.easy_install.install(
...     ['demo==0.2'], dest,
...     links=[link_server], index=link_server+'index/')
GET 200 /
GET 404 /index/demo/
GET 200 /index/
GET 200 /demo-0.2-py2.4.egg
GET 404 /index/demoneeded/
GET 200 /demoneeded-1.1.zip
>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/')
GET 404 /index/extdemo/
GET 200 /extdemo-1.5.zip
['/sample-install/extdemo-1.5-py2.4-linux-i686.egg']

Not only will we get eggs in our destination directory:

>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  extdemo-1.5-py2.4-linux-i686.egg

But we’ll get distributions in the cache directory:

>>> ls(cache)
-  demo-0.2-py2.4.egg
-  demoneeded-1.1.zip
-  extdemo-1.5.zip

The cache directory contains uninstalled distributions, such as zipped eggs or source distributions.

Let’s recreate our destination directory and clear the index cache:

>>> remove(dest)
>>> dest = tmpdir('sample-install')
>>> zc.buildout.easy_install.clear_index_cache()

Now when we install the distributions:

>>> ws = zc.buildout.easy_install.install(
...     ['demo==0.2'], dest,
...     links=[link_server], index=link_server+'index/')
GET 200 /
GET 404 /index/demo/
GET 200 /index/
GET 404 /index/demoneeded/
>>> zc.buildout.easy_install.build(
...   'extdemo', dest,
...   {'include-dirs': os.path.join(sample_buildout, 'include')},
...   links=[link_server], index=link_server+'index/')
GET 404 /index/extdemo/
['/sample-install/extdemo-1.5-py2.4-linux-i686.egg']
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demoneeded-1.1-py2.4.egg
d  extdemo-1.5-py2.4-linux-i686.egg

Note that we didn’t download the distributions from the link server.

If we remove the restriction on demo, we’ll download a newer version from the link server:

>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest,
...     links=[link_server], index=link_server+'index/')
GET 200 /demo-0.3-py2.4.egg

Normally, the download cache is the preferred source of downloads, but not the only one.

Installing solely from a download cache

A download cache can be used as the basis of application source releases. In an application source release, we want to distribute an application that can be built without making any network accesses. In this case, we distribute a download cache and tell the easy_install module to install from the download cache only, without making network accesses. The install_from_cache function can be used to signal that packages should be installed only from the download cache. The function always returns the previous setting. Calling it with no arguments returns the current setting without changing it:

>>> zc.buildout.easy_install.install_from_cache()
False

Calling it with a boolean value changes the setting and returns the previous setting:

>>> zc.buildout.easy_install.install_from_cache(True)
False

Let’s remove demo-0.3-py2.4.egg from the cache, clear the index cache, recreate the destination directory, and reinstall demo:

>>> for  f in os.listdir(cache):
...     if f.startswith('demo-0.3-'):
...         remove(cache, f)
>>> zc.buildout.easy_install.clear_index_cache()
>>> remove(dest)
>>> dest = tmpdir('sample-install')
>>> ws = zc.buildout.easy_install.install(
...     ['demo'], dest,
...     links=[link_server], index=link_server+'index/')
>>> ls(dest)
d  demo-0.2-py2.4.egg
d  demoneeded-1.1-py2.4.egg

This time, we didn’t download from or even query the link server.
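
Putting the pieces together, the install step of an application source release might do something roughly like the following sketch, based on the functions described above; the paths and the requirement are illustrative:

import zc.buildout.easy_install

# Point at the download cache shipped with the source release
# (the path is hypothetical).
zc.buildout.easy_install.download_cache('/path/to/release/download-cache')

# Signal that distributions may only come from the download cache;
# no network access will be attempted.
zc.buildout.easy_install.install_from_cache(True)

# Install the application's requirements into a local eggs directory.
ws = zc.buildout.easy_install.install(['demo'], '/path/to/eggs')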

Change History

2.2.2 (2014-10-30)

  • Open files for exec() in universal newlines mode. See https://github.com/buildout/buildout/issues/130

  • Add BUILDOUT_HOME as an alternate way to control how the user default configuration is found.

  • Close various files when finished writing to them. This avoids ResourceWarnings on Python 3, and better supports doctests under PyPy.

  • Introduce improved easy_install Install.install function. This is present in 1.5.X and 1.7.X but was never merged into 2.X somehow.

2.2.1 (2013-09-05)

  • distutils scripts: correct order of operations on from ... import lines (see https://github.com/buildout/buildout/issues/134).

  • Add an --allow-site-packages option to bootstrap.py, defaulting to False. If the value is false, strip any “site packages” (as defined by the site module) from sys.path before attempting to import setuptools / pkg_resources.

  • Updated the URL used to fetch ez_setup.py to the official, non-version-pinned version.

2.2.0 (2013-07-05)

  • Handle both addition and subtraction of elements (+= and -=) on the same key in the same section. Forward-ported from buildout 1.6.

  • Suppress the useless Link to <URL> ***BLOCKED*** by --allow-hosts error message being emitted by distribute / setuptools.

  • Extend distutils script generation to support module docstrings and __future__ imports.

  • Refactored picked versions logic to make it easier to use for plugins.

  • Use get_win_launcher API to find Windows launcher (falling back to resource_string for cli.exe).

  • Remove data_files from setup.py: it was installing README.txt in current directory during installation (merged from 1.x branch).

  • Switch dependency from distribute 0.6.x to setuptools 0.7.x.

2.1.0 (2013-03-23)

  • Fixed: Buildout didn’t exit with a non-zero exit status if there was a failure in combination with an upgrade.

  • Fixed: We now fail with an informative error when an old bootstrap script causes buildout 2 to be used with setuptools.

  • Fixed: An error incorrectly suggested that buildout 2 implemented all of the functionality of dumppickedversions.

  • Fixed: Buildout generated bad scripts when no eggs needed to be added to sys.path.

  • Fixed: Buildout didn’t honour Unix umask when generating scripts. https://bugs.launchpad.net/zc.buildout/+bug/180705

  • Fixed: update-versions-file didn’t work unless show-picked-versions was also set. https://github.com/buildout/buildout/issues/71

2.0.1 (2013-02-16)

  • Fixed: buildout didn’t honor umask settings when creating scripts.

  • Fix for distutils scripts installation on Python 3, related to __pycache__ directories.

  • Fixed: encoding data in non-entry-point-based scripts was lost.

2.0.0 (2013-02-10)

This is a backward incompatible release of buildout that attempts to correct mistakes made in buildout 1.

  • Buildout no-longer tries to provide full or partial isolation from system Python installations. If you want isolation, use buildout with virtualenv, or use a clean build of Python to begin with.

    Providing isolation was a noble goal, but its implementation complicated buildout too much.

  • Buildout no-longer supports using multiple versions of Python in a single buildout. This too was a noble goal, but added too much complexity to the implementation.

  • Changed the configuration file format:

    • Relative indentation in option values is retained if the first line is blank. (IOW, if the non-blank text is on the continuation lines.) As in:

      [mysection]
      tree =
        /root
          branch

      In such cases, internal blank lines are also retained.

    • The configuration syntax is more tightly defined, allowing fewer syntax definitions.

      Buildout 1 configuration files were parsed with the Python ConfigParser module. The ConfigParser module’s format is poorly documented and wildly flexible. For example:

      • Any characters other than left square brackets were allowed in section names.

      • Arbitrary text was allowed and ignored after the closing bracket on section header lines.

      • Any characters other than equal signs or colons were allowed in an option name.

      • Configuration options could be spelled as RFC 822 mail headers (using a colon, rather than an equal sign).

      • Comments could begin with “rem”.

      • Semicolons could be used to start inline comments, but only if preceded by a whitespace character.

    See Configuration file syntax.

  • Buildout now prefers final releases by default (buildout:prefer-final now defaults to true, rather than false.)

    However, if buildout is bootstrapped with a non-final release, it won’t downgrade itself to a final release.

  • Buildout no-longer installs zipped eggs. (Distribute may still install a zipped egg of itself during the bootstrapping process.) The buildout:unzip option has been removed.

  • Buildout no-longer supports setuptools. It now uses distribute exclusively.

  • Integrated the buildout-versions extension into buildout itself. For this, a few options were added to buildout:

    • If show-picked-versions is set to true, all picked versions are printed at the end of the buildout run. This saves you from running buildout in verbose mode and extracting the picked versions from the output.

    • If update-versions-file is set to a filename (relative to the buildout directory), the show-picked-versions output is appended to that file.

  • Buildout options can be given on the command line using the form:

    option_name=value

    as a short-hand for:

    buildout:option_name=value
  • The versions option now defaults to versions, so you no longer need to include:

    versions = versions

    in a buildout section when pinning versions.

    A versions section is added, if necessary, if a versions option isn’t used.

  • Buildout-defined default versions are included in the versions section, if there is one.

  • The buildout:zc.buildout-version and buildout:distribute-version options have been removed in favor of providing version constraints in a versions section.

  • Error if install-from-cache and offline are used together, because offline largely means “don’t install”.

  • Provide better error messages when distributions can’t be installed because buildout is run in offline mode.

  • Versions in versions sections can now be simple constraints, like >=2.0dev in addition to being simple versions.

    Buildout 2 leverages this to make sure it uses zc.recipe.egg>=2.0.0a3, which mainly matters for Python 3.

  • The buildout init command now accepts distribution requirements and paths to set up a custom interpreter part that has the distributions or parts in the path. For example:

    python bootstrap.py init BeautifulSoup
  • Added buildout:socket-timeout option so that socket timeout can be configured both from command line and from config files. (gotcha)

  • Distutils-style scripts are also installed now (for instance pyflakes’ and docutils’ scripts). https://bugs.launchpad.net/zc.buildout/+bug/422724

  • Avoid sorting the working set and requirements when it won’t be logged. When profiling a simple buildout with 10 parts with identical and large working sets, this resulted in a decrease of run time from 93.411 to 15.068 seconds, about a 6 fold improvement. To see the benefit be sure to run without any increase in verbosity (“-v” option). (rossp)

  • Introduce a cache for the expensive buildout._dir_hash function.

  • Remove duplicate path from script’s sys.path setup.

  • Make sure to download extended configuration files only once per buildout run even if they are referenced multiple times (patch by Rafael Monnerat).

  • Removed any traces of the implementation of extended-by. Raise a UserError if the option is encountered instead of ignoring it, though.

  • Fixed: relative-paths weren’t honored when bootstrapping or upgrading (which is how the buildout script gets generated).

  • Fixed: initialization code wasn’t included in interpreter scripts.

  • Fixed: macro inheritance bug, https://github.com/buildout/buildout/pull/37

  • Fixed: the download module’s handling of directories pointed to by file-system paths and file: URLs.

  • Fixed: a configuration with an extends entry in the [buildout] section pointing to a non-existing URL now produces a friendlier error message. https://bugs.launchpad.net/zc.buildout/+bug/566167

  • Fixed: Buildout didn’t honor the exit code from scripts. https://bugs.launchpad.net/bugs/697913

1.4.4 (2010-08-20)

The 1.4.4 release is a release for people who encounter trouble with the 1.5 line. By switching to the associated bootstrap script you can stay on 1.4.4 until you are ready to migrate.

1.4.3 (2009-12-10)

Bugs fixed:

  • Using pre-detected setuptools version for easy_installing tgz files. This prevents a recursion error when easy_installing an upgraded “distribute” tgz. Note that setuptools did not have this recursion problem solely because it was packaged as an .egg, which does not have to go through the easy_install step.

1.4.2 (2009-11-01)

New Feature:

  • Added a --distribute option to the bootstrap script, in order to use Distribute rather than Setuptools. By default, Setuptools is used.

Bugs fixed:

  • While checking for new versions of setuptools and buildout itself, compare requirement locations instead of requirement objects.

  • Incrementing didn’t work properly when extending multiple files. https://bugs.launchpad.net/zc.buildout/+bug/421022

  • The download API computed MD5 checksums of text files wrong on Windows.

1.4.1 (2009-08-27)

New Feature:

  • Added a debug built-in recipe to make writing some tests easier.

Bugs fixed:

  • (introduced in 1.4.0) option incrementing (+=) and decrementing (-=) didn’t work in the buildout section. https://bugs.launchpad.net/zc.buildout/+bug/420463

  • Option incrementing and decrementing didn’t work for options specified on the command line.

  • Scripts generated with relative-paths enabled couldn’t be symbolically linked to other locations and still work.

  • Scripts run using generated interpreters didn’t have __file__ set correctly.

  • The standard Python -m option didn’t work for custom interpreters.

1.4.0 (2009-08-26)

  • When doing variable substitutions, you can omit the section name to refer to a variable in the same section (e.g. ${:foo}).

  • When doing variable substitution, you can use the special option, _buildout_section_name_ to get the section name. This is most handy for getting the current section name (e.g. ${:_buildout_section_name_}).

  • A new special option, < allows sections to be used as macros.

  • Added annotate command for annotated sections. Displays each section’s key-value pairs along with the value origin.

  • Added a download API that handles the download cache, offline mode etc and is meant to be reused by recipes.

  • Used the download API to allow caching of base configurations (specified by the buildout section’s ‘extends’ option).

1.3.1 (2009-08-12)

  • Bug fixed: extras were ignored in some cases when versions were specified.

1.3.0 (2009-06-22)

  • Better Windows compatibility in test infrastructure.

  • Now bootstrap.py has an optional --version argument that can be used to force the buildout version to use.

  • zc.buildout.testing.buildoutSetUp installs a new handler in the python root logging facility. This handler is now removed during tear down as it might disturb other packages reusing buildout’s testing infrastructure.

  • Fixed usage of the ‘relative_paths’ keyword parameter on Windows.

  • Added an unload entry point for extensions.

  • Fixed bug: when the relative paths option was used, relative paths could be inserted into sys.path if a relative path was used to run the generated script.

1.2.1 (2009-03-18)

  • Refactored generation of relative egg paths to generate simpler code.

1.2.0 (2009-03-17)

  • Added a relative_paths option to zc.buildout.easy_install.script to generate egg paths relative to the script they’re used in.

1.1.2 (2009-03-16)

  • Added Python 2.6 support. Removed Python 2.3 support.

  • Fixed remaining deprecation warnings under Python 2.6, both when running our tests and when using the package.

  • Switched from using os.popen* to subprocess.Popen, to avoid a deprecation warning in Python 2.6. See:

    http://docs.python.org/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3

  • Made sure the ‘redo_pyc’ function and the doctest checkers work with Python executable paths containing spaces.

  • Expand shell patterns when processing the list of paths in develop, e.g:

    [buildout]
    develop = ./local-checkouts/*
  • Conditionally import and use hashlib.md5 when it’s available instead of md5 module, which is deprecated in Python 2.6.

  • Added Jython support for bootstrap, development bootstrap and buildout.

  • Fixed a bug that would cause buildout to break while computing a directory hash if it found a broken symlink (Launchpad #250573)

1.1.1 (2008-07-28)

  • Fixed a bug that caused buildouts to fail when variable substitutions are used to name standard directories, as in:

    [buildout]
    eggs-directory = ${buildout:directory}/develop-eggs

1.1.0 (2008-07-19)

  • Added a buildout-level unzip option to change the default policy for unzipping zip-safe eggs.

  • Tracebacks are now printed for internal errors (as opposed to user errors) even without the -D option.

  • pyc and pyo files are regenerated for installed eggs so that the stored path in code objects matches the install location.

1.0.6 (2008-06-13)

1.0.5 (2008-06-10)

  • Fixed wrong split when using the += and -= syntax (mustapha)

1.0.4 (2008-06-10)

  • Added the allow-hosts option (tarek)

  • Quote the ‘executable’ argument when trying to detect the python version using popen4. (sidnei)

  • Quote the ‘spec’ argument: when installing an egg from the buildout cache, a filename containing spaces would otherwise cause a failure. (sidnei)

  • Extended configuration syntax to allow -= and += operators (malthe, mustapha).

1.0.3 (2008-06-01)

  • Fix for “AttributeError: Buildout instance has no attribute ‘_logger’” by providing reasonable defaults within the Buildout constructor. (patch by Gottfried Ganssauge) (ajung)

1.0.2 (2008-05-13)

  • More fixes for Windows. A quoted sh-bang is now used on Windows to make the .exe files work with a Python executable in ‘program files’.

  • Added “-t <timeout_in_seconds>” option for specifying the socket timeout. (ajung)

1.0.1 (2008-04-02)

  • Made easy_install.py’s _get_version accept non-final releases of Python, like 2.4.4c0. (hannosch)

  • Applied various patches for Windows (patch by Gottfried Ganssauge). (ajung)

  • Applied patch fixing rmtree issues on Windows (patch by Gottfried Ganssauge). (ajung)

1.0.0 (2008-01-13)

  • Added a French translation of the buildout tutorial.

1.0.0b31 (2007-11-01)

Feature Changes

  • Added a configuration option that allows buildouts to ignore dependency_links metadata specified in setup. By default dependency_links in setup are used in addition to buildout specified find-links. This can make it hard to control where eggs come from. Here’s how to tell buildout to ignore URLs in dependency_links:

    [buildout]
    use-dependency-links = false

    By default use-dependency-links is true, which matches the behavior of previous versions of buildout.

  • Added a configuration option that causes buildout to error if a version is picked. This is a nice safety belt when fixing all versions is intended, especially when creating releases.

Bugs Fixed

  • 151820: Develop failed if the setup.py script imported modules in the distribution directory.

  • Verbose logging of the develop command was omitting detailed output.

  • The setup command wasn’t documented.

  • The setup command failed if run in a directory without specifying a configuration file.

  • The setup command raised a stupid exception if run without arguments.

  • When using a local find links or index, distributions weren’t copied to the download cache.

  • When installing from source releases, a version specification (via a buildout versions section) for setuptools was ignored when deciding which setuptools to use to build an egg from the source release.

1.0.0b30 (2007-08-20)

Feature Changes

  • Changed the default policy back to what it was to avoid breakage in existing buildouts. Use:

    [buildout]
    prefer-final = true

    to get the new policy. The new policy will go into effect in buildout 2.

1.0.0b29 (2007-08-20)

Feature Changes

  • Now, final distributions are preferred over non-final versions. If both final and non-final versions satisfy a requirement, then the final version will be used even if it is older. The normal way to override this for specific packages is to require a non-final version explicitly, either exactly or via a lower bound.

  • There is a buildout prefer-final option that can be used with a value of “false”:

    prefer-final = false

    To prefer newer versions, regardless of whether or not they are final, buildout-wide.

  • The new simple Python index, http://cheeseshop.python.org/simple, is used as the default index. This will provide better performance than the human package index interface, http://pypi.python.org/pypi. More importantly, it lists hidden distributions, so buildouts with fixed distribution versions will be able to find old distributions even if the distributions have been hidden in the human PyPI interface.

Bugs Fixed

  • 126441: Look for default.cfg in the right place on Windows.

1.0.0b28 (2007-07-05)

Bugs Fixed

  • When requiring a specific version, buildout looked for new versions even if that single version was already installed.

1.0.0b27 (2007-06-20)

Bugs Fixed

  • Scripts were generated incorrectly on Windows. This included the buildout script itself, making buildout completely unusable.

1.0.0b26 (2007-06-19)

Feature Changes

  • Thanks to recent fixes in setuptools, I was able to change buildout to use find-links and index information when searching extensions.

    Sadly, this work, especially the timing, was motivated by the need to use alternate indexes due to performance problems in the cheese shop (http://www.python.org/pypi/). I really hope we can address these performance problems soon.

1.0.0b25 (2007-05-31)

Feature Changes

  • buildout now changes to the buildout directory before running recipe install and update methods.

  • Added a new init command for creating a new buildout. This creates an empty configuration file and then bootstraps.

  • Except when using the new init command, it is now an error to run buildout without a configuration file.

  • In verbose mode, when adding distributions to fulfil requirements of already-added distributions, we now show why the new distributions are being added.

  • Changed the logging format to exclude the logger name for the buildout logger. This reduces noise in the output.

  • Clean up lots of messages, adding missing periods and adding quotes around requirement strings and file paths.

Bugs Fixed

  • 114614: Buildouts could take a very long time if there were dependency problems in large sets of pathologically interdependent packages.

  • 59270: Buggy recipes can cause failures in later recipes via chdir

  • 61890: file:// urls don’t seem to work in find-links

    setuptools requires that file urls that point to directories must end in a “/”. Added a workaround.

  • 75607: buildout should not run if it creates an empty buildout.cfg

1.0.0b24 (2007-05-09)

Feature Changes

  • Improved error reporting by showing which packages require other packages that can’t be found or that cause version conflicts.

  • Added an API for use by recipe writers to clean up created files when recipe errors occur.

  • Log installed scripts.

Bugs Fixed

  • 92891: bootstrap crashes with recipe option in buildout section.

  • 113085: Buildout exited with a zero exit status when internal errors occurred.

1.0.0b23 (2007-03-19)

Feature Changes

  • Added support for download caches. A buildout can specify a cache for distribution downloads. The cache can be shared among buildouts to reduce network access and to support creating source distributions for applications allowing install without network access.

  • Log scripts created, as suggested in: https://bugs.launchpad.net/zc.buildout/+bug/71353

Bugs Fixed

  • It wasn’t possible to give options on the command line for sections not defined in a configuration file.

1.0.0b22 (2007-03-15)

Feature Changes

  • Improved error reporting and debugging support:

    • Added “logical tracebacks” that show functionally what the buildout was doing when an error occurs. Don’t show a Python traceback unless the -D option is used.

    • Added a -D option that causes the buildout to print a traceback and start the pdb post-mortem debugger when an error occurs.

    • Warnings are printed for unused options in the buildout section and installed-part sections. This should make it easier to catch option misspellings.

  • Changed the way the installed database (.installed.cfg) is handled to avoid database corruption when a user breaks out of a buildout with control-c.

  • Don’t save an installed database if there are no installed parts or develop egg links.

1.0.0b21 (2007-03-06)

Feature Changes

  • Added support for repeatable buildouts by allowing egg versions to be specified in a versions section.

  • The easy_install module’s install and build functions now accept a versions argument that supplies a mapping from project names to version numbers. This can be used to fix version numbers for required distributions and their dependencies.

    When a version isn’t fixed, either via a versions option or via a fixed version number in a requirement, a debug log message is emitted indicating the version picked. This is useful for setting versions options.

    A default_versions function can be used to set a default value for this option.

  • Adjusted the output for verbosity levels. Using a single -v option no longer causes voluminous setuptools output. Using -vv and -vvv now triggers extra setuptools output.

  • Added a remove testing helper function that removes files or directories.

1.0.0b20 (2007-02-08)

Feature Changes

  • Added a buildout newest option, to control whether the newest distributions should be sought to meet requirements. This might also provide a hint to recipes that don’t deal with distributions. For example, a recipe that manages subversion checkouts might not update a checkout if newest is set to “false”.

  • Added a newest keyword parameter to the zc.buildout.easy_install.install and zc.buildout.easy_install.build functions to control whether the newest distributions that need given requirements should be sought. If a false value is provided for this parameter and already installed eggs meet the given requirements, then no attempt will be made to search for newer distributions.

  • The recipe-testing support setUp function now adds the name buildout to the test namespace with a value that is the path to the buildout script in the sample buildout. This allows tests to use

    >>> print system(buildout),
    

    rather than:

    >>> print system(join('bin', 'buildout')),
    

Bugs Fixed

  • Paths returned from update methods replaced lists of installed files rather than augmenting them.

1.0.0b19 (2007-01-24)

Bugs Fixed

  • Explicitly specifying a Python executable failed if the output of running Python with the -V option included a 2-digit (rather than a 3-digit) version number.

1.0.0b18 (2007-01-22)

Feature Changes

  • Added documentation for some previously undocumented features of the easy_install APIs.

  • By popular demand, added a -o command-line option that is a short hand for the assignment buildout:offline=true.

Bugs Fixed

  • When deciding whether recipe develop eggs had changed, buildout incorrectly considered files in .svn and CVS directories.

1.0.0b17 (2006-12-07)

Feature Changes

  • Configuration files can now be loaded from URLs.

Bugs Fixed

1.0.0b16 (2006-12-07)

Feature Changes

  • A new command-line argument, -U, suppresses reading user defaults.

  • You can now suppress use of an installed-part database (e.g. .installed.cfg) by specifying an empty value for the buildout installed option.

Bugs Fixed

  • When the install command is used with a list of parts, only those parts are supposed to be installed, but the buildout was also building parts that those parts depended on.

1.0.0b15 (2006-12-06)

Bugs Fixed

  • Uninstall recipes weren’t loaded correctly in cases where no parts in the (new) configuration used the recipe egg.

1.0.0b14 (2006-12-05)

Feature Changes

  • Added uninstall recipes for dealing with complex uninstallation scenarios.

Bugs Fixed

  • Automatic upgrades weren’t performed on Windows due to a bug that caused buildout to incorrectly determine that it wasn’t running locally in a buildout.

  • Fixed some spurious test failures on Windows.

1.0.0b13 (2006-12-04)

Feature Changes

  • Variable substitutions now reflect option data written by recipes.

  • A part referenced by a part in a parts list is now added to the parts list before the referencing part. This means that you can omit parts from the parts list if they are referenced by other parts.

  • Added a develop function to the easy_install module to aid in creating develop eggs with custom build_ext options.

  • The build and develop functions in the easy_install module now return the path of the egg or egg link created.

  • Removed the limitation that parts named in the install command can only name configured parts.

  • Removed support for ConfigParser-style variable substitutions (e.g. %(foo)s). Only the string-template style of variable substitutions (e.g. ${section:option}) will be supported. Supporting both violates “there’s only one way to do it”.

  • Deprecated the buildout-section extendedBy option.

Bugs Fixed

  • We treat setuptools as a dependency of any distribution that (declares that it) uses namespace packages, whether it declares setuptools as a dependency or not. This wasn’t working for eggs installed by virtue of being dependencies.

1.0.0b12 (2006-10-24)

Feature Changes

  • Added an initialization argument to the zc.buildout.easy_install.scripts function to include initialization code in generated scripts.

1.0.0b11 (2006-10-24)

Bugs Fixed

67737

Verbose and quiet output options caused errors when the develop buildout option was used to create develop eggs.

67871

Installation failed if the source was a (local) unzipped egg.

67873

There was an error in producing an error message when part names passed to the install command weren’t included in the configuration.

1.0.0b10 (2006-10-16)

Feature Changes

  • Renamed the runsetup command to setup. (The old name still works.)

  • Added a recipe update method. Now install is only called when a part is installed for the first time, or after an uninstall. Otherwise, update is called. For backward compatibility, recipes that don’t define update methods are still supported.

  • If a distribution defines namespace packages but fails to declare setuptools as one of its dependencies, we now treat setuptools as an implicit dependency. We generate a warning if the distribution is a develop egg.

  • You can now create develop eggs for setup scripts that don’t use setuptools.

Bugs Fixed

  • Egg links weren’t removed when corresponding entries were removed from develop sections.

  • Running a non-local buildout command (one not installed in the buildout) led to a hang if new versions of buildout or setuptools were available. Now we issue a warning and don’t upgrade.

  • When installing zip-safe eggs from local directories, the eggs were moved, rather than copied, removing them from the source directory.

1.0.0b9 (2006-10-02)

Bugs Fixed

Non-zip-safe eggs were not unzipped when they were installed.

1.0.0b8 (2006-10-01)

Bugs Fixed

  • Installing source distributions failed when using alternate Python versions (depending on the versions of Python used.)

  • Installing eggs wasn’t handled as efficiently as possible due to a bug in egg URL parsing.

  • Fixed a bug in runsetup that caused setup scripts that introspected __file__ to fail.

1.0.0b7

Added a documented testing framework for use by recipes. Refactored the buildout tests to use it.

Added a runsetup command to run a setup script. This is handy if, like me, you don’t install setuptools in your system Python.

1.0.0b6

Fixed https://launchpad.net/products/zc.buildout/+bug/60582: use of extension options caused bootstrapping to fail if the eggs directory didn’t already exist. We no longer use extensions for bootstrapping; there really isn’t any reason to anyway.

1.0.0b5

Refactored to do more work in buildout and less work in easy_install. This makes things go a little faster, makes errors a little easier to handle, and allows extensions (like the sftp extension) to influence more of the process. This was done to fix a problem in using the sftp support.

1.0.0b4

  • Added an experimental extensions mechanism, mainly to support adding sftp support to buildouts that need it.

  • Fixed buildout self-updating on Windows.

1.0.0b3

  • Added a help option (-h, --help)

  • Increased the default level of verbosity.

  • Buildouts now automatically update themselves to new versions of buildout and setuptools.

  • Added Windows support.

  • Added a recipe API for generating user errors.

  • No-longer generate a py_zc.buildout script.

  • Fixed some bugs in variable substitutions.

    The characters “-”, “.” and “ ” weren’t allowed in section or option names.

    Substitutions with invalid names were ignored, which caused misleading failures downstream.

  • Improved error handling. No longer show tracebacks for user errors.

  • Now require a recipe option (and therefore a section) for every part.

  • Expanded the easy_install module API to:

    • Allow extra paths to be provided

    • Specify explicit entry points

    • Specify entry-point arguments

1.0.0b2

Added support for specifying some build_ext options when installing eggs from source distributions.

1.0.0b1

  • Changed the bootstrapping code to only install setuptools and buildout. The bootstrap code no-longer runs the buildout itself. This was to fix a bug that caused parts to be recreated unnecessarily because the recipe signature in the initial buildout reflected temporary locations for setuptools and buildout.

  • Now create a minimal setup.py if it doesn’t exist and issue a warning that it is being created.

  • Fixed bug in saving installed configuration data. %’s and extra spaces weren’t quoted.

1.0.0a1

Initial public version
