Per-task arguments

The options given in Command-line options apply to the invocation of fab as a whole; even if the order is mixed around, options still apply to all given tasks equally. Additionally, since tasks are just Python functions, it’s often desirable to pass in arguments to them at runtime.

Answering both these needs is the concept of “per-task arguments”, which is a special syntax you can tack onto the end of any task name:

  • Use a colon (:) to separate the task name from its arguments;
  • Use commas (,) to separate arguments from one another (may be escaped by using a backslash, i.e. \,);
  • Use equals signs (=) for keyword arguments, or omit them for positional arguments.

Additionally, since this process involves string parsing, all values will end up as Python strings, so plan accordingly. (We hope to improve upon this in future versions of Fabric, provided an intuitive syntax can be found.)
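To make these parsing rules concrete, here is a rough sketch of how such a task spec could be split apart. This is illustrative only, not Fabric's actual parser:

```python
import re

def parse_task(spec):
    """Split 'taskname:arg1,key=val' into (name, args, kwargs)."""
    name, _, argstr = spec.partition(':')
    args, kwargs = [], {}
    # split on commas, but not on backslash-escaped commas
    chunks = re.split(r'(?<!\\),', argstr) if argstr else []
    for chunk in chunks:
        chunk = chunk.replace('\\,', ',')  # unescape \,
        if '=' in chunk:
            key, _, value = chunk.partition('=')
            kwargs[key] = value
        else:
            args.append(chunk)
    return name, args, kwargs
```

Note that everything comes out as a string, exactly as described above.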

For example, a “create a new user” task might be defined like so (omitting most of the actual logic for brevity):

def new_user(username, admin='no', comment="No comment provided"):
    log_action("New User (%s): %s" % (username, comment))

You can specify just the username:

$ fab new_user:myusername

Or treat it as an explicit keyword argument:

$ fab new_user:username=myusername

If both args are given, you can again give them as positional args:

$ fab new_user:myusername,yes

Or mix and match, just like in Python:

$ fab new_user:myusername,admin=yes

The log_action call above is useful for illustrating escaped commas, like so:

$ fab new_user:myusername,admin=no,comment='Gary\, new developer (starts Monday)'



Quoting the backslash-escaped comma is required, as not doing so will cause shell syntax errors. Quotes are also needed whenever an argument involves other shell-related characters such as spaces.

All of the above are translated into the expected Python function calls. For example, the last call above would become:

>>> new_user('myusername', admin='yes', comment='Gary, new developer (starts Monday)')
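Because every parsed value arrives as a string, tasks that expect booleans or numbers must convert explicitly. A sketch of one way to do that, using a hypothetical `str_to_bool` helper (not part of Fabric):

```python
def str_to_bool(value):
    """Map common command-line spellings of truth to a real boolean."""
    return value.lower() in ('yes', 'y', 'true', '1')

def new_user(username, admin='no', comment="No comment provided"):
    # admin arrives as the string 'yes'/'no' from the command line
    is_admin = str_to_bool(admin)
    return username, is_admin, comment
```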

Lazy connections

Because connections are driven by the individual operations, Fabric will not actually make connections until they’re necessary. Take for example this task which does some local housekeeping prior to interacting with the remote server:

from fabric.api import *

def clean_and_upload():
    local("find assets/ -name '*.DS_Store' -exec rm '{}' \\;")
    local('tar czf /tmp/assets.tgz assets/')
    put('/tmp/assets.tgz', '/tmp/assets.tgz')
    with cd('/var/www/myapp/'):
        run('tar xzf /tmp/assets.tgz')

What happens, connection-wise, is as follows:

  1. The two local calls will run without making any network connections whatsoever;
  2. put asks the connection cache for a connection to host1;
  3. The connection cache fails to find an existing connection for that host string, and so creates a new SSH connection, returning it to put;
  4. put uploads the file through that connection;
  5. Finally, the run call asks the cache for a connection to that same host string, and is given the existing, cached connection for its own use.

Extrapolating from this, you can also see that tasks which don’t use any network-borne operations will never actually initiate any connections (though they will still be run once for each host in their host list, if any.)
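The caching behaviour described in steps 2–5 can be sketched as a dictionary keyed by host string. This is illustrative only, not Fabric's actual implementation:

```python
class ConnectionCache:
    """Open at most one connection per host string; reuse it on later lookups."""
    def __init__(self, connect):
        self._connect = connect  # factory that would open a real SSH session
        self._cache = {}

    def __getitem__(self, host_string):
        if host_string not in self._cache:
            self._cache[host_string] = self._connect(host_string)
        return self._cache[host_string]

# Simulate the task above: two operations against the same host string.
opened = []
def fake_connect(host_string):
    opened.append(host_string)  # record that a "connection" was made
    return object()             # stand-in for a live connection

cache = ConnectionCache(fake_connect)
conn_for_put = cache['deploy@host1']  # step 3: cache miss, so it connects
conn_for_run = cache['deploy@host1']  # step 5: cache hit, reuses the object
```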

An example deployment fabfile using Fabric's older config-based syntax:

def staging():
    "Pushes current code to staging, hups Apache"
    # get the build number
    local('svn up mysite.com')
    config.svn_version = svn_get_version()
    if not config.svn_version:
        raise SystemExit('could not determine svn revision')
    config.static_path = '/var/www/static.mysite.com'
    config.svn_path = 'http://svn.mysite.com/trunk'
    config.svn_export = 'svn export -q -r %(svn_version)s'
    run('mkdir %(path)s', fail='abort')
    # svn export mysite.com to path
    run('%(svn_export)s %(svn_path)s/mysite.com %(path)s/mysite.com', fail='abort')
    # svn export site-packages to site-packages
    run('%(svn_export)s %(svn_path)s/site-packages %(path)s/site-packages', fail='abort')
    # svn export scripts
    run('%(svn_export)s %(svn_path)s/scripts %(path)s/scripts', fail='warn')
    # svn export configs
    run('%(svn_export)s %(svn_path)s/config %(path)s/config', fail='abort')
    # export static files to %(path)s/static
    run('%(svn_export)s %(svn_path)s/static %(path)s/static', fail='abort')
    # symlink the staging menu-item images into the new release dir
    run("rm -r %(path)s/static/images/menuitems", fail='abort')
    run("ln -s %(static_path)s/menuitems_staging %(path)s/static/images/menuitems", fail='abort')
    # rotate "staging" symlinks
    run('rm %(releases_path)s/staging.rollback', fail='warn')
    run('mv %(releases_path)s/staging %(releases_path)s/staging.rollback', fail='warn')
    # point the staging symlink at the new release
    run('ln -s %(path)s %(releases_path)s/staging', fail='abort')
    # the server is hup'd separately via the hup task

def rm_cur_rev():
    config.svn_version = svn_get_version()
    run('rm -rf %(path)s', fail='abort')

def hup():
    sudo('/etc/init.d/apache2 restart')
    sudo('/etc/init.d/nginx restart')

def svn_get_version():
    from subprocess import Popen, PIPE
    output = Popen(["svn", "info", "mysite.com"], stdout=PIPE).communicate()[0]
    return output.partition('Revision: ')[2].partition('\n')[0]

config.fab_hosts = ['mysite.com']
config.fab_user = 'builder'
config.releases_path = '/var/www_apps/mysite.com'
config.path = '%(releases_path)s/releases/%(svn_version)s'
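The svn_get_version helper relies on str.partition, which splits a string around the first occurrence of a separator. A standalone check against made-up `svn info` output (the revision number here is illustrative):

```python
sample = (
    "Path: mysite.com\n"
    "URL: http://svn.mysite.com/trunk\n"
    "Revision: 1432\n"
    "Node Kind: directory\n"
)
# Everything after 'Revision: ', then everything before the next newline
revision = sample.partition('Revision: ')[2].partition('\n')[0]
```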

Fabric, Django, Git, Apache, mod_wsgi, virtualenv and pip deployment

I’ve been playing with automating Django deployments again, this time using Fabric. I found a number of examples on the web but none of them quite fit the bill for me. I don’t like serving directly from a repository; I like to have either a package or tarball I can point to and say “that is what went to the server”. I also like having a quick rollback command, as well as being able to deploy a particular version of the code when the need arises. Finally, I wanted to go from a clean Ubuntu install (plus SSH) to a running Django application in one command from the local development machine. The Apache side of things is nicely documented in this Gist, which made a good starting point.

I’m still missing a few things in this setup, mind, and at the moment you still have to set up your local machine yourself. I’m probably going to create a Paster template and another fabfile to do that. The instructions are a little rough as well at the moment, and I’ve left the database out of it, as everyone has their own preference.

This particular fabfile makes setting up and deploying a Django application much easier, but it does make a few assumptions: namely that you’re using Git, Apache and mod_wsgi, and that you’re on Debian or Ubuntu. You should also have Django installed on your local machine, and SSH installed on both the local machine and any servers you want to deploy to.

Note that I’ve used the name project_name throughout this example. Replace it with whatever your project is called.

First step is to create your project locally:

mkdir project_name
cd project_name
django-admin.py startproject project_name

Now add a requirements file so pip knows to install Django. You’ll probably add other required modules here later. Create a file called requirements.txt and save it at the top level with the following contents:

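The file contents were lost here; at a minimum it needs one line naming Django (pin an exact version if you want reproducible deploys):

```
Django
```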

Then save this fabfile.py file in the top level directory which should give you:


You’ll need a WSGI file called project_name.wsgi, where project_name is the name you gave to your Django project. It will probably look like the following, depending on your specific paths and the location of your settings module:

import os
import sys
# put the Django project on sys.path
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../")))
os.environ["DJANGO_SETTINGS_MODULE"] = "project_name.settings"
from django.core.handlers.wsgi import WSGIHandler
application = WSGIHandler()

Last but not least, you’ll want a virtual host file for Apache that looks something like the following. Save it as project_name in the inner directory. You’ll want to change /path/to/project_name/ to the location on the remote server you intend to deploy to.

<VirtualHost *:80>
        WSGIDaemonProcess project_name-production user=project_name group=project_name threads=10 python-path=/path/to/project_name/lib/python2.6/site-packages
        WSGIProcessGroup project_name-production
        WSGIScriptAlias / /path/to/project_name/releases/current/project_name/project_name.wsgi
        <Directory /path/to/project_name/releases/current/project_name>
            Order deny,allow
            Allow from all
        </Directory>
        ErrorLog /var/log/apache2/error.log
        LogLevel warn
        CustomLog /var/log/apache2/access.log combined
</VirtualHost>
Now create a file called .gitignore, containing the following. This prevents the compiled Python code from being included in the repository and in the archive we use for deployment.

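The listing was lost here; given that the stated goal is excluding compiled Python code, the file presumably contains just:

```
*.pyc
```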

You should now be ready to initialise a git repository in the top level project_name directory.

git init
git add .gitignore project_name
git commit -m "Initial commit"

All of that should leave you with


In reality you might prefer to keep your WSGI files and virtual host files elsewhere. The fabfile has a variable (config.virtualhost_path) for this case. You’ll also want to set the hosts you intend to deploy to (config.hosts), as well as the user (config.user).

The first task we’re interested in is called setup. It installs all the required software on the remote machine, then deploys your code and restarts the webserver.

fab local setup

After you’ve made a few changes and committed them to the master Git branch, you can run the following to deploy the changes.

fab local deploy

If something is wrong then you can rollback to the previous version.

fab local rollback

Note that this only allows you to roll back to the release immediately before the latest one. If you want to pick an arbitrary release, you can use the following, where 20090727170527 is the timestamp of an existing release.

fab local deploy_version:20090727170527

If you want to ensure your tests run before you make a deployment then you can do the following.

fab local test deploy

The actual fabfile looks like this. I’ve uploaded a Gist of it, along with the docs, so if you want to improve it please clone it.

# globals
config.project_name = 'project_name'

# environments
def local():
    "Use the local virtual server"
    config.hosts = ['']
    config.path = '/path/to/project_name'
    config.user = 'garethr'
    config.virtualhost_path = "/"

# tasks
def test():
    "Run the test suite and bail out if it fails"
    local("cd $(project_name); python manage.py test", fail="abort")

def setup():
    """
    Setup a fresh virtualenv as well as a few useful directories, then run
    a full deployment
    """
    require('hosts', provided_by=[local])
    sudo('aptitude install -y python-setuptools')
    sudo('easy_install pip')
    sudo('pip install virtualenv')
    sudo('aptitude install -y apache2')
    sudo('aptitude install -y libapache2-mod-wsgi')
    # we want rid of the default apache config
    sudo('cd /etc/apache2/sites-available/; a2dissite default;')
    run('mkdir -p $(path); cd $(path); virtualenv .;')
    run('cd $(path); mkdir releases; mkdir shared; mkdir packages;', fail='ignore')
    deploy()

def deploy():
    """
    Deploy the latest version of the site to the servers, install any
    required third party modules, install the virtual host and
    then restart the webserver
    """
    require('hosts', provided_by=[local])
    import time
    config.release = time.strftime('%Y%m%d%H%M%S')
    upload_tar_from_git()
    install_requirements()
    install_site()
    symlink_current_release()
    migrate()
    restart_webserver()

def deploy_version(version):
    "Specify a specific version to be made live"
    require('hosts', provided_by=[local])
    config.version = version
    run('cd $(path); rm releases/previous; mv releases/current releases/previous;')
    run('cd $(path); ln -s $(version) releases/current')
    restart_webserver()

def rollback():
    """
    Limited rollback capability. Simply loads the previously current
    version of the code. Rolling back again will swap between the two.
    """
    require('hosts', provided_by=[local])
    run('cd $(path); mv releases/current releases/_previous;')
    run('cd $(path); mv releases/previous releases/current;')
    run('cd $(path); mv releases/_previous releases/previous;')
    restart_webserver()

# Helpers. These are called by other functions rather than directly
def upload_tar_from_git():
    "Create an archive from the current Git master branch and upload it"
    require('release', provided_by=[deploy, setup])
    local('git archive --format=tar master | gzip > $(release).tar.gz')
    run('mkdir $(path)/releases/$(release)')
    put('$(release).tar.gz', '$(path)/packages/')
    run('cd $(path)/releases/$(release) && tar zxf ../../packages/$(release).tar.gz')
    local('rm $(release).tar.gz')

def install_site():
    "Add the virtualhost file to apache"
    require('release', provided_by=[deploy, setup])
    sudo('cd $(path)/releases/$(release); cp $(project_name)$(virtualhost_path)$(project_name) /etc/apache2/sites-available/')
    sudo('cd /etc/apache2/sites-available/; a2ensite $(project_name)')

def install_requirements():
    "Install the required packages from the requirements file using pip"
    require('release', provided_by=[deploy, setup])
    run('cd $(path); pip install -E . -r ./releases/$(release)/requirements.txt')

def symlink_current_release():
    "Symlink our current release"
    require('release', provided_by=[deploy, setup])
    run('cd $(path); rm releases/previous; mv releases/current releases/previous;', fail='ignore')
    run('cd $(path); ln -s $(release) releases/current')

def migrate():
    "Update the database"
    run('cd $(path)/releases/current/$(project_name); ../../../bin/python manage.py syncdb --noinput')

def restart_webserver():
    "Restart the web server"
    sudo('/etc/init.d/apache2 restart')
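The config.release value that deploy generates is a plain wall-clock timestamp. Because it is fixed-width and year-first, release directories sort chronologically, which is what makes the previous/current symlink rotation and deploy_version work:

```python
import time

# e.g. '20090727170527' -- 14 digits, sorts lexicographically by date
release = time.strftime('%Y%m%d%H%M%S')
```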

List of available env options

I extracted this list from state.py (0.9b1). Or view the tip version

env.reject_unknown_hosts = True         # reject unknown hosts
env.disable_known_hosts = True          # do not load user known_hosts file
env.user = 'username'                   # username to use when connecting to remote hosts
env.password = 'mypassword'             # password for use with authentication and/or sudo
env.hosts = ['host1.com', 'host2.com']  # comma-separated list of hosts to operate on
env.roles = ['web']                     # comma-separated list of roles to operate on
env.key_filename = 'id_rsa'             # path to SSH private key file. May be repeated.
env.fabfile = '../myfabfile.py'         # name of fabfile to load, e.g. 'fabfile.py' or '../other.py'
env.warn_only = True                    # warn, instead of abort, when commands fail
env.shell = '/bin/sh'                   # specify a new shell, defaults to '/bin/bash -l -c'
env.rcfile = 'myfabconfig'              # specify location of config file to use
env.hide = ['everything']               # comma-separated list of output levels to hide
env.show = ['debug']                    # comma-separated list of output levels to show
env.version = '1.0'
env.sudo_prompt = 'sudo password:'
env.use_shell = False
env.roledefs = {'web': ['www1', 'www2', 'www3'],
                'dns': ['ns1', 'ns2']}
env.cwd = 'mydir'
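env.roledefs maps role names to host lists, and a roles list expands into the combined hosts of those roles. Conceptually (a sketch, not Fabric's actual code):

```python
roledefs = {'web': ['www1', 'www2', 'www3'],
            'dns': ['ns1', 'ns2']}

def hosts_for(roles):
    """Expand a list of role names into the combined, ordered host list."""
    return [host for role in roles for host in roledefs[role]]
```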