Drupal Meets Fabric: Deploying Drupal and Automating Admin Tasks with Fabric

07 Nov · by Tim Kamanin · 3 min read

Yesterday, during a meeting with my Drupal friends, I showed them some Python magic, including Fabric (http://www.fabfile.org/) and how it can make your Drupal dev life easier. My friends were blown away ;) I promised to share my regular fabfile.py for a Drupal project, so you'll find it below. Just a note: this is really beginner's code, and I'm not responsible for any damage or loss it may cause. Okay, here it is:

import datetime
import os
from fabric.api import *

remote_home = '/var/www/vhosts/example.com/'  # remote root dir of your Drupal project. CHANGE THIS.

proj = {
    'name': 'example',  # alias of the project. CHANGE THIS.
    'remote_dir': remote_home + 'httpdocs',  # http dir of your Drupal project. CHANGE THIS.
}

env.hosts = ['']  # put your remote hosts here. CHANGE THIS.
env.passwords = {
    '': 'remote host password goes here if you wish',  # CHANGE THIS.
}
env.user = 'remote env user'  # your remote host login. CHANGE THIS.
env.warn_only = True


def push():
    """Push the local git repo."""
    local("git push", capture=False)


def pull():
    """Pull remote and merge it with your dev branch."""
    local("git checkout master && git pull && git checkout tim && git merge master", capture=False)


def staging_pull():
    """Pull git on the remote staging, update the Drupal db, revert features and clear caches. You're ready to go!"""
    with cd(proj['remote_dir']):
        run('git stash')
        run('git stash drop')
        run('git pull')
        run('drush updatedb -y')
        run('drush features-revert-all -y')
        run('drush cc all')


def deploy():
    """Deploy the whole stuff to the remote."""
    # Assumed sequence: push local commits, then pull and update staging.
    push()
    staging_pull()


def sql_dump(path='/tmp'):
    """Make a local sql dump."""
    time = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
    filename = '{}/{}_{}.sql.gz'.format(path, proj['name'], time)
    local('drush sql-dump --gzip > ' + filename)


def remote_sql_dump(path='/tmp'):
    """Make a dump on the remote host. Puts the resulting db into /tmp by default, or into the directory you specify."""
    print 'Starting remote db dump for "{}" project'.format(proj['name'])
    with cd(proj['remote_dir']):
        time = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
        filename = '{}/{}_{}.sql.gz'.format(path, proj['name'], time)
        run('drush sql-dump --gzip > ' + filename)
    print 'Exported DB for project "{}" to {}'.format(proj['name'], filename)
    return filename


def copy_file_from_remote(path, dest='/tmp'):
    """Get a file from the remote host."""
    print 'Copying file {} from remote to {} at local'.format(path, dest)
    return get(path, dest)


def remote_to_local_sql_dump(dest='/tmp'):
    """Make a dump on the remote and then copy it to local."""
    filepath = remote_sql_dump()
    copy_file_from_remote(filepath, dest)
    run('rm -f {}'.format(filepath))
    final_filepath = '{}/{}'.format(dest, filepath.split('/')[-1])
    print 'Finished dump and copy, you can now find your file at ' + final_filepath
    return final_filepath


def sql_sync():
    """Make a dump on the remote, copy it to local and import it into your local site."""
    filepath = remote_to_local_sql_dump('/tmp')
    with lcd(os.getcwd()):
        local('drush sql-drop && gunzip -c ' + filepath + ' | drush sql-cli')


def en_modules():
    """Enable some modules. In my case these are dev ones."""
    with lcd(os.getcwd()):
        local('drush en -y views_ui devel search_krumo get_form_id')
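The dump helpers above build a timestamped filename before shelling out to drush; that naming logic is plain Python and can be checked without Fabric or Drush installed. A minimal sketch reusing the same format strings:

```python
import datetime
import re

proj = {'name': 'example'}  # same project alias as in the fabfile


def dump_filename(path='/tmp'):
    # Mirrors the filename construction in sql_dump()/remote_sql_dump().
    time = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
    return '{}/{}_{}.sql.gz'.format(path, proj['name'], time)


filename = dump_filename()
print(filename)  # e.g. /tmp/example_2013-11-07-21-15-02.sql.gz
# Every generated name matches this pattern:
assert re.match(r'^/tmp/example_\d{4}(-\d{2}){5}\.sql\.gz$', filename)
```

Because the timestamp goes down to the second, repeated dumps never overwrite each other.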

1) Put this code into fabfile.py and place that file in your local Drupal root dir. 2) On the command line, type:

easy_install fabric

or

pip install fabric

If you don't have Fabric installed yet, this will download and install it for you; it only needs to be done once. 3) In the file, change the values on the lines marked _CHANGE THIS_ in the comments. 4) Now cd into your Drupal project dir and type:

fab --list

to get a list of available commands. 5) To add your own command, just add another Python function to the file; it's that easy! My favorite commands in this file are:

fab sql_sync

and

fab deploy
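As step 5 suggests, `fab --list` simply loads fabfile.py and treats its public module-level callables as tasks, which is why a new command is just a new function. A rough pure-Python illustration of that discovery step (my own sketch under that assumption, not Fabric's actual code):

```python
import types


def list_tasks(module_namespace):
    # Collect public, module-level functions, the way a new fab task
    # becomes visible as soon as you define it in fabfile.py.
    return sorted(
        name for name, obj in module_namespace.items()
        if isinstance(obj, types.FunctionType) and not name.startswith('_')
    )


# A stand-in for the fabfile's namespace:
def sql_sync(): pass
def deploy(): pass
def _helper(): pass  # leading underscore: hidden from the task list

print(list_tasks({'sql_sync': sql_sync, 'deploy': deploy, '_helper': _helper}))
# -> ['deploy', 'sql_sync']
```

Prefixing a helper with an underscore is the usual way to keep it out of the `fab --list` output while still calling it from other tasks.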


