
Automate Django Deployments with Fabfile

2022-07-21

Fabric is a Python library and command-line tool with the ability to execute commands on a remote server. It is designed to use SSH to execute system administration and deployment tasks on one or more remote machines.

Usage

  • To automate administration tasks and deployments.
  • Typical usage involves creating a Python module containing one or more functions, then executing them via the fab command-line tool.

Installation

This post uses the classic Fabric 1.x API (fabric.api), which runs on Python 2.5–2.7. The easiest ways to install Fabric are via pip, easy_install, or your system's package manager:

$ pip install fabric            (OR)
$ sudo easy_install fabric      (OR)
$ sudo apt-get install fabric
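
Once installed, you can confirm that the fab command-line tool is available on your PATH:

$ fab --version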

Fabric Functions

Now, let's look at some of the functions Fabric provides in the fabric.api module:

local   # Execute a command locally.
run     # Execute a command on the remote host(s) with user-level permissions.
sudo    # Execute a command on the remote host with sudo (i.e. superuser) privileges.
put     # Copy a local file to a remote destination.
get     # Download a file from the remote server.
prompt  # Prompt the user with text and return the input (like raw_input).
reboot  # Reboot the remote system, disconnect, and wait the given number of seconds before reconnecting.
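
As a rough sketch of how a few of these fit together (the paths, package name and wait time below are placeholders, not part of the original post), a single task might upload a settings file, install a package with superuser privileges, pull back a log file, and optionally reboot the machine:

from fabric.api import put, get, sudo, prompt, reboot

def maintain():
    # Copy a local file to the remote server
    put('local_settings.py', '/remote/project/local_settings.py')

    # Run a command with superuser privileges
    sudo('apt-get install -y nginx')

    # Download a remote log file for local inspection
    get('/var/log/nginx/error.log', 'error.log')

    # Ask the user before rebooting (like raw_input)
    answer = prompt('Reboot the server? (yes/no)', default='no')
    if answer == 'yes':
        reboot(wait=60)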

Creating your first fabfile

By default, the fab tool looks for a file named either fabfile or fabfile.py. This is the file where you write the functions and roles that you then execute through Fabric.

The fabfile should be in the same directory where you run the Fabric tool.
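
If the fabfile lives elsewhere, the fab tool also accepts an explicit path with the -f option (shown here with a placeholder file and task name):

$ fab -f /path/to/mytasks.py some_task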

First, create a file named fabfile.py and then start writing your functions. Below is a small “fabfile”:

from fabric.api import local

def local_uname():
    local('uname -a')

The above function can be run using the following command:

$ fab local_uname
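
Tasks can also take arguments; with Fabric 1.x they are passed on the command line after a colon. A small sketch (the greet task and its name parameter are illustrative only, not from the original post):

from fabric.api import local

def greet(name='world'):
    local('echo "Hello, %s"' % name)

which would be invoked as:

$ fab greet:name=Fabric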

Connecting to remote servers

Here, we will use the env variables to manage the information needed to connect to remote servers.

# First we import the Fabric api
from fabric.api import env, run

# We can then specify host(s) and run the same commands across those systems
env.user = 'username'
env.key_filename = ['keyfile.pem']
env.hosts = ['servername']

def uname():
    run("uname -a")

'user' is the username used to log in remotely to the servers, 'hosts' is the list of hosts to connect to, and 'key_filename' may be a string or list of strings referencing file paths to SSH key files to try when connecting.

Finally, when you run the fabfile, you can see that the uname() function runs successfully on the remote server.
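
Hosts can also be supplied (or overridden) at invocation time instead of hard-coding env.hosts; Fabric's -H flag takes a comma-separated host list (the hostnames here are placeholders):

$ fab -H web1.example.com,web2.example.com uname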

Roles

Inside the fabfile, we can assign roles to sets of servers and run functions only on the servers in a given role.

from fabric.api import *

env.roledefs = {
    'webservers': ['www1', 'www2'],
    'databaseservers': ['db1', 'db2']
}

env.user = 'fabuser'

# Restrict the functions to the 'webservers' role
@roles('webservers')
def list_directories():
    run('ls /home')


@roles('webservers')
def create_directory():
    run("mkdir /home/tmp/")

Finally, the code below shows a set of Fabric tasks to manage a Django deployment.

Deploy:

Push the latest code to the server, collect all static files, run makemigrations and migrate, sync the database, and restart the web server.

from fabric.api import *
from fabric.contrib.project import rsync_project
from fabric.contrib.files import exists

@hosts(['exampleserver.com'])
def deploy():
    # Create a directory on the remote server, if it doesn't already exist
    if not exists('path/to/project'):
        run('mkdir -p path/to/project')

    # Create a virtualenv, if it doesn't already exist
    if not exists('path/to/project/env'):
        with cd('path/to/project'):
            run('virtualenv env')

    # Sync the remote directory with the current project directory.
    rsync_project(local_dir='/local/project/dir/', remote_dir='/remote/project/dir/',
                  exclude=['.git'])

    # Activate the environment and install requirements
    with prefix('source path/to/project/env/bin/activate'):
        run('pip install -r path/to/project/requirements_file.txt')

        with cd('path/to/project'):
            # Collect all the static files
            run('python manage.py collectstatic --noinput')

            # Migrate and update the database
            run('python manage.py makemigrations')
            run('python manage.py migrate')
            run('python manage.py syncdb')

    # Restart the nginx server (needs superuser privileges)
    sudo('service nginx restart')
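
Since the host is fixed with the @hosts decorator, the whole deployment can then be kicked off with a single command:

$ fab deploy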