jeanphix

Django, heroku & S3 FTW

08 Feb 2012

Outdated: Heroku now automatically runs collectstatic on deployment. By the way, I encourage you to put all your environment-specific configs (like database DSN, debug flags...) into environment variables.

Today I'll share a few tips taken from deploying a Django project to Heroku (application) and S3 (statics).

Setting up environments

First of all, I need two remote environments (production and staging) with distinct settings. In order to fit my git workflow, I had to create three local branches: master (production), staging, and development.

Project bootstrap:

$ pip install django gunicorn psycopg2
$ pip freeze > requirements.txt
$ git init
$ django-admin.py startproject myproject
$ python myproject/manage.py startapp myapp

Then in myproject/settings.py I add ‘gunicorn’ to INSTALLED_APPS.
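For reference, this is a one-line change (a sketch; the contrib apps listed are only an example, your project may differ):

# myproject/settings.py
INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.staticfiles',
    'gunicorn',
    'myapp',
)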

$ echo 'web: python myproject/manage.py run_gunicorn -b "0.0.0.0:$PORT" -w 3' > Procfile
$ git add .
$ git commit -m 'Initial commit'

Production:

$ heroku create --stack cedar --remote production
$ git push production master

Staging:

$ git checkout -b staging
$ heroku create --stack cedar --remote staging
$ git push staging master
$ git checkout -b development

Then the workflow is as easy as merging changes up from development into staging (pushed to the staging remote), then from staging into master (pushed to the production remote).
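For example (a sketch using the branch and remote names created above; Heroku only builds what lands on its master ref, hence the staging:master refspec):

$ git checkout staging
$ git merge development
$ git push staging staging:master
$ git checkout master
$ git merge staging
$ git push production master
$ git checkout development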

Now I need a way to set up environment-specific settings; here is the pattern I used:

myproject/settings.py:

import os
import imp
import socket
import subprocess


LOCAL_HOSTNAMES = ('myhost',)
HOSTNAME = socket.gethostname()

PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__))

def get_environment_file_path(env):
    return os.path.join(PROJECT_ROOT, 'config', '%s.py' % env)

# Pick the active environment: an explicit APP_ENV always wins; on a local
# machine, fall back to the current git branch when a matching config exists.
if 'APP_ENV' in os.environ:
    ENV = os.environ['APP_ENV']
elif HOSTNAME in LOCAL_HOSTNAMES:
    branch = subprocess.check_output(
        ['git', 'rev-parse', '--abbrev-ref', 'HEAD']).strip('\n')
    if os.path.isfile(get_environment_file_path(branch)):
        ENV = branch
    else:
        ENV = 'development'
else:
    # Neither APP_ENV nor a known local hostname: fail loudly rather than
    # raising a NameError below.
    exit("Unable to determine the environment, please set APP_ENV")

# Load config/<env>.py and expose its settings at module level.
try:
    config = imp.load_source('env_settings', get_environment_file_path(ENV))
    from env_settings import *
except IOError:
    exit("No configuration file found for env '%s'" % ENV)

Then I can put my common settings into settings.py and the environment-specific ones into config/{branch}.py.
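For instance, a hypothetical myproject/config/staging.py could be as small as:

# myproject/config/staging.py (hypothetical example)
# Everything that differs between environments (debug flags, database DSN,
# S3 bucket...) belongs in files like this one.
DEBUG = False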

To get it working on the remote environments, just set APP_ENV this way:

$ heroku config:add APP_ENV=staging --remote staging
$ heroku config:add APP_ENV=production --remote production
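The result can be double-checked at any time with heroku config, which lists an app's config vars:

$ heroku config --remote staging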

That's it: the application now switches to the appropriate config file for the current branch, and this can be overridden by setting the APP_ENV environment variable.

django-compressor feat. django-storages

The other tip I'll show you here is how to configure django-compressor and django-storages to work together on the environments set up above.

The behaviours I need:

* When settings.DEBUG is True:
** statics have to be delivered by the application (or heroku nginx on remotes).
** compressor has to be disabled
* When settings.DEBUG is False:
** statics have to be delivered by the Amazon S3 CDN.
** compressor has to be enabled and has to upload compressed assets to S3

As compressor uses “locally collected” statics, I need to collect them locally on the remotes too. Let's create a custom storage class, myproject/myapp/storage.py:

from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    """S3 storage backend that saves the files locally, too.
    """
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.CompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        # Also keep a local copy, so compressor can work from the
        # "locally collected" statics mentioned above.
        self.local_storage._save(name, content)
        return name

Then I add settings this way:

LOCAL_HOSTNAMES = ('myhost',)
# Statics
STATIC_URL = '/static/'

STATIC_ROOT = os.path.join(PROJECT_ROOT, 'static')

MEDIA_ROOT = os.path.join(STATIC_ROOT, 'media')
MEDIA_UPLOAD_ROOT = os.path.join(MEDIA_ROOT, 'uploads')

MEDIA_URL = STATIC_URL + 'media/'

# Compressor
COMPRESS_ENABLED = DEBUG is False
if COMPRESS_ENABLED:
    COMPRESS_CSS_FILTERS = [
        'compressor.filters.css_default.CssAbsoluteFilter',
        'compressor.filters.cssmin.CSSMinFilter',
    ]
    COMPRESS_STORAGE = 'myapp.storage.CachedS3BotoStorage'
    COMPRESS_URL = STATIC_URL
    COMPRESS_OFFLINE = True

# Storages
if not DEBUG and HOSTNAME in LOCAL_HOSTNAMES:
    STATICFILES_STORAGE = 'myapp.storage.CachedS3BotoStorage'

Then add these lines to myproject/config/development.py:

if not DEBUG:
    STATIC_URL = 'https://xxx-dev.s3.amazonaws.com/'

AWS_ACCESS_KEY_ID = "xxxxxxxxxxxxx"
AWS_SECRET_ACCESS_KEY = "xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
AWS_STORAGE_BUCKET_NAME = "xxx-dev"

And modify the Procfile this way:

web: python myproject/manage.py collectstatic --noinput; python myproject/manage.py compress; python myproject/manage.py run_gunicorn -b "0.0.0.0:$PORT" -w 3

OK, so now, to collect the statics to S3, I just need to set myproject.config.{branch}.DEBUG to False and then run:

$ python myproject/manage.py collectstatic --noinput

Note: collecting statics via a worker is probably a better practice, but it will increase your Heroku bill…
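Another option (my own variation, not something from the setup above) is to drop the collectstatic/compress steps from the Procfile and run them in a one-off dyno right after each push:

$ heroku run "python myproject/manage.py collectstatic --noinput && python myproject/manage.py compress" --remote staging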

Hope this will help you.

See you.