A few weeks ago, I released a simple Django app to manage models for simple newsletters. The package itself is blatantly useless unless it is used in a project that integrates it with other parts, so I created a fully equipped system that can create propaganda, assign it to subscribers and queue it, ready to be sent whenever I need to.

The architecture

First, I have been using Amazon SES for a while to send newsletters to customers, both in-house and for clients and third parties. It removes the hassle of maintaining SMTP servers, managing email infrastructure, configuring the network, and meeting rigorous Internet Service Provider (ISP) standards for email content. So we will take advantage of all of that and make our lives simpler and more enjoyable.

Second, we want smooth and efficient delivery through concurrent background tasks, seamless for the user and detached from the front-end business logic. Whenever we press the "send" button we expect the system to return a confirmation response immediately, regardless of what is happening in the background, instead of the request freezing while the email queue empties and ending up with a web server timeout. Celery, one of the most widely used solutions in the Python and Django world, provides an asynchronous task queue/job queue based on distributed message passing.

Last, but not least, because we love Django and Python, we will be using different libraries and Django apps that will allow us to build our stuff nicely.

Python and Django Requirements

Below is a list of packages, Python libraries and Django apps we will need to install (using pip in most cases) and configure following the instructions provided with each one, so I won't repeat those instructions here.

django-celery

This app provides Celery integration for Django. Celery is automatically installed as a dependency.

django-seacucumber

django-ses is the key piece for consuming the Amazon SES API, which it does through another Python package, Boto (automatically installed as a dependency as well). In this tutorial, though, we will be using django-seacucumber, an application that integrates django-ses with django-celery, so that email sent from our platform is routed through Amazon SES via Celery. How cool is that?
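
Once everything is wired up (see the settings below), routing through SES is transparent to our code. As a minimal sketch (the addresses are placeholders, and the sender must be verified on your SES account), a plain Django send_mail call is all it takes:

from django.core.mail import send_mail

# With EMAIL_BACKEND = 'seacucumber.backend.SESBackend' (see the settings
# below), this call is handed off to a Celery task and delivered through
# Amazon SES.
send_mail(
    'Hello from SES',
    'This message is routed through Amazon SES via Celery.',
    'youremail@example.com',  # placeholder; must be verified on SES
    ['subscriber@example.com'],  # placeholder
)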

django-mailer

This old and well-known application is in charge of queuing our email and keeping it ready for delivery when the time comes.
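
For instance, django-mailer exposes a send_mail function that mirrors Django's own but stores the message in a database queue instead of delivering it on the spot. A minimal sketch with placeholder addresses:

from mailer import send_mail

# Saved into django-mailer's database queue; nothing leaves the server
# until the queue is flushed (see the Celery task further down).
send_mail(
    'Weekly pamphlet',
    'The content of our propaganda.',
    'youremail@example.com',  # placeholder
    ['subscriber@example.com'],  # placeholder
)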

django-propaganda (optional)

This simple Django newsletter app takes care of handling our content (the Propaganda model) and its scheduled delivery (the Pamphlets model) to our subscribers (the Subscribers model). You can also choose from a wide range of other solutions, or write your own.

Non Python-related Requirements

Below is a list of non-Python software we will need to install (in most cases using package managers like apt-get on Debian-based systems or Homebrew on OS X) and configure following each project's documentation. Again, I won't repeat those instructions here.

Message broker

RabbitMQ is the default broker for Celery and the one I have used so far; I have no complaints at the moment.

Supervisor

Supervisor (also known as supervisord) is my chosen method to daemonize and run Celery on production systems, mainly because I can run multiple instances of Celery, one for each Django project running on the same machine, and restart processes selectively.

Project configuration

Here I provide what I consider a minimal configuration to integrate all the parts into a project serving the purpose of this post.

Settings

These are the Django settings you should have:

# Append to your existing apps
INSTALLED_APPS = (
    ...
    'propaganda',
    'mailer',
    'djcelery',
    'seacucumber',
    ...
)

# Route email through Amazon SES via Celery
EMAIL_BACKEND = 'seacucumber.backend.SESBackend'
MAILER_EMAIL_BACKEND = 'seacucumber.backend.SESBackend'
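
# Note: with EMAIL_BACKEND pointing straight at SES, plain
# django.core.mail calls skip the queue; if you want every outgoing
# email queued by django-mailer first, you could point EMAIL_BACKEND
# at 'mailer.backend.DbBackend' instead.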

# Log in to Amazon SES and get these
AWS_ACCESS_KEY_ID = 'your_key_id'  # Amazon Simple Email Service key ID
AWS_SECRET_ACCESS_KEY = 'your_access_key'  # Amazon Simple Email Service access key

# Must be an email authorized on Amazon SES
DEFAULT_FROM_EMAIL = 'youremail@example.com'

# Celery loader
import djcelery
djcelery.setup_loader()

# Default broker settings, change it if you need to
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
# Make sure this is unique on production systems with more than one
# Celery project using the same RabbitMQ instance
BROKER_VHOST = "/"
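
For instance, on a shared RabbitMQ server I would give each project its own user and vhost. The values below are illustrative only and must be created in RabbitMQ beforehand:

# Illustrative per-project broker settings
BROKER_USER = "mailing_prj"
BROKER_PASSWORD = "a_strong_password"
BROKER_VHOST = "/mailing_prj"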

Celery task

Assuming you have done your homework and your project is queuing emails with django-mailer, let's create a tiny task in a tasks.py file that will make the final delivery asynchronous, so we can call it from any part of our code:

from celery.task import task

@task()
def send_queued_emails(*args, **kwargs):
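    # Local import keeps the mailer app out of module import time;
    # send_all() delivers every message sitting in django-mailer's queue.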
    from mailer.engine import send_all
    send_all()
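
Calling send_queued_emails.delay() from anywhere in our code returns immediately and leaves delivery to the Celery worker. And since the deployment below starts celeryd with the -B flag (embedded celerybeat), we can also flush the queue on a schedule. A minimal sketch, assuming a one-minute interval suits you; the task name is mine, not part of any of the packages:

from datetime import timedelta

from celery.task import periodic_task


@periodic_task(run_every=timedelta(minutes=1))
def send_queued_emails_periodically():
    # celerybeat triggers this every minute to flush django-mailer's queue
    from mailer.engine import send_all
    send_all()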

Deployment

This is the celeryd.conf settings file I load to get Celery running via supervisord:

[program:mailing_prj__celery]
; Using virtualenv, as any reasonable python developer
command=/path/to/virtualenv/prj/bin/python /path/to/prj/manage.py celeryd -B -E -l info
directory=/path/to/prj/
numprocs=1
; I like to keep stdout and stderr in the same log
stdout_logfile=/path/to/prj/logs/celeryd.log
stderr_logfile=/path/to/prj/logs/celeryd.log
autostart=true
autorestart=true
startsecs=10

; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600

; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998

; If supervisord is run as root, we can run the subprocess as another user
user=your_virtualenv_user

In our supervisord configuration file, load the configuration file above:

[include]
files = /path/to/celeryd.conf

Run supervisord and, voilà, you can start sending emails asynchronously. Who needs MailChimp, SendGrid or Campaign Monitor?