How to Dockerize a Django Web Application

Hey there, welcome! In this article, I am going to teach you how to package and distribute an API built with Django and Django REST Framework, using Docker containers.

For this article, I used an API I built in part 1 and part 2 of my articles on Django REST Framework (DRF). However, if you already have an existing Django project you want to containerize right away, you can also follow along as we go, step by step.

What is Docker?

Docker is an open platform that performs Operating System-level virtualization, also known as containerization.

Container Vs Virtual Machine — Image Credit: Docker

To help you put this into perspective, let’s compare Docker to a Virtual Machine (VM) solution like VirtualBox:

A VM takes physical resources like RAM, CPU, and network cards and “slices and dices” them into virtual resources. It then presents those virtual resources, on top of a hypervisor, as smaller Virtual Machines that look and feel like a normal physical computer, onto which you can install a guest Operating System.

Docker, on the other hand, “slices and dices” Operating System resources, not physical resources, into what are called containers.

Given this definition, I can authoritatively tell you that Docker containers are different from VMs.

Then, what is a container?

Image credit: Docker

A container image is a lightweight, stand-alone, executable package of a piece of software that includes everything needed to run it: code, runtime, system tools, system libraries, settings. Available for both Linux and Windows based apps, containerized software will always run the same, regardless of the environment. Containers isolate software from its surroundings, for example differences between development and staging environments and help reduce conflicts between teams running different software on the same infrastructure — Docker documentation.

Let’s put the above concept of a container into perspective with this scenario;

Imagine that you have finished developing your API on an Ubuntu Linux box. You have tested it on your local Ubuntu machine and it works well. Then your boss asks you to deploy it on a production machine running Fedora Linux.

Image credit: derickbailey.com

After some good hours setting the API up on the production machine, you ask your boss to try it out.

She/he later comes to your desk and says, “It’s not working!”.

Wait! But it works on your Ubuntu dev machine! “Now, what could be the problem?”, you ask yourself.

It then hits you that there are a couple of dependencies you need to install on the Fedora production machine to make the API work like it does on your local Ubuntu machine. Aaaaah, all that hassle and headache all over again! Is there a solution to this problem?

I want to tell you that you can solve the problem in the above scenario by packaging the API (plus all its dependencies) with a “slice” of your OS (Ubuntu in this case) and deploying it on the production server. Nice, right?! It should now work as if it were running on your local machine. This action is called containerization (packaging your application into a container).

Enough talking, let’s Dockerize the API now.

Setup Docker

Before creating a container for the API (or your Django application) and shipping it off, you need to install Docker on your local machine. For learning purposes, you will install Docker Community Edition. Select your OS from the list below and follow the setup instructions:

  1. macOS
  2. Windows
  3. Ubuntu
  4. Debian
  5. CentOS
  6. Fedora

Create a Dockerfile

With Docker set up on your local machine, the next thing to do is create a Dockerfile. The Dockerfile contains a set of instructions that tell Docker how to build a container image for your application.

With that said, now go ahead and add a file named Dockerfile, with no file extension, to your project root directory. As a reference, here is the directory structure of my project with the Dockerfile added to the root directory of my API project from part 1 and part 2 on DRF.

api/
    Dockerfile
    README.md
    requirements.txt
    manage.py
    api/
        __init__.py
        settings.py
        urls.py
        wsgi.py
    music/
        migrations/
            __init__.py
        __init__.py
        admin.py
        apps.py
        models.py
        tests.py
        views.py
    venv/

Open the Dockerfile in your editor and add the following lines;
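The original embedded snippet is reproduced here as a sketch that follows the directives explained in the rest of this section; music_service is this project’s folder name, so adjust it for your own project:

```dockerfile
# Base the container image on the official Python 3.6 image from Docker Hub
FROM python:3.6

# Send Python output straight to the terminal instead of buffering it
ENV PYTHONUNBUFFERED 1

# Create a folder in the container for the application files
RUN mkdir /music_service

# Execute all following directives inside this directory
WORKDIR /music_service

# Copy the project files from the build context into the container
ADD . /music_service/

# Install the Python dependencies
RUN pip install -r requirements.txt
```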

The first directive in the Dockerfile, FROM python:3.6, tells Docker which image to base our container on. We use the official Python image from Docker Hub, which comes with Python and Linux set up and ready for use in a Python project.

The next directive, ENV PYTHONUNBUFFERED 1, sets an environment variable that instructs Python not to buffer its output, but to send it straight to the terminal.

A directive that starts with RUN instructs Docker to execute whatever command comes after it, as if you were executing it in a terminal on a server.

In the Dockerfile above, RUN mkdir /music_service instructs Docker to create a folder in the container called music_service, which we shall use to store our application files.

The directive that starts with WORKDIR sets the working directory and all the directives that follow in the Dockerfile will be executed in that directory.

In the Dockerfile above, I set the working directory to music_service. I then ADD all the files in my project root directory, where the Dockerfile is, to the music_service directory in the container. The ADD directive copies files and directories from the source to the destination specified in the directive.

The last directive instructs Docker to RUN the pip command to install the requirements listed in requirements.txt.
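For reference, a requirements.txt for a DRF project like this one would list at least Django and Django REST Framework; the exact version pins below are illustrative, not the ones from the original project:

```
Django==2.1
djangorestframework==3.9
```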

With that, we have our Dockerfile for the container image.

Docker Compose

Compose is a tool for defining and running multi-container Docker applications. — Docker documentation

A typical API deployment to a production environment will require more than one container: a separate container for the web server and a separate container for the database server. Docker Compose lets you specify how the containers should be built and connected, and manage them with a single command.

On Docker for Mac and Docker for Windows, Docker Compose comes pre-installed, so you’re good to go. On Linux systems, you need to install it directly.

You can check that you have it installed by issuing the command docker-compose -v in your terminal. The output of this command should be the version of Docker Compose installed on your machine.

If you have Docker Compose on your development machine, go ahead and create a docker-compose.yml file in your root directory, where the Dockerfile resides.

Open the docker-compose.yml file and add the following lines;
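A docker-compose.yml matching the directives described below might look like this; the runserver command and the container name are assumptions based on this project’s layout, so adjust them for yours:

```yaml
version: '3'

services:
  web:
    # Build the image from the Dockerfile in the project root
    build: .
    # Default command executed when the container starts
    command: python manage.py runserver 0.0.0.0:8000
    # Name the container explicitly instead of letting Docker pick one
    container_name: music_service
    # Keep the container's copy of the code in sync with the project folder
    volumes:
      - .:/music_service
    # Map port 8000 on the host to port 8000 in the container
    ports:
      - "8000:8000"
```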

The first line in the docker-compose.yml file specifies which syntax version of Docker Compose you want to use.

Next, we define a service called web. The build directive tells Docker Compose to build an image from the files in the project root directory. The command directive is the default command that will be executed when Docker runs the container image.

The container_name directive assigns a name to the container; if no name is specified, Docker assigns the container a random name. The volumes directive mounts the project root directory onto the container’s music_service folder. In essence, this ensures that when I edit any file in the project folder, the corresponding file in the container is updated immediately.

Lastly, we expose the port we want to access the container on, using the ports directive.

With that done, you can now build and run the container with the command docker-compose up. If your build is successful, open your browser and access your API or application. In my case, I access all my songs through the API at 127.0.0.1:8000/api/v1/songs. You can also try to access the Django admin at 127.0.0.1:8000/admin. If you can see the login page, then your container is up and running.
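The whole build-and-run step is a single command issued from the project root; these are standard Compose invocations, shown here for convenience:

```shell
# Build the image (on first run) and start the service(s)
docker-compose up

# Or force a rebuild and run the containers in the background
docker-compose up --build -d
```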

Takeaways

Apart from solving the “it works on my machine, but does not work on the production machine” problem, other benefits of packaging software in containers are:

  1. Flexibility: Even the most complex applications can be containerized.
  2. Lightweight: Containers leverage and share the host kernel.
  3. Interchangeable: You can deploy updates and upgrades on-the-fly.
  4. Portability: You can build locally, deploy to the cloud, and run anywhere.
  5. Scalability: You can increase and automatically distribute container replicas.
  6. Stackable: You can stack services vertically and on-the-fly.

Additional resources

The most authoritative documentation I can refer you to is the official Docker documentation. It explains in depth everything about Docker you need to know.

The Dockerfile reference here provides a complete list of all directives you can put in your Dockerfile to create images for your container.

The docker-compose file reference here provides a complete list of all directives you can put in your docker-compose.yml file to make creating multi-container images easy for you.

Lastly, you can also go to the GitHub repo of the API used in this article. Use it as a reference for creating Docker images for your own projects.
