Docker and Microservices
In the DevOps space it is hard to deny that Docker is currently the hot tool everyone is talking about. In this article I will give a quick introduction to what Docker is, why it is useful, and a real-life example of what it takes to deploy a Python web application with a Redis database, served in isolated Docker containers.
What is Docker?
Docker is a lightweight OS-level virtualisation tool that uses Linux kernel features to isolate, or contain, a service without the overhead of a full virtual machine. Although the underlying technology is not especially new or revolutionary, Docker is fast, robust and easy to use, and it has taken the DevOps world by storm.
What are Microservices?
Microservices are a great way to develop and deploy modern applications. They allow decoupling and separation of concerns within a piece of software. This means that different parts of an application can be created, deployed and even scaled separately, enabling quick and continuous updates and delivery of the application to live production systems.
How to Install Docker and Docker Compose on Ubuntu 16.04:
Docker has always been Linux-first, though more recently the Docker team has released a very attractive tool for macOS and Windows developers called the Docker Toolbox.
It is very easy to get started with Docker on Ubuntu:
wget -qO- https://get.docker.com/ | sh
sudo usermod -aG docker $(whoami)
Note: if you (rightly) don't trust third-party scripts, check Docker's own installation documentation here.
Then you need to install Docker Compose – a great utility that lets you define a multi-container application in a single file:
sudo pip install docker-compose
Let’s make a simple Python app
To demonstrate the power of Docker, we can create a simple Python web application that has a web frontend built with the Flask micro-framework and a Redis database:
from flask import Flask
from redis import Redis

app = Flask(__name__)
redis = Redis(host='redis', port=6379)

@app.route('/')
def hello():
    redis.incr('hits')
    return 'hello world! I have been seen %s times.\n' % redis.get('hits')

if __name__ == "__main__":
    app.run(host="0.0.0.0", debug=True)
This app simply says hello world and updates a counter every time the page is hit, saving the count to a Redis database.
To build this at run time we need this simple requirements.txt:

flask
redis
Note: if you were to run this app directly with "python app.py" you would get an error, as you probably do not have a Redis service running locally (unless you do!).
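If you want a quick sanity check of the counter logic without a live Redis server, you can substitute an in-memory stub. This is just a sketch – FakeRedis below is a hypothetical stand-in written for illustration, not part of the real redis library:

```python
class FakeRedis:
    """Minimal in-memory stand-in for the two Redis calls the app uses."""

    def __init__(self):
        self.store = {}

    def incr(self, key):
        # Redis INCR creates a missing key at 0, then increments it.
        self.store[key] = self.store.get(key, 0) + 1
        return self.store[key]

    def get(self, key):
        # Note: the real redis client returns bytes; the stub returns an
        # int to keep the example short.
        return self.store.get(key)

redis = FakeRedis()

def hello():
    redis.incr('hits')
    return 'hello world! I have been seen %s times.\n' % redis.get('hits')

print(hello())  # counter is now 1
print(hello())  # counter is now 2
```

Swapping the stub back for the real Redis client is then just a change to the `redis` variable, which is exactly the kind of seam that makes the service easy to containerise.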
Let’s deploy it with Docker
To serve this application as separate Docker containers, we first need a Dockerfile. This simply tells Docker to use the python:2.7 image, where to mount the code inside the container, and how to install the required libraries for our Python app.
FROM python:2.7
ADD . /code
WORKDIR /code
RUN pip install -r requirements.txt
Next we will orchestrate our containers in a multi-container environment using a docker-compose file, which describes the different services (i.e. web and redis):
version: '2'
services:
  web:
    build: .
    command: python app.py
    ports:
      - "5000:5000"
    volumes:
      - .:/code
    links:
      - redis
  redis:
    image: redis:latest
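Inside the Compose network the service name redis resolves via DNS, which is why the app can connect with host='redis'. One common tweak – an assumption on my part, not something in the files above – is to read the host from an environment variable so the same code also runs outside Compose:

```python
import os

def redis_host(default='redis'):
    # Inside docker-compose, the 'redis' service name resolves via DNS.
    # Outside compose you can override it, e.g. REDIS_HOST=localhost.
    return os.environ.get('REDIS_HOST', default)

print(redis_host())
```

In app.py you would then write `Redis(host=redis_host(), port=6379)` instead of hard-coding the hostname.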
Now all we need to do is run the multi-container Docker environment with a single command:

docker-compose up
That's it! The web and redis containers are built at run time, and you can access the app locally in your browser at http://0.0.0.0:5000.
As you can see, Docker is a very attractive tool that allows us to create isolated environments very quickly and with little overhead. I hope you found this tutorial useful. If you have any issues, feel free to contact me on Twitter.
You can find all the code for this tutorial in this github repository.