
Docker Compose with Python and PostgreSQL

Stefano Passador


Docker helps developers build apps while removing many of the headaches around platform compatibility and dependencies.

What I will show in this simple tutorial is the creation of a Docker application similar to the one available here. The main difference is that my version uses a PostgreSQL database instead of a Redis one.

The prerequisite of the tutorial is to have Docker Engine and Docker Compose installed on your machine.

The full repository of this tutorial is available here.

Setup

The first thing to do is create a project folder (and the subfolders we will need later). You can do this by running the following commands from a Terminal view.
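A minimal sketch of those commands — the project name compose-tutorial and the subfolder names app and database are my assumptions, based on the paths used later in the tutorial:

```shell
# Create the project root plus the two subfolders used later in the tutorial
mkdir -p compose-tutorial/app compose-tutorial/database
# Work from the project root from now on
cd compose-tutorial
```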

PostgreSQL

PostgreSQL is a free and open-source relational DBMS that is SQL compliant. It features transactions with ACID properties (Atomicity, Consistency, Isolation, Durability). We use it as a replacement for Redis, an in-memory key-value data store with optional durability.

To instantiate a PostgreSQL database through Docker we can use the official image. To do so, I write the following Dockerfile inside the database folder:
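A sketch of that Dockerfile — the credential and database-name values are placeholders of my choosing, not taken from the original repository:

```dockerfile
# Build on the official PostgreSQL image, tag "latest"
FROM postgres:latest

# Credentials and database name (illustrative values; pick your own)
ENV POSTGRES_PASSWORD=secret
ENV POSTGRES_USER=username
ENV POSTGRES_DB=mydb

# SQL files in this folder are executed automatically on first startup
COPY create_fixtures.sql /docker-entrypoint-initdb.d/
```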

In this file we can get the following:

  • FROM: this directive identifies the image from which we want to build the new image. I chose postgres:latest, the official Docker image with the tag latest, which indicates the latest version (13 at the time of writing).
  • ENV: with this directive, we are able to specify various environment variables. For this image I specified POSTGRES_PASSWORD, POSTGRES_USER, POSTGRES_DB .
  • COPY: used to copy the specified file create_fixtures.sql into the folder /docker-entrypoint-initdb.d/ of the created image.

Copying a file into the folder /docker-entrypoint-initdb.d/ is very useful because it lets us run some initialization SQL commands the first time the container starts. In this case, I’ve decided to create a simple table with two fields (see below).
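A sketch of create_fixtures.sql — the article only says "a simple table with two fields", so the table and column names below are my assumptions (chosen to match the Python script later in the tutorial):

```sql
-- create_fixtures.sql: executed once, when the container is first initialized
CREATE TABLE IF NOT EXISTS numbers (
    number INTEGER,
    timestamp TIMESTAMP
);
```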

To try out the database through SQL commands you can run:
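One way to do that is to open an interactive psql session inside the running container — the service name db comes from the docker-compose.yml later in the tutorial, and username/mydb match the ENV values assumed above:

```shell
# Open an interactive psql shell inside the running "db" service
docker-compose exec db psql -U username -d mydb
```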

All of this concludes our configuration of the PostgreSQL database.

Python

Now, it’s time to create the Python script that works together with the database. The created script (inside the app folder) is the following:
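A sketch of that app.py, assuming the numbers table and the credentials used in the database Dockerfile above (all of those names are my assumptions; the loop-guard environment variable is also my own convention, added so the script can be imported without a live database):

```python
# app.py - sketch of the script that talks to the PostgreSQL container
import os
import time
from datetime import datetime

db_user = "username"   # same as POSTGRES_USER in the database Dockerfile
db_pass = "secret"     # same as POSTGRES_PASSWORD
db_name = "mydb"       # same as POSTGRES_DB
db_host = "db"         # the service name from docker-compose.yml (see below)
db_port = 5432         # default PostgreSQL port

# Connection string in the form SQLAlchemy expects for PostgreSQL
conn_string = f"postgresql://{db_user}:{db_pass}@{db_host}:{db_port}/{db_name}"


def get_engine():
    # Imported lazily so the module can be loaded without SQLAlchemy installed
    from sqlalchemy import create_engine
    return create_engine(conn_string)


def add_new_row(n):
    """Save a new number n (with the current timestamp) in the database."""
    from sqlalchemy import text
    with get_engine().begin() as conn:  # begin() commits on exit
        conn.execute(
            text("INSERT INTO numbers (number, timestamp) VALUES (:n, :ts)"),
            {"n": n, "ts": datetime.utcnow()},
        )


def get_last_row():
    """Retrieve the last number inserted in the database."""
    from sqlalchemy import text
    with get_engine().connect() as conn:
        row = conn.execute(
            text("SELECT number FROM numbers ORDER BY timestamp DESC LIMIT 1")
        ).fetchone()
        return row[0] if row else None


def main():
    # Infinite loop: insert a number, then read it back
    i = 0
    while True:
        add_new_row(i)
        print("The last item inserted is:", get_last_row())
        i += 1
        time.sleep(1)


# Guard (my convention): only loop when running inside the container
if os.environ.get("RUN_LOOP") == "1":
    main()
```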

What we have done is:

  • Define the parameters to create the connection string needed by SQLAlchemy, which allows us to connect to PostgreSQL. As you can see, the db_name, db_user, db_pass are the same ones indicated before as environment variables in the PostgreSQL Dockerfile. The db_host variable will be explained later, and db_port is the default PostgreSQL port.
  • Define two functions: add_new_row(n), which saves a new number n in the database, and get_last_row(), which retrieves the last number inserted in the database.
  • In the main section, I simply wrote an infinite loop that adds a new number in the database and then retrieves it.

To make the Python script work, we specified its dependencies in the requirements.txt file.
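A sketch of that file — SQLAlchemy needs a PostgreSQL driver to connect, and psycopg2-binary is a common choice; the exact packages and versions in the original repository may differ:

```text
SQLAlchemy
psycopg2-binary
```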

At this point, to containerize the Python part of this tutorial, we create a Docker image through the Dockerfile inside the app folder.
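A sketch of that Dockerfile — the base-image tag and the working-directory path are my assumptions; the layer ordering follows the caching argument made below:

```dockerfile
# Start from a slim official Python image (tag is an assumption)
FROM python:3.9-slim

# All later COPY/ADD/RUN/CMD directives operate relative to /code
WORKDIR /code

# Copy the requirements first, so this layer stays cached until the file changes
COPY requirements.txt .
RUN pip install -r requirements.txt

# Add the application code last, so code edits don't invalidate the pip cache
ADD . .

# Entry point: executed when the container starts, not while the image is built
CMD ["python", "app.py"]
```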

Some of the directives used this time are the same as before (FROM, COPY), the others are:

  • WORKDIR: Used to specify the working directory (where our COPY/ADD directives copy files when no path is specified)
  • ADD: Similar to the COPY directives (I won’t go into the details of the difference)
  • RUN: Run a command during the building of the image. In this case, we install the libraries specified in requirements.txt (that has already been copied into the image working directory)
  • CMD: Similar to the RUN directive, but this one is executed only when the container starts, not while the image is built. It is the entry point of the image.

With this Dockerfile configuration, we also allow the caching of the requirements. This works because Docker uses its cache as long as the requirements.txt file has not been changed.

Put things together

The final step of the tutorial is the union of the two images that we created. The most elegant way to do that is by creating a docker-compose.yml file in the root of the project.
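A sketch of that docker-compose.yml — the service names app and db come from the text below, while the compose version and the depends_on ordering hint are my additions:

```yaml
version: "3"
services:
  app:
    build: ./app       # image defined in /app/Dockerfile
    depends_on:
      - db             # start the database before the app
  db:
    build: ./database  # image defined in /database/Dockerfile
```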

In this file we declare the two containers that make up our application:

  • app: the one that is defined inside the /app/Dockerfile
  • db: the one in /database/Dockerfile

The service names also explain the db_host = 'db' value inside app.py: Docker manages the networking between the two containers once they are launched, resolving db to the hostname of the database service.

The final command required to make everything run is docker-compose up --build. This builds your images and then starts the containers.

If you have any tips, comments or questions, do not hesitate to ask me in the comments below.
