
Fairly new to Docker, I am trying to add the execution of a custom SQL script (triggers and functions) to Django's migration process, and I am starting to feel a bit lost. What I am trying to achieve follows this pretty clear tutorial, in which migrations are run by an entrypoint script. In the Dockerfile:

# run entrypoint.sh
ENTRYPOINT ["/usr/src/my_app/entrypoint.sh"]

Here is the entrypoint.sh:

#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."

    while ! nc -z $SQL_HOST $SQL_PORT; do
      sleep 0.1
    done

    echo "PostgreSQL started"
fi

# tried several combinations, with and without each of these
python manage.py flush --no-input
python manage.py makemigrations my_app
python manage.py migrate

exec "$@"

So far so good. Turning to the question of integrating the execution of custom SQL scripts into the migration process, most articles I read (this one for instance) recommend creating an empty migration whose operations execute the SQL statements. Here is what I have in my_app/migrations/0001_initial_data.py:

import os
from django.db import migrations, connection

def load_data_from_sql(filename):
    file_path = os.path.join(os.path.dirname(__file__), '../sql/', filename)
    sql_statement = open(file_path).read()
    with connection.cursor() as cursor:
        cursor.execute(sql_statement)

class Migration(migrations.Migration):
    dependencies = [
        ('my_app', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(load_data_from_sql('my_app_base.sql'))
    ]
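One detail that may matter, judging from the traceback further down: in the file above, load_data_from_sql(...) is called while the migration module is being imported, rather than passed to RunPython as a callable. A Django-free sketch of the difference (all names here are stand-ins, not the real Django classes):

```python
calls = []

def load_data_from_sql(filename):
    # stand-in for the real function; just records when it is invoked
    calls.append(filename)

class RunPython:
    # stand-in for migrations.RunPython: it only stores a callable for later
    def __init__(self, code):
        self.code = code

# What the migration above does: the function is CALLED while the module
# (and its Migration class body) is being evaluated, i.e. at import time,
# before any migration has created tables.
RunPython(load_data_from_sql("my_app_base.sql"))
print(calls)  # ['my_app_base.sql'] -- the SQL already "ran"

# What RunPython expects: the function itself. Django invokes it with
# (apps, schema_editor) only when the migration actually executes.
calls.clear()

def forward(apps, schema_editor):
    load_data_from_sql("my_app_base.sql")

op = RunPython(forward)
print(calls)      # [] -- nothing has run yet
op.code(None, None)  # roughly what migrate does, much later
print(calls)      # ['my_app_base.sql']
```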

As stated by its dependencies, this step depends on the initial migration (0001_initial.py):

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Unpayed',
            fields=[
etc etc 

[The Issue] However, even when I run migrations manually (docker-compose exec web python manage.py makemigrations my_app), I get the following error because the database in the postgres container is empty:

File "/usr/src/my_app/my_app/migrations/0001_initial_data.py", line 21, in Migration
    migrations.RunPython(load_data_from_sql('my_app_base.sql'))
  File "/usr/local/lib/python3.7/site-packages/django/db/backends/utils.py", line 82, in _execute
....
    return self.cursor.execute(sql)
    django.db.utils.ProgrammingError: relation "auth_user" does not exist

[What I do not understand] However, when I log into the container, remove 0001_initial_data.py and run ./entrypoint.sh, everything works like a charm and the tables are created. I can then add 0001_initial_data.py back manually, run entrypoint.sh again, and I have my functions. Same when I remove this file before running docker-compose up -d --build: tables are created.

I feel like I am missing some obvious and easier way around integrating SQL scripts into migrations in this canonical way. All I need is for this script to run after the 0001_initial migration is done. How would you do it?

[edit] docker-compose.yml:

version: '3.7'

services:
  web:
    build: 
      context: ./my_app
      dockerfile: Dockerfile
    command: python /usr/src/my_app/manage.py runserver 0.0.0.0:8000
    volumes:
      - ./my_app/:/usr/src/my_app/
    ports:
      - 8000:8000
    environment:
      - SECRET_KEY='o@@xO=jrd=p0^17svmYpw!22-bnm3zz*%y(7=j+p*t%ei-4pi!'
      - SQL_ENGINE=django.db.backends.postgresql
      - SQL_DATABASE=postgres
      - SQL_USER=postgres
      - SQL_PASSWORD=N0tTh3D3favlTpAssw0rd
      - SQL_HOST=db
      - SQL_PORT=5432
    depends_on:
      - db
  db:
    image: postgres:10.5-alpine
    volumes:
      - postgres_data:/var/lib/postgresql/data/ 

volumes:
  postgres_data:

Tags: django 2.2, python 3.7

  • Why are you running makemigrations in your Docker file? Is my_app/migrations/0001_initial.py checked in? Commented Apr 29, 2019 at 15:44
  • "checked in"? You mean, in the container after I docker-compose up? Yes it is. And about why makemigrations is in the entrypoint? To be completely honest with you, since I find this "create an empty migration to execute custom sql during migration" trick pretty confusing, just to figure out whether it changed something. I thought that makemigrations had to be run before migrate Commented Apr 29, 2019 at 16:10
  • „Checked in“ as in „checked in to version control“. Or in other words, does the migration 0001_initial exist before you docker-compose up? Commented Apr 29, 2019 at 16:13
  • Yes it does. It is in my source files before I docker-compose up (if I am getting your point correctly) Commented Apr 29, 2019 at 16:18
  • please share your compose file. also what version of python and django? Commented Apr 29, 2019 at 16:30

2 Answers


I believe the issue has to do with naming the migration file and manually writing your dependencies with the same "0001" prefix. The reason I say this is that for reverse migrations you can simply reference the prefix: e.g., to go from your 7th migration back to your 6th, the command is python manage.py migrate my_app 0006. Either way, I would try deleting the file and creating a new empty migration via python manage.py makemigrations my_app --empty, then moving your code into that file. This should also write the dependencies for you.

The error message, alongside the test you ran by adding the migration file afterwards, is indicative of the issue though: somehow the initial migrations aren't running before the other one. I would also try dropping your DB, as it may have persisted some bad state (./manage.py sqlflush).


3 Comments

I just tried renaming the file to 0002_create_custom_functions.py (manually and with makemigrations --empty), but the problem remains. The only way I could get the tables is by leaving only 0001_initial.py when I build. After each attempt, I drop the DB with docker-compose down -v. I am a bit confused about the flush --no-input and migrate commands in entrypoint.sh. Am I right to assume that all the migrations in /my_app/migrations should be run by migrate?
Can you think of an easier way to execute this script afterwards (outside of the classical migrations process)? The app's container does not have psql but the db's does. Maybe send the SQL script from app to db, log into it and run it (like I would do manually). I noticed that manage.py has some kind of dbshell. Any chance I could run the script through it?
So, a few things. The way I do things is I write a boot script, say boot.sh, and put all the migrate and runserver commands in there. I do not put makemigrations in that boot script, as I run it manually and commit the migration files to the repo. And instead of having all those commands in the Dockerfile, I have it as a compose command: command: bootprod.sh, where bootprod consists of pip install -r requirements.txt, then python3 manage.py migrate && python3 manage.py runserver.

[The easiest way I could find] I simply disentangled Django migrations from the creation of custom functions in the DB. Migrations are run first, so that the tables exist when the functions are created. Here is the entrypoint.sh:

#!/bin/sh

if [ "$DATABASE" = "postgres" ]
then
    echo "Waiting for postgres..."

    while ! nc -z $SQL_HOST $SQL_PORT; do
      sleep 0.1
    done

    echo "PostgreSQL started"
fi

python manage.py flush --no-input
python manage.py migrate

# add custom sql functions to db 
cat my_app/sql/my_app_base.sql | python manage.py dbshell

python manage.py collectstatic --no-input

exec "$@"

Keep in mind that manage.py dbshell requires postgresql-client to run. I just needed to add it to the Dockerfile:

# pull official base image
FROM python:3.7-alpine

...........

# install psycopg2
RUN apk update \
    && apk add --virtual build-deps gcc python3-dev musl-dev \
    && apk add postgresql-dev postgresql-client \
    && pip install psycopg2 \
    && apk del build-deps

