
I followed a Docker + Django tutorial which was great, in that I could successfully build and run the website following the instructions. However, I can't for the life of me figure out how to successfully run a database migration after changing a model.

Here are the steps I've taken:

  1. Cloned the associated git repo
  2. Set up a virtual machine called dev

    • with docker-machine create -d virtualbox dev
    • and pointed to it with eval $(docker-machine env dev)
  3. Built and started it up with:

    • docker-compose build
    • and docker-compose up -d
  4. Ran the initial migration (the only time I've been able to run a migration that appears successful):

    • docker-compose run web python manage.py migrate
  5. Checked that the website works by navigating to the IP address returned by:

    • docker-machine ip dev
  6. Made a change to a model. I just added this to the Item model in the web/docker_django/apps/todo/models.py file:

    • name = models.CharField(default='Unnamed', max_length=50, null=False)
  7. Updated the image and restarted the containers with:

    • docker-compose down --volumes
    • then docker-compose build
    • then docker-compose up --force-recreate -d

Migration attempt number 1:

I used:

docker-compose run web python manage.py makemigrations todo

Then:

docker-compose run web python manage.py migrate

After the makemigrations command, it said:

Migrations for 'todo':
  0001_initial.py:
    - Create model Item

When I ran the migrate command, it gave the following message:

Operations to perform: 
  Synchronize unmigrated apps: messages, todo, staticfiles 
  Apply all migrations: contenttypes, admin, auth, sessions 
Synchronizing apps without migrations: 
  Creating tables... 
    Running deferred SQL... 
  Installing custom SQL... 
Running migrations: 
  No migrations to apply. 

So that didn't work.

Migration attempt number 2:

This time I tried running the migrations directly inside the running web container. That looked like this:

(macbook)$ docker exec -it dockerizingdjango_web_1 bash
root@38f9381f179b:/usr/src/app# ls
Dockerfile  docker_django  manage.py  requirements.txt  static  tests
root@38f9381f179b:/usr/src/app# python manage.py makemigrations todo
Migrations for 'todo':
  0001_initial.py:
    - Create model Item
root@38f9381f179b:/usr/src/app# python manage.py migrate
Operations to perform:
  Synchronize unmigrated apps: staticfiles, messages
  Apply all migrations: contenttypes, todo, admin, auth, sessions
Synchronizing apps without migrations:
  Creating tables...
    Running deferred SQL...
  Installing custom SQL...
Running migrations:
  Rendering model states... DONE
  Applying todo.0001_initial...Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/django/db/backends/utils.py", line 62, in execute
    return self.cursor.execute(sql)
psycopg2.ProgrammingError: relation "todo_item" already exists

Moreover, I couldn't find any migrations folders in that container.

I clearly have very little idea what's happening under the hood here, so if someone could show me how to successfully change models and run database migrations I would much appreciate it. Bonus points if you can help me conceptualize what's happening where when I run these commands that have to get the web and postgres images to work together.

EDIT: What worked for me

@MazelTov's suggestions will all be helpful for automating the process as I get more used to developing with Docker, but the thing I was missing, which @MazelTov filled me in on in a very helpful discussion, was mounting a volume so that the generated migration files show up on my local machine.

So basically, my Migration Attempt 1 would have worked just fine if instead of, for example:

docker-compose run web python manage.py makemigrations todo

...I used:

docker-compose run --service-ports -v $(pwd)/web:/usr/src/app web python manage.py makemigrations todo
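
To make that permanent (this is just my understanding from the discussion, not something from the tutorial), I believe the same bind mount can be declared in docker-compose.yml, so that anything makemigrations writes inside the container lands straight back in my local web/ folder. A minimal sketch, with the service name and container path taken from the command above and the build context assumed from this project's layout:

version: '3.3'

services:
  web:
    build: ./web
    # bind-mount the host source tree over the code baked into the image,
    # so migration files created in the container appear in ./web on the host
    # (build context and paths are assumptions based on this project's layout)
    volumes:
      - ./web:/usr/src/app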

2 Answers


There are many ways to achieve this.

1) Run ./manage.py migrate before you start your app (uwsgi, runserver, ...) in a bash entrypoint script

Dockerfile

FROM debian:latest

...

# entrypoint must be an executable file (chmod +x entrypoint.sh)
COPY entrypoint.sh /home/docker/entrypoint.sh

# what happens when I start the container
CMD ["/home/docker/entrypoint.sh"]

entrypoint.sh

#!/bin/bash

./manage.py collectstatic --noinput
# I commit my migration files to git, so I don't need to run makemigrations on the server
# ./manage.py makemigrations app_name
./manage.py migrate

# start nginx and uwsgi via supervisor
supervisord -c /etc/supervisor/supervisord.conf -n
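
With this setup the migrate runs automatically every time the container starts, so after docker-compose up there is no separate migration step to remember.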

2) If you have a lot of migration files and you don't want any downtime, you could run the migrate command from a separate docker-compose service

docker-compose.yml

version: '3.3'

services:  

  # starts the supervisor (uwsgi + nginx)
  web:
    build: .
    ports: ["80:80"]

  # this service will use the same image, and once the migration is done it will be stopped
  web_migrations:
    build: .
    command: ./manage.py migrate
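
With a compose file like this, docker-compose up starts both services and the web_migrations container exits once migrate finishes. You can also re-run it on demand, for example:

docker-compose run --rm web_migrations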

7 Comments

Thanks for your response! I can't get any of your suggestions to work though... Would you be willing to help me out in a chatroom? Either way, I'll try to describe what I'm running into. First, committing migration files sounds like a great idea, but I don't know how to get migration files that I generate in a docker container back out to my local files. Also, putting makemigrations or migrate in the entrypoint doesn't seem to be doing anything... the second method of running migrations from a separate container complained about not finding manage.py...
The second approach requires having WORKDIR in the Dockerfile set to the folder where your project (the manage.py file) is, because if you don't set WORKDIR in the Dockerfile you will most likely run the command in the root /
OK, I got something like the 2nd approach to partially work if I'm building from scratch on a fresh virtual machine, but then it still fails after I make any changes to my models. I think the main problem is that I don't know how to have migrations stick around after creation. I used the command bash -c "python manage.py makemigrations todo && python manage.py migrate" to do both commands within the docker container. When I stopped those containers, made a change to the model, and ran build and up again, it made another 0001_initial.py migration file and said "no migrations to apply"
So along those lines, do you know how I could use my docker setup to run makemigrations in a way that leaves the migration files in my host directory so they get copied to the docker container every time I rebuild? I would just make a local conda environment and install postgres locally and hope that is close enough to the docker environment to get everything right, but that would kind of defeat the point of having reproducible environments via docker
This is a great example of why you should build images on a remote machine (for example, GitLab CI), because you would discover that all necessary files have to be committed to git... To your question: when you build the image, I believe you copy the code inside the image, so why don't you run makemigrations there?

I solved this by doing:

docker-compose exec web /usr/local/bin/python manage.py makemigrations todo

and then:

docker-compose exec web /usr/local/bin/python manage.py migrate 

I got it from this issue.
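
Note that docker-compose exec runs the command inside the already-running web container, whereas docker-compose run starts a new one-off container from the image, which is why migration files created with run can disappear unless the source directory is mounted as described in the question's edit.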

