I want to deploy my little web application with Docker on DigitalOcean. The problem is that I didn't separate Flask and Celery, so everything runs in one container.
Every time I run docker-compose up, the web container crashes ("exited with code 0").
The project structure looks as follows:
.
├── docker-compose.yml
├── nginx
│   ├── dockerfile
│   └── nginx.conf
└── web
    ├── app.ini
    ├── bot
    │   ├── __init__.py
    │   ├── main.py
    │   ├── __pycache__
    │   └── timer.py
    ├── celeryd.pid
    ├── config.py
    ├── dockerfile
    ├── dockerignore
    ├── flask_app.py
    ├── __init__.py
    ├── requirements.txt
    ├── tasks.py
    └── webapp
        ├── bot
        ├── __init__.py
        ├── models.py
        ├── __pycache__
        ├── static
        ├── templates
        └── users
This is the docker-compose.yml:
version: "3.7"
services:
rabbitmq:
image: rabbitmq:3-management
hostname: rabbitmq
container_name: rabbitmq
environment:
RABBITMQ_ERLANG_COOKIE: 'SWQOKODSQALRPCLNMEQG'
RABBITMQ_DEFAULT_USER: 'user123'
RABBITMQ_DEFAULT_PASS: 'password123'
RABBITMQ_DEFAULT_VHOST: '/webapp'
ports:
- "15672:15672"
- "5672:5672"
volumes:
- rabbitmq_storage:/var/lib/rabbitmq
networks:
netzwerk123:
aliases:
- rabbitmq
db:
image: mariadb
restart: always
container_name: db
environment:
MYSQL_ROOT_PASSWORD: 'password123'
MYSQL_USER: 'user123'
MYSQL_PASSWORD: 'password123'
MYSQL_DATABASE: 'webapp'
ports:
- "3306:3306"
volumes:
- db_storage:/var/lib/mysql
networks:
netzwerk123:
aliases:
- db
nginx:
build: ./nginx
container_name: nginx
restart: always
ports:
- "80:80"
networks:
netzwerk123:
aliases:
- ngnix
web:
build: ./web
container_name: web
hostname: web
restart: always
environment:
FLASK_DEBUG: 'True'
FLASK_APP: 'flask_app.py'
command: celery -A tasks.celery worker --loglevel=info --detach
expose:
- 8080
networks:
netzwerk123:
aliases:
- web
depends_on:
- nginx
- db
- rabbitmq
links:
- rabbitmq
tty: true
stdin_open: true
adminer: #only for development purposes
image: adminer
container_name: adminer
restart: always
ports:
- "8081:8080"
depends_on:
- db
networks:
netzwerk123:
aliases:
- adminer
volumes:
db_storage:
rabbitmq_storage:
networks:
netzwerk123:
If I comment out command: celery -A tasks.celery worker --loglevel=info --detach, everything runs just fine, since Celery simply isn't started: the web page is fully functional and the database is reachable. Everything works except Celery.
As soon as I uncomment the command line, Docker reports "web exited with code 0".
I guess this is because Docker stops the container once the command has finished successfully: with --detach the Celery worker daemonizes itself, so the container's main process returns immediately. So there are two problems I face:
1) How can I run Celery in the background inside this container without the container exiting with code 0?
2) How can I run Celery AND Flask in parallel in the same container?
-> I already tried the following, but got the same error:
command: bash -c "uwsgi app.ini && celery -A tasks.celery worker --loglevel=info --detach"
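What I have in mind is something roughly like the sketch below: put the worker in the background with a single & (instead of chaining with &&, which only starts Celery after uwsgi has exited) and drop --detach so uwsgi stays in the foreground as the container's main process. This is untested and assumes app.ini does not daemonize uwsgi:

# untested sketch for the web service in docker-compose.yml:
# "&" backgrounds the worker; uwsgi then runs in the foreground
# and keeps the container alive
command: bash -c "celery -A tasks.celery worker --loglevel=info & uwsgi app.ini"

Would that be the correct approach, or is there a cleaner way to keep both processes running?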
Any help is highly appreciated :)
EDIT: The reason I put Celery and Flask in one container is my tasks.py. In it I import all SQLAlchemy models from web.webapp.models, so the worker needs access to the Flask project directory, and currently I don't see how to split this into two containers.
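To illustrate the coupling, tasks.py looks roughly like this (heavily simplified: User is just a placeholder model name, example_task a placeholder task, and I'm assuming flask_app.py exposes the Flask instance as app; the broker URL matches the RabbitMQ credentials from the compose file):

from celery import Celery
from flask_app import app            # Flask instance (assumption: exposed as "app")
from webapp.models import User       # placeholder model name -- this import is the coupling

# broker URL assembled from the RabbitMQ settings in docker-compose.yml
celery = Celery(app.import_name, broker='amqp://user123:password123@rabbitmq/webapp')

@celery.task
def example_task(user_id):
    # the tasks work with the SQLAlchemy models, so the worker
    # needs the whole Flask project importable
    with app.app_context():
        return User.query.get(user_id) is not None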