I'm working on a project that uses Django. I'm a beginner with Django and Docker.
For a feature, I decided to create a task with Celery (the feature consists of creating a matchmaking queue).
But I use Docker to run my app, and I have 4 containers: one for the frontend (Node), the second for Django, the third for my PostgreSQL database, and the last for my Redis server.
And I don't really know whether it's necessary to create another container for Celery. My research turned up nothing about this, but (if I'm not mistaken) in theory it's possible, and even a very good idea, to create a container for Celery; it seems logical.
So I tried to launch my Celery task in the background and then launch my Django server in the same container, like this:
$ celery -A backend worker -l INFO --detach
$ daphne -p 8000 backend.asgi:application
TWO PROBLEMS: we lose the whole point of using Docker by doing this, AND we can't see Celery's logs.
Question: is it "OK" to do that, or is it really better to create another container for Celery, or one just for my task? If so, how can I do this? I really have no idea how to do that or which option is best.
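From what I've read so far, it seems like the usual approach might be to add a second service to docker-compose.yml that reuses the same image as the Django service but overrides `command:` to run the worker instead. Is something like this sketch the right direction? (The service names, build paths, and image tags here are just placeholders I made up, not my actual file.)

```yaml
# Hypothetical docker-compose.yml sketch, not my real config.
services:
  django:
    build: ./backend                    # placeholder path
    command: daphne -b 0.0.0.0 -p 8000 backend.asgi:application
    depends_on:
      - db
      - redis

  celery-worker:
    build: ./backend                    # same image/code as the Django service
    # No --detach: the worker runs in the foreground, so its logs
    # show up in `docker compose logs celery-worker`.
    command: celery -A backend worker -l INFO
    depends_on:
      - redis

  db:
    image: postgres:16                  # placeholder tag

  redis:
    image: redis:7                      # placeholder tag
```

If I understand correctly, this would also fix both of my problems above: each process gets its own container, and Celery's logs stay visible. But please correct me if this is wrong.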
I'm asking for your help, please! If it would help to share more context (my Dockerfiles, the docker-compose.yml, etc.), let me know!
Thanks in advance, everyone 😁!