
In settings.py:

CELERY_TIMEZONE = 'Europe/Minsk'
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL')
# the result backend reuses the broker URL, so results go to the same Redis
CELERY_RESULT_BACKEND = os.environ.get('CELERY_BROKER_URL')

where the CELERY_BROKER_URL environment variable is set to:

CELERY_BROKER_URL=redis://redis:6379

config/celery.py:

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings')

app = Celery('config')

app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()

app.conf.beat_schedule = {
    'pulling-games-to-database': {
        'task': 'gamehub.tasks.pull_games',
        'schedule': 604800.0,
    }
}
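As a side note, `schedule` also accepts a `datetime.timedelta` (or a `celery.schedules.crontab`), which reads more clearly than raw seconds. A small standard-library sketch, just to confirm that the intent of `604800.0` is a weekly run:

```python
from datetime import timedelta

# one week, expressed explicitly rather than as 604800.0
week = timedelta(weeks=1)

# timedelta(weeks=1) is exactly 604800 seconds, matching the schedule above
print(week.total_seconds())
```

So `'schedule': timedelta(weeks=1)` would be an equivalent, self-documenting alternative.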

docker-compose.yml:

version: '3'


services:
  db:
    build:
      context: ./docker/postgres
      dockerfile: Dockerfile
    env_file:
      - ./.env.db
    volumes:
      - ./docker/postgres/init.sql:/docker-entrypoint-initdb.d/init.sql
    restart: always
    ports:
      - '5432:5432'

  redis:
    image: redis
    ports:
      - '6379:6379'

  celery:
    build: .
    command: celery -A config worker -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis

  celery-beat:
    build: .
    command: celery -A config beat -l info
    volumes:
      - .:/code
    depends_on:
      - db
      - redis

  app:
    build:
      context: ./
      dockerfile: Dockerfile
    env_file:
      - ./.env
    volumes:
      - ./:/usr/src/app
    depends_on:
      - db
      - redis
    ports:
      - '8000:8000'
    restart: always

  nginx:
    build:
      context: ./docker/nginx
      dockerfile: Dockerfile
    depends_on:
      - app
      - db
    ports:
      - '80:80'

When I run this with

sudo docker-compose build --no-cache
sudo docker-compose up

I do not see any errors, but I also do not see any Celery output. My task is supposed to put data into the database periodically, and that data should then appear on the main page, but it does not. I am fairly sure the database connection works, because other functions work fine. If you need anything else from my project, let me know.
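For anyone debugging a similar setup, a couple of commands help confirm whether the worker and beat containers are actually running and have registered the task (service names assumed to match the compose file above):

```shell
# tail the worker and beat logs
docker-compose logs -f celery celery-beat

# ask the running worker which tasks it has registered
docker-compose exec celery celery -A config inspect registered
```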

  • app, celery and celery-beat are all using the same code and docker image but they have different volumes? Commented Oct 12, 2021 at 17:54
  • @IainShelvington Yeah, I already noticed this, but refactoring it did not solve my problem :( Commented Oct 12, 2021 at 18:13
  • Please provide enough code so others can better understand or reproduce the problem. Commented Oct 14, 2021 at 7:48

1 Answer


Run the worker, the beat scheduler, and Flower as separate services, each launched by its own start script baked into the image.

docker-compose.yml:

  celeryworker:
    image: celeryworker
    ports: []
    command: /start-celeryworker

  celerybeat:
    image: celerybeat
    ports: []
    command: /start-celerybeat

  flower:
    image: flower
    ports:
      - "5545:5545"
    command: /start-flower

Dockerfile:

COPY ./compose/local/celery/worker/start /start-celeryworker
RUN sed -i 's/\r//' /start-celeryworker
RUN chmod +x /start-celeryworker

COPY ./compose/local/celery/beat/start /start-celerybeat
RUN sed -i 's/\r//' /start-celerybeat
RUN chmod +x /start-celerybeat

COPY ./compose/local/celery/flower/start /start-flower
RUN sed -i 's/\r//' /start-flower
RUN chmod +x /start-flower
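The start scripts themselves are not shown above; a minimal sketch of what `/start-celeryworker` might contain (the project module `config` is taken from the question, everything else is an assumption):

```shell
#!/bin/bash
# hypothetical start script for the worker container
set -o errexit
set -o nounset

celery -A config worker -l info
```

The beat and Flower scripts would be analogous, swapping `worker` for `beat` or `flower`.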

2 Comments

  • What is `<<: *django`? It causes `ERROR: yaml.composer.ComposerError: found undefined alias 'django'`
  • Just remove it or replace it with your app.
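For context on that error: `<<: *django` is YAML merge-key syntax that copies the keys of an anchor named `django`, and it only works if such an anchor is defined somewhere in the same file. A hypothetical sketch:

```yaml
services:
  app: &django          # defines the anchor
    build: .
    env_file: .env

  celeryworker:
    <<: *django         # merges the keys of the anchored mapping
    command: /start-celeryworker
```

If no `&django` anchor exists in your compose file, removing the `<<: *django` line (or replacing it with your app's actual build/env keys) resolves the error, as the comment above suggests.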
