
I have a Django application where I defined a few @task functions in tasks.py to execute as periodic tasks. I'm fairly sure the issue is not caused by tasks.py or any related code, but by some configuration, perhaps in settings.py or in how I run my celery worker.

The task does execute on its periodic schedule, but it runs multiple times each period.

Here are the celery worker logs:

celery -A cimexmonitor worker --loglevel=info -B -c 4

[2019-09-19 21:22:16,360: INFO/ForkPoolWorker-5] Project Monitor Started : APPProject1
[2019-09-19 21:22:16,361: INFO/ForkPoolWorker-4] Project Monitor Started : APPProject1
[2019-09-19 21:25:22,108: INFO/ForkPoolWorker-4] Project Monitor DONE : APPProject1
[2019-09-19 21:25:45,255: INFO/ForkPoolWorker-5] Project Monitor DONE : APPProject1
[2019-09-20 00:22:16,395: INFO/ForkPoolWorker-4] Project Monitor Started : APPProject2
[2019-09-20 00:22:16,398: INFO/ForkPoolWorker-5] Project Monitor Started : APPProject2
[2019-09-20 01:22:11,554: INFO/ForkPoolWorker-5] Project Monitor DONE : APPProject2
[2019-09-20 01:22:12,047: INFO/ForkPoolWorker-4] Project Monitor DONE : APPProject2
  • If you check the time intervals above, tasks.py schedules one task, but two celery workers pick it up and execute the same task at the same interval. I'm not sure why two workers took one task?

  • settings.py

..
..
# Internationalization
# https://docs.djangoproject.com/en/2.1/topics/i18n/

LANGUAGE_CODE = 'en-us'

TIME_ZONE = 'Asia/Kolkata'

USE_I18N = True

USE_L10N = True

USE_TZ = True
..
..
..
######## CELERY : CONFIG
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ENABLE_UTC = True
CELERYBEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'
  • celery.py
from __future__ import absolute_import, unicode_literals
from celery import Celery 
import os
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE','cimexmonitor.settings')
## set the default Django settings module for the 'celery' program.

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.

app = Celery('cimexmonitor')
#app.config_from_object('django.conf:settings', namespace='CELERY') 
app.config_from_object('django.conf:settings')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks(settings.INSTALLED_APPS)

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
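(Aside: since the settings block above uses CELERY_-prefixed keys, Celery's Django guide pairs config_from_object with namespace='CELERY', as in the commented-out line. A minimal sketch of that namespaced variant, assuming the same cimexmonitor project layout:)

```python
# Sketch of the namespaced variant from Celery's Django integration guide;
# with namespace='CELERY', only settings starting with CELERY_ are read.
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'cimexmonitor.settings')

app = Celery('cimexmonitor')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()  # discovers tasks.py in all INSTALLED_APPS
```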
  • Other information:
→ celery --version
4.3.0 (rhubarb)

→ redis-server --version
Redis server v=3.0.6 sha=00000000:0 malloc=jemalloc-3.6.0 bits=64 build=7785291a3d2152db

django-admin-interface==0.9.2
django-celery-beat==1.5.0
  • Please suggest ways to debug this problem:

Thanks

  • Could you check whether your celery.py file is being loaded more than once in your application? Commented Sep 20, 2019 at 14:05
  • Thanks for the response. How do I verify whether celery.py is being loaded multiple times? Commented Sep 20, 2019 at 14:15
  • What command are you using to run celerybeat? Commented Sep 20, 2019 at 15:11
  • celery -A appname beat --loglevel=debug --scheduler django_celery_beat.schedulers:DatabaseScheduler for the scheduler, and celery -A appname worker --loglevel=info -B -c 4 for the worker. Commented Sep 20, 2019 at 16:18

2 Answers


Both the worker and beat services need to be running at the same time to execute periodic tasks, as per https://github.com/celery/django-celery-beat:

  • Worker:
celery -A [project-name] worker --loglevel=info -B -c 5
  • Django scheduler (beat):
celery -A [project-name] beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
  • I was running both the worker (started with -B, which embeds a beat scheduler) and the separate database scheduler at the same time, as the documentation describes. That meant two beat schedulers were queuing every periodic task, so each task ran twice at the same interval; I hadn't realized the -B flag makes the celery worker act as a DB scheduler as well.
  • Running just the celery worker solved my problem.
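To make the fix concrete, here is a sketch of the two mutually exclusive setups (the project name cimexmonitor is taken from the question); pick one, never both, so that exactly one beat scheduler exists:

```shell
# Option 1: one worker with an embedded beat (-B); do NOT start a separate beat.
celery -A cimexmonitor worker --loglevel=info -B -c 4

# Option 2: worker without -B, plus exactly one separate beat process.
celery -A cimexmonitor worker --loglevel=info -c 4
celery -A cimexmonitor beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler

# Either way, a quick check that only one beat is actually running:
pgrep -af celery
```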

1 Comment

Doesn't your solution break Celery Beat?

From the official documentation: Ensuring a task is only executed one at a time.

Also, I hope you are not running multiple workers the same way (celery -A cimexmonitor worker --loglevel=info -B -c 4), as that would mean you have multiple celery beat instances scheduling tasks to run... In short: make sure you only have one celery beat running!
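As a sketch of that documented pattern ("Ensuring a task is only executed one at a time"): the task takes a short-lived lock before running its body, and skips if the lock is held. The recipe in the Celery docs uses Django's atomic cache.add; this standalone example emulates that with a plain dict so the logic is visible without a Django setup:

```python
# Sketch of the lock-guarded task pattern; cache_add/cache_delete stand in
# for Django's cache.add / cache.delete (cache.add is atomic there).
import time

_cache = {}

def cache_add(key, value, timeout):
    """Set key only if absent (or expired); return True if we got the lock."""
    now = time.monotonic()
    entry = _cache.get(key)
    if entry is not None and entry[1] > now:
        return False
    _cache[key] = (value, now + timeout)
    return True

def cache_delete(key):
    _cache.pop(key, None)

LOCK_EXPIRE = 60 * 5  # lock expires after 5 minutes, in case a worker dies

def monitor_project(project):
    """Run the monitor body only if no other worker holds this project's lock."""
    lock_id = f"project-monitor-lock-{project}"
    if not cache_add(lock_id, "locked", LOCK_EXPIRE):
        return "skipped: already running"
    try:
        # ... real monitoring work would go here ...
        return f"Project Monitor DONE : {project}"
    finally:
        cache_delete(lock_id)
```

With two beats scheduling the same task, the second invocation would fail cache_add and return immediately instead of duplicating the work, though fixing the duplicate beat remains the real solution.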

2 Comments

That's a different case.
The second part of my answer looks the same as your accepted answer... We could only guess at the reason for your problem, and it turns out I guessed right...
