
In my Django application I am using Celery. In a post_save signal, I update the Elasticsearch index. But for some reason the task hangs and never actually executes the code:

What I use to run celery:

celery -A collegeapp worker -l info

The Signal:

@receiver(post_save, sender=University)
def university_saved(sender, instance, created, **kwargs):
    """
    University save signal
    """
    print('calling celery task')
    update_university_index.delay(instance.id)
    print('finished')

The task:

@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(instance_id):
    print('updating university index')

The only output I get is calling celery task. Even after waiting over 30 minutes, it never reaches the other print statements and the view continues to wait. Nothing ever shows in the Celery terminal.

Versions: Django 3.0, Celery 4.3, Redis 5.0.9, Ubuntu 18

UPDATE: after doing some testing, using the debug_task defined inside the celery.py file in place of update_university_index does not hang; it behaves as expected. I thought it might be app.task vs. the task decorator, but it seems that's not it.

@app.task(bind=True)
def debug_task(text, second_value):
    print('printing  debug_task {} {}'.format(text, second_value))
  • What's your celery version and OS? Commented Jun 28, 2020 at 15:56
  • 4.3 and ubuntu. I'll include them in the question. Commented Jun 28, 2020 at 16:04
  • I assume you have the worker running in parallel? Commented Jun 28, 2020 at 16:11
  • I'm not entirely sure what you mean by that. Commented Jun 28, 2020 at 16:14
  • Sorry, you have your application running and your worker on a separate terminal at the same time? Commented Jun 28, 2020 at 17:02

2 Answers


I'm still not sure why it doesn't work, but I found a solution by replacing task with app.task.

Importing app from my celery.py seems to have resolved the issue.

from collegeapp.celery import app

@app.task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(self, instance_id):
    print('updating university index')

This happened to me once; I had made the dumbest error. Django expects Celery tasks to be defined in a tasks.py file and uses that file for task discovery. After fixing that, it worked. Could you provide more insight into the directory structure, e.g. using the tree command?

This tutorial is for Flask, but the same can be achieved in Django. Where this particular tutorial shines is that after you tell Celery to execute a task, it gives you a UUID, and you can ping a URL with that UUID to monitor the progress of the task you triggered.

Verify that the tasks have been registered with Celery (do make sure that Celery is running):

from celery.task.control import inspect
i = inspect()
i.registered_tasks()

Or from bash:

$ celery inspect registered
$ celery -A collegeapp inspect registered

From https://docs.celeryproject.org/en/latest/faq.html#the-worker-isn-t-doing-anything-just-hanging

Why is Task.delay/apply*/the worker just hanging?

Answer: There’s a bug in some AMQP clients that’ll make it hang if it’s not able to authenticate the current user, the password doesn’t match or the user doesn’t have access to the virtual host specified. Be sure to check your broker logs (for RabbitMQ that’s /var/log/rabbitmq/rabbit.log on most systems), it usually contains a message describing the reason.

Change this line

@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(instance_id):
    print('updating university index')

To

@task(name="update_university_index", bind=True, default_retry_delay=5, max_retries=1, acks_late=True)
def update_university_index(self, instance_id):
    print('updating university index')

That is, add self as the first parameter of the task definition: bind=True makes Celery pass the task instance as the first argument.

16 Comments

  • I'll look into the link. When I initially run Celery, a list of all the tasks is printed. The task is there, so I do think it's being discovered. The tree structure is: collegeapp/university/tasks.py
  • Could you verify by using the following while Celery is running? I've updated the snippet to achieve that.
  • BTW, are you using Docker for orchestration? Or running Docker and Celery separately?
  • I tried, but it seems that celery.task does not have control. Maybe you're using an older version and it got deprecated?
  • I am not using Docker.
