
I'm running django-celery 3.1.16, Celery 3.1.17, and Django 1.4.16. I'm trying to run some parallel tasks using 3 workers and collect the results using the following:

from celery import group

positions = []
jobs = group(celery_calculate_something.s(data.id) for data in a_very_big_list)
results = jobs.apply_async()
positions.extend(results.get())

The task celery_calculate_something returns an object to place in the results list:

@app.task(ignore_result=False)
def celery_calculate_something(id):
    <do stuff>

No matter what I try, I always get the same result when calling get() on results:

No result backend configured.  Please see the documentation for more information.

However, the results backend IS configured - I have many other tasks with ignore_result=False merrily adding to the task meta table in django_celery. It seems to be something to do with using the results returned from group(). I should note the backend is not set explicitly in settings - it seems django-celery configures it automatically for you.
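For reference, pinning the backend explicitly in settings.py would look something like this (a sketch; I'm relying on django-celery's database backend path rather than the automatic configuration):

```python
# settings.py (sketch): pin the result backend explicitly instead of
# relying on django-celery's automatic configuration.
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
CELERY_IGNORE_RESULT = False  # store results by default
```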

I have the worker collecting events using:

manage.py celery worker -l info -E

and celerycam running with

python manage.py celerycam

Inspecting the results object returned (an instance of GroupResult) I can see that the backend attr is an instance of DisabledBackend. Is this the problem? What have I misunderstood?

1 Answer


You did not configure the results backend, so you need tables to store the results. Since you have django-celery, add djcelery to INSTALLED_APPS in your settings.py file and then create its tables (python manage.py syncdb on Django 1.4; migrate on newer versions). After that, open your celery.py file and set the backend to djcelery.backends.database:DatabaseBackend.
Here's an example:

app = Celery('almanet',
             broker='amqp://guest@localhost//',
             backend='djcelery.backends.database:DatabaseBackend',
             include=['alm_crm.tasks'])  # references your tasks; don't forget the full absolute path
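The corresponding settings.py side might look like this (a sketch; only djcelery itself is required here, the other app names are placeholders matching the example above):

```python
# settings.py (sketch)
INSTALLED_APPS = (
    'django.contrib.contenttypes',
    'django.contrib.auth',
    'djcelery',   # provides the celery task meta / taskset result tables
    'alm_crm',    # placeholder: your own app containing the tasks
)
```

Once djcelery is listed, syncdb/migrate creates the tables the database backend writes results into.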

After that you can import the result module (from celery import result). Now you can save the group result and restore it later by its id:

from celery import group, result

positions = []
jobs = group(celery_calculate_something.s(data.id) for data in a_very_big_list)
results = jobs.apply_async()
results.save()
some_task_result = result.GroupResult.restore(results.id)
print(some_task_result.ready())
