
I am using Celery for a POC. My objective is to create a workflow across a distributed system.

For the purpose of the POC I created a couple of Docker containers, each running a worker started with the queue (-Q) option, and I initiated a chain from yet another container. The chain executes successfully across the containers. I then decided to enable the result_backend to store the results of each executed task.
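
Roughly, the setup looks like the sketch below (the broker URL, task names, and queue names are illustrative placeholders, not the actual ones from the POC):

from celery import Celery, chain

app = Celery('poc', broker='amqp://guest@rabbitmq//')

@app.task
def step_one(x):
    # runs on the worker consuming queue_a
    return x + 1

@app.task
def step_two(x):
    # runs on the worker consuming queue_b, receiving step_one's result
    return x * 2

# Each container runs a worker bound to its own queue, e.g.:
#   celery -A poc worker -Q queue_a
#   celery -A poc worker -Q queue_b
# The chain is kicked off from a third container:
result = chain(
    step_one.s(1).set(queue='queue_a'),
    step_two.s().set(queue='queue_b'),
)()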

I set the result_backend to PostgreSQL with the schema option. After executing the chain I do see the tables created in Postgres. However, some columns in the task_meta table are null (e.g. worker and queue), whereas task_id and status are correctly populated.
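
The result-backend configuration looks roughly like this (the connection string and schema name are placeholders for the actual values):

from celery import Celery

app = Celery('poc', broker='amqp://guest@rabbitmq//')
app.conf.update(
    result_backend='db+postgresql://user:password@pg-host/celery_db',
    # the database backend can place its tables in a dedicated schema
    database_table_schemas={
        'task': 'celery',
        'group': 'celery',
    },
)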

Has anyone faced a similar issue? Any help is appreciated.

  • Could it be that there were no workers to receive the task, and it still got the default values? Commented Aug 27, 2020 at 14:52
  • Actually both tasks in the chain got executed, with the result of one passed to the other. I was unable to check the values of the args column in the DB since it is a bytea type. Commented Aug 27, 2020 at 15:02

1 Answer


I know it has been a long time since you asked the question but I hope it will help others.

By default, Celery does not write all task result attributes to the backend. You have to enable this by setting result_extended to True, as stated here: https://docs.celeryproject.org/en/stable/userguide/configuration.html#result-extended

So if you configure your app from python you will have to do something like:

from celery import Celery

app = Celery(
    broker=broker_url,
    # e.g. "db+postgresql://user:password@pg-host/celery_db"
    backend=postgresql_url,
)

# store name, args, kwargs, worker, retries and queue alongside the result
app.conf.update(result_extended=True)
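
With result_extended enabled, the extra columns (name, args, kwargs, worker, retries, queue) get populated in the task_meta table. If I'm not mistaken, they can also be read back through AsyncResult (Celery 4.4+) instead of decoding the bytea columns by hand; the task id below is a placeholder:

from celery.result import AsyncResult

res = AsyncResult('some-task-id', app=app)
print(res.status, res.worker, res.queue, res.args)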
