
I have divided Celery into the following parts:

  1. Celery
  2. Celery worker
  3. Celery daemon
  4. Broker: RabbitMQ or SQS
  5. Queue
  6. Result backend
  7. Celery monitor (Flower)

My Understanding

  1. When I call a Celery task in Django, e.g. tasks.add(1, 2), Celery adds that task to a queue. I am confused whether that is item 4 or item 5 in the list above.
  2. When the task goes to the queue, a worker picks it up and removes it from the queue.
  3. The result of that task is saved in the result backend.
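The flow described above (task → queue → worker → result backend) can be sketched with plain Python stdlib stand-ins for the real components. This is only an analogy to make the roles concrete, not Celery's actual implementation:

```python
import queue

# Stand-ins for the real components (illustrative only):
broker_queue = queue.Queue()   # RabbitMQ/SQS holds this queue (items 4 and 5)
result_backend = {}            # e.g. Redis or a database (item 6)

def add(x, y):                 # the task body, like tasks.add in Django
    return x + y

# 1. Calling the task publishes a message onto the broker's queue.
broker_queue.put(("task-id-1", add, (1, 2)))

# 2. A worker pulls the message off the queue (removing it)...
task_id, func, args = broker_queue.get()

# 3. ...executes it, and stores the result in the result backend.
result_backend[task_id] = func(*args)

print(result_backend["task-id-1"])  # 3
print(broker_queue.empty())         # True: the task is gone from the queue
```

So item 4 (the broker) is the server that *stores* item 5 (the queue); they are two views of the same pipeline stage.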

My Confusions

  1. What is the difference between the Celery daemon and a Celery worker?
  2. Is RabbitMQ doing the work of the queue? Does that mean tasks get saved in RabbitMQ or SQS?
  3. What does Flower do? Does it monitor workers, tasks, queues, or results?

2 Answers


First, just to explain briefly how it works. You have a Celery client running in your code. You call tasks.add(1, 2) and a new Celery task is created. That task is transferred by the broker to the queue. Yes, the queue is persisted in RabbitMQ or SQS. The Celery daemon is always running and listening for new tasks. When there is a new task in the queue, it starts a new Celery worker to perform the work.

To answer your questions:

  1. The Celery daemon is always running, and it starts Celery workers.

  2. Yes, RabbitMQ or SQS is doing the work of a queue.

  3. With the Celery monitor you can see how many tasks are running, how many have completed, the size of the queue, etc.
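To make this concrete, a minimal Celery setup with RabbitMQ as the broker and a result backend might look like the sketch below. The broker URL, backend URL, and module name are illustrative assumptions, not taken from the question:

```python
# tasks.py -- a minimal, illustrative Celery setup (URLs/names are examples)
from celery import Celery

# The broker URL points at RabbitMQ, which stores the queue;
# the backend setting is where task results are saved.
app = Celery(
    "tasks",
    broker="amqp://guest:guest@localhost:5672//",
    backend="rpc://",
)

@app.task
def add(x, y):
    return x + y

# Calling add.delay(1, 2) from Django publishes a message to the broker;
# a worker started with `celery -A tasks worker` consumes it, runs the
# task, and stores the result in the backend.
```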




I think the answer from nstoitsev has good intentions but creates some confusion, so let's try to clarify a bit.

  • A Celery worker is the Celery process responsible for executing tasks; when configured to run in the background it is often called a Celery daemon, so you can consider the two the same thing. To clear up the confusion in nstoitsev's answer: each worker has a concurrency parameter that can be greater than 1. In that case the Celery worker creates up to N child processes (up to the concurrency parameter) to execute tasks in parallel; these children are often also called workers.
  • The broker holds queues and exchanges. This means a Celery worker can connect to the broker using a protocol called AMQP and publish or consume messages.
  • Flower monitors a Celery cluster through the broker itself: it receives events from all the workers. Flower also works if the result backend is disabled, which, by the way, is Celery's default behavior.
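The concurrency point above can be illustrated with the stdlib: one parent "worker" fans work out to N children, similar in spirit to starting a Celery worker with --concurrency=N. Celery's default pool forks child processes; threads are used here only to keep the demo self-contained, and the task function is a made-up example:

```python
from concurrent.futures import ThreadPoolExecutor

def add(x, y):
    # the task body each child worker executes
    return x + y

# One "worker" with concurrency 4: up to 4 tasks execute in parallel,
# roughly analogous to `celery -A tasks worker --concurrency=4`.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(add, [1, 2, 3], [10, 20, 30]))

print(results)  # [11, 22, 33]
```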

Hope this helps.

2 Comments

Is it possible to manually retry a task that has failed, or to move that failed task to another queue to reschedule it later?
Hi, that would be another question, and Stack Overflow's rules are clear about not using comments as a chat. Anyway, this doc page could help with your question: docs.celeryproject.org/en/latest/userguide/…
