
I'm testing Celery in a local environment. My Python file has the following two lines of code:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})

Looking at the console output, they seem to execute one after another in sequence: test2 only runs after test1 has finished. At least that's how it appears from the console output.

These tasks have no dependencies on each other, so I don't want one task waiting for another to complete before moving on to the next line.

How can I execute both tasks at the same time?

---- **** -----
--- * ***  * -- Darwin-14.0.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x104cd8c10
- ** ---------- .> transport:   sqs://123
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
  • stackoverflow.com/questions/15307609/…
  • @ndpu so it's a concurrency issue? I don't have to use parallel tasks here?
  • It states on startup that I have "concurrency: 4 (prefork)".

2 Answers


There are multiple ways to achieve this.

1. Single Worker - Single Queue.

$ celery -A my_app worker -l info  -c 2 -n my_worker

This will start a single worker that executes up to 2 tasks at the same time.

2. Multiple workers - Single Queue.

$ celery -A my_app worker -l info  -c 1 -n my_worker1
$ celery -A my_app worker -l info  -c 1 -n my_worker2

This will start two workers, each executing one task at a time. Note that both workers consume from the same queue.

3. Multiple workers - Multiple Queues.

$ celery -A my_app worker -l info  -c 1 -n my_worker1 -Q queue1
$ celery -A my_app worker -l info  -c 1 -n my_worker2 -Q queue2

This will start two workers, each executing one task at a time. But here you have to route the tasks accordingly.

celery_app.send_task('tasks.test1', args=[self.id], kwargs={}, queue='queue1')
celery_app.send_task('tasks.test2', args=[self.id], kwargs={}, queue='queue2')
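As a sketch of an alternative (assuming a standard Celery app configuration), the routing can also live in the app config via the `task_routes` setting, so the `send_task` calls don't need an explicit `queue=` argument on every call:

```python
# Route tasks to queues by task name instead of per-call queue= arguments.
task_routes = {
    'tasks.test1': {'queue': 'queue1'},
    'tasks.test2': {'queue': 'queue2'},
}
# Applied on the app it would look like:
# celery_app.conf.task_routes = task_routes
```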

4. Single worker - All Queues

$ celery -A my_app worker -l info -n my_worker1 

If you don't pass a -Q option, the worker consumes from the queues configured in the app (the task_queues setting) by default.


5 Comments

@user1012513 You can run celery -A my_app worker -l info -c 1 -n my_worker1 -Q queue1,queue2,queue3
If you do not pass the -Q flag, the worker by default reads from all the configured queues.
@AdityaNagesh Yes, you are right. Updated in answer.
@ChillarAnand, in my case, even when I use -c 2 in the celery worker command, one task is blocking another task. The command I am using is celery worker -A celery_app -P gevent -l info -c 2. Can you let me know what else I should change so that both tasks run in parallel?
@sattva_venu did you find a solution? I'm having the same issue

Call the worker with the --autoscale option, which scales the number of processes up and down as required.

--autoscale AUTOSCALE
                       Enable autoscaling by providing max_concurrency,
                       min_concurrency. Example:: --autoscale=10,3 (always
                       keep 3 processes, but grow to 10 if necessary)

Example:

celery -A sandbox worker --autoscale=10,0 --loglevel=info 

2 Comments

Does this work on Windows? I gave it a try, but one Celery task is still blocking another.
@sattva_venu Sorry, I don't use Windows; no idea.
