
I have a Python big-data project in which I run my tasks using Celery with a Redis broker. The thing is, I need three different queues for the three different tasks I submit to Celery.

This is the configuration I've made to run the three tasks at once, but they all use a single queue and run one after the other, so it takes a long time.

from celery import Celery
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

app = Celery("tasks", broker="redis://localhost:6379")

@app.task()
def main(capital, gamma, alpha):
    ...  # task body omitted

@app.task()
def gain(capital, gamma, alpha):
    ...  # task body omitted

@app.task()
def lain(capital, gamma, alpha):
    ...  # task body omitted

And to start the Celery app I'm using these lines of code:

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker1@%h" , shell=True)

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker2@%h" , shell=True)

("celery -A task worker --loglevel=info -P eventlet --concurrency=10 -n worker3@%h" , shell=True)

This code runs my app with the three tasks, but I need three independent queues, one for each task, so that all three tasks can run concurrently, in parallel.

This is what my Celery app looks like with the three tasks, but it's using only one queue, the default queue named celery.

 -------------- worker1@EC2AMAZ-RTM8UD8 v5.2.3 (dawn-chorus)
--- ***** -----
-- ******* ---- Windows-10-10.0.20348-SP0 2022-03-10 12:59:07
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x248adf81e80
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 10 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . task.gain
  . task.lain
  . task.main

[2022-03-10 12:59:07,461: INFO/MainProcess] Connected to redis://localhost:6379//
[2022-03-10 12:59:07,477: INFO/MainProcess] mingle: searching for neighbors
[2022-03-10 12:59:08,493: INFO/MainProcess] mingle: all alone
[2022-03-10 12:59:08,493: INFO/MainProcess] pidbox: Connected to redis://localhost:6379//.
[2022-03-10 12:59:08,493: INFO/MainProcess] worker1@EC2AMAZ-RTM8UD8 ready.

So please, can anyone help me define three different queues for the three different tasks in my Celery app? Any help would be very much appreciated :)

1 Answer

It should be as simple as adding -Q <queue name> to those three Celery workers that you run.
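As a sketch of that suggestion, the three worker commands from the question could each be given their own queue with `-Q`. The queue names below (`main_queue`, `gain_queue`, `lain_queue`) are illustrative, not from the question:

```python
import subprocess

# Illustrative queue names; each worker will consume only its own queue via -Q
queues = ["main_queue", "gain_queue", "lain_queue"]

commands = [
    f"celery -A task worker --loglevel=info -P eventlet --concurrency=10 "
    f"-Q {queue} -n worker{i}@%h"
    for i, queue in enumerate(queues, start=1)
]

# Launch the workers (commented out here; requires Celery and a running Redis):
# for cmd in commands:
#     subprocess.Popen(cmd, shell=True)
```

With this, each worker only picks up tasks sent to its own queue, so the three tasks can run in parallel instead of queuing up behind each other.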


2 Comments

Hey @DejanLekic, I'm aware of how to call the queues, but I don't know how to initiate those queues before calling them in the app.
If you use all the defaults, there is no need to initiate them; they will be created automatically for you by Celery. Also, you do not "call" a queue. You send a particular task to the queue of your choice.
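To illustrate the point about sending tasks to a queue, a minimal configuration sketch (queue names are illustrative): you can either route each task to its queue declaratively with `task_routes`, or pick the queue at call time with `apply_async`.

```python
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379")

# Route each task to its own queue; Celery creates the queues automatically
# when you use the default settings.
app.conf.task_routes = {
    "task.main": {"queue": "main_queue"},
    "task.gain": {"queue": "gain_queue"},
    "task.lain": {"queue": "lain_queue"},
}

# Alternatively, choose the queue per call instead of configuring routes:
# main.apply_async(args=(capital, gamma, alpha), queue="main_queue")
```

With `task_routes` set, plain `main.delay(...)` calls are already routed to `main_queue` without passing `queue=` every time.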
