
I have three separate Django projects sharing the same database, running on the same machine. What I need is to configure Celery for them. My question is:

1.) Should I run separate Celery daemons for the separate projects, with different vhosts and users in RabbitMQ? I'd rather not, as that seems like a waste of resources. Or,

2.) Is there a way to route the tasks from all the projects to a single Celery server?

Also, how handy would supervisord be in the solution?

1 Answer


Yes, you can use the same Celery worker to receive tasks from separate projects.

Have a separate Celery app (or just a single file), say foo, that contains all the tasks used across the different projects.

# foo.py
from celery import Celery

# Single shared app; every project will publish tasks to this broker.
app = Celery(broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

@app.task
def sub(x, y):
    return x - y

Start a worker to run the tasks:

celery worker -l info -A foo

Now from Project A, you can call add

import celery

celery.current_app.send_task('foo.add', args=(1, 2))

And from Project B, you can call sub

import celery

celery.current_app.send_task('foo.sub', args=(1, 2))

You can use supervisord to manage the Celery worker.
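A supervisord program entry for the worker might look like the following sketch (the paths are hypothetical; adjust them to the actual virtualenv and project directory):

```ini
[program:celery-foo]
; Hypothetical paths -- replace with your virtualenv and project location.
command=/path/to/venv/bin/celery worker -l info -A foo
directory=/path/to/project
autostart=true
autorestart=true
; Give long-running tasks time to finish before a forced kill.
stopwaitsecs=600
```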

This approach might make testing slightly harder, as send_task won't respect CELERY_ALWAYS_EAGER. However, you can use this snippet so that CELERY_ALWAYS_EAGER will be honored by send_task.


2 Comments

And how do Project A and Project B know about how to connect to the broker? Is there a piece missing, or do I miss something?
Maybe this was my missing piece
