
I've deployed Django (1.10) + Celery (4.x) on a single VM, with RabbitMQ as the broker (on the same machine). I now want to run the same application on a multi-node architecture, where I can simply replicate a number of worker nodes to scale out and run tasks quickly. For that:

  1. How do I configure Celery with RabbitMQ for this architecture?
  2. What should the setup be on the other worker nodes?
  • ChillarAnand's solution should have solved this for you. Is there anything else that needs clarifying? Commented Feb 14, 2017 at 18:05

1 Answer


You should have the broker on one node and configure it so that workers from other nodes can access it.

For that, you can create a new user/vhost on RabbitMQ:

# add new user
sudo rabbitmqctl add_user <user> <password>

# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>

# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"

# restart rabbitmq (rabbitmqctl itself has no restart command; use the service manager)
sudo systemctl restart rabbitmq-server
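The worker nodes also need network access to the broker; with a default setup, RabbitMQ listens on port 5672, so that port must be reachable from the other nodes. A quick sanity check (the broker IP 10.0.0.5 is a placeholder; substitute your own):

```shell
# from a worker node: verify the broker port is reachable
nc -zv 10.0.0.5 5672

# on the broker node: confirm the permissions took effect
sudo rabbitmqctl list_permissions -p <vhost_name>
```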

From other nodes, you can queue up tasks or you can just run workers to consume tasks.

from celery import Celery

app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

@app.task
def add(x, y):
    return x + y

If you have a file (say task.py) like this, you can queue up tasks using add.delay().
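One practical detail when building that broker URL: the credentials are embedded in it, so any special characters in the password must be percent-encoded. A small stdlib sketch (the user, password, host, and vhost values are hypothetical):

```python
from urllib.parse import quote

def broker_url(user, password, host, vhost):
    """Build an AMQP broker URL, percent-encoding the credentials."""
    return 'amqp://{}:{}@{}/{}'.format(
        quote(user, safe=''), quote(password, safe=''), host, quote(vhost, safe=''))

# a password containing '@' or '/' would otherwise break URL parsing
print(broker_url('myuser', 'p@ss/word', '10.0.0.5', 'myvhost'))
# → amqp://myuser:p%40ss%2Fword@10.0.0.5/myvhost
```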

You can also start a worker with

celery -A task worker -l info

You can see my answer here to get a brief idea of how to run tasks on remote machines. For a step-by-step process, you can check out a post I have written on scaling Celery.
