
I am testing out Celery as a multi-processing cluster application, and I am looking for some hints on whether this is the correct way to do it or not.

I am using:

Python 2.6.6, celery-3.1.7, Django14-1.4.8-1.el6.noarch on CentOS 6.4

I have set up two Celery projects, i.e. non-Django and Django.

Non-Django project directory:

/usr/local/proj
├── celery.py
├── celery.pyc
├── __init__.py
├── __init__.pyc
├── tasks.py
└── tasks.pyc

Django project directory

/usr/local/proj.django/
├── demoapp
│   ├── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── django.wsgi
├── manage.py
└── proj
    ├── celery.py
    ├── __init__.py
    ├── __init__.pyc
    ├── settings.py
    ├── settings.pyc
    ├── urls.py
    ├── urls.pyc
    ├── wsgi.py
    └── wsgi.pyc
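For comparison, the `proj/celery.py` in the Django project would typically follow the Celery 3.1 Django integration pattern: point Celery at the Django settings module and autodiscover tasks from the installed apps. A sketch, assuming the settings module matches the layout above:

```python
# proj/celery.py -- standard Celery 3.1 Django integration pattern (sketch).
from __future__ import absolute_import

import os

from celery import Celery

# Make the Django settings importable before the app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

from django.conf import settings

app = Celery('proj')
app.config_from_object('django.conf:settings')
# Look for a tasks.py in every app listed in INSTALLED_APPS (e.g. demoapp).
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```

Any node running a worker for this project needs Django, the settings module, and the app code importable, which is why deployment across nodes matters below.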

The non-Django project is mounted over NFS on the Celery servers; I am able to submit tasks from tasks.py and check their status in Celery Flower.

However, I am confused about how to do the same via the Django Celery project.

Questions:

  • Do I need to share /usr/local/proj.django/proj via NFS to all Celery nodes?
  • Is it possible for the Django project to use the non-Django Celery project for all Celery tasks?

I might sound stupid here, so apologies.

Any advice would be appreciated.

1 Answer


There is absolutely no need to mount anything; you should be able to run Celery workers individually, each listening to the queue.

If your tasks have to use Django code, as I understand they do, consider replicating your code on all Celery nodes: for example, all nodes should pull from the same Git repository, preferably even using a tool like Fabric.
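That kind of replication can be scripted with a small fabfile. A sketch using the Fabric 1.x API; the host names, project path, and service name are assumptions for illustration:

```python
# fabfile.py -- hypothetical deployment sketch with Fabric 1.x.
# Host names, the project path, and the service name are placeholders.
from fabric.api import cd, env, run

env.hosts = ['node1.example.com', 'node2.example.com']


def deploy():
    """Pull the latest task code on every Celery node, then restart workers."""
    with cd('/usr/local/proj'):
        run('git pull')
    # Workers cache imported task code, so restart them to pick up changes.
    run('sudo service celeryd restart')
```

Running `fab deploy` would then update every node in one step instead of relying on a shared NFS mount.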

One thing to be aware of: enable remote access to your database (e.g. MySQL) from all nodes.
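In Django terms, that means the `DATABASES` setting on every node has to point at the shared database server rather than localhost. A settings fragment with placeholder host and credentials:

```python
# settings.py fragment -- placeholder values for illustration.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'proj',
        'USER': 'proj_user',
        'PASSWORD': 'changeme',
        # Use the database server's address, not 'localhost',
        # so tasks running on any Celery node can reach it.
        'HOST': 'db.example.com',
        'PORT': '3306',
    }
}
```

The MySQL server itself also needs to grant that user access from the worker nodes' addresses, not just from its own host.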

EDIT: by "no need to mount" I mean: have all the needed tools installed on all nodes (i.e. pip install django celery ...), and have your code pulled from a central repo (e.g. Git) as well. Depending on your hosting, you could even replicate the machine itself and have identical machines all listening to the central queue.

EDIT 2:

Do I need to share /usr/local/proj.django/proj via NFS to all Celery nodes?

I guess it can work, though a better practice would be to pull from a central repo; see above.

Is it possible for the Django project to use the non-Django Celery project for all Celery tasks?

The question is where you run `celery worker` and with which parameters; see the Celery workers guide. With Celery 3.1 you would typically run something like `celery -A proj worker --loglevel=info` from the project directory on each node.

As a side note, consider installing/enabling the RabbitMQ management plugin to see exactly what is going on.


3 Comments

When you say "no need to mount anything", do you mean that the Celery project need not be on all the nodes, and that the Celery daemon will pick up the tasks from RabbitMQ?
Hi, could you clarify why you are using an NFS mount point? Is it just to share the code between the different servers?
Thanks for the clarification. I used NFS here to share the code across servers; I will use Git on the nodes later, but I want to get Django working with Celery first. So what I want to know is whether I need to install Django along with the Celery app on all the nodes.
