I am testing out Celery as a multi-processing cluster application, and I am looking for some hints on whether this is the correct way to do it or not.
I am using Python 2.6.6, celery-3.1.7, and Django14-1.4.8-1.el6.noarch on CentOS 6.4.
I have set up two Celery projects, i.e. non-Django and Django:
Non-Django project directory:
/usr/local/proj
├── celery.py
├── celery.pyc
├── __init__.py
├── __init__.pyc
├── tasks.py
└── tasks.pyc
Django project directory:
/usr/local/proj.django/
├── demoapp
│   ├── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── django.wsgi
├── manage.py
└── proj
    ├── celery.py
    ├── __init__.py
    ├── __init__.pyc
    ├── settings.py
    ├── settings.pyc
    ├── urls.py
    ├── urls.pyc
    ├── wsgi.py
    └── wsgi.pyc
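The proj/celery.py on the Django side follows (as far as I can tell) the Django example from the Celery 3.1 docs, with my project's module names filled in:

```python
# /usr/local/proj.django/proj/celery.py -- the Celery 3.1 Django
# integration pattern from the docs; 'proj.settings' is my project's
# settings module.
from __future__ import absolute_import

import os

from celery import Celery

# Make sure Django settings are importable before the app configures.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read broker/result settings from Django's settings module and find
# tasks.py in each installed app (e.g. demoapp/tasks.py).
app.config_from_object('django.conf:settings')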
The non-Django project is mounted over NFS on the Celery servers; I am able to submit tasks from tasks.py, and I can check their status in Celery Flower.
However, I am confused about how to do the same via the Celery Django project.
Questions:
- Do I need to share /usr/local/proj.django/proj via NFS to all Celery nodes?
- Is it possible for the Django project to use the non-Django Celery project for all Celery tasks?
I might sound stupid here, so apologies.
Any advice would be appreciated.