I want to test basic Celery functionality in the official Celery demo project for Django, with code available here.
My only modification is demoapp/views.py (and of course urls.py, to point to this view):
from __future__ import absolute_import, unicode_literals
from django.http import HttpResponse
from demoapp.tasks import add, mul, xsum
def home(request):
    return HttpResponse("Your output is: %s" % mul.delay(22, 4).get(timeout=1))
This always raises a timeout error, even though the terminal of the running Celery worker shows that the task was received and displays the correct return value.
However, if I start a Python shell with python ./manage.py shell and then run
from demoapp.tasks import add, mul, xsum
mul.delay(22,4).get(timeout=1)
I immediately get the expected result. What could be the problem?
The RabbitMQ server is running; I'm using Celery 4.2.1 and Django 2.0.6.
demoapp/tasks.py:
from __future__ import absolute_import, unicode_literals
from celery import shared_task
@shared_task
def add(x, y):
    return x + y

@shared_task
def mul(x, y):
    return x * y

@shared_task
def xsum(numbers):
    return sum(numbers)
proj/celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
proj/settings.py:
...
CELERY_BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = 'db+sqlite:///results.sqlite'
CELERY_TASK_SERIALIZER = 'json'
...
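As a debugging sanity check (my own idea, not part of the demo project), I know Celery can be told to run tasks synchronously in the calling process, bypassing the broker and result backend entirely. With the CELERY_ namespace configured in proj/celery.py, the prefixed settings would be:

```python
# settings.py -- debugging only: run tasks eagerly in the web process.
# If the view then returns the right answer, the problem is in result
# delivery (the result backend), not in the task code itself.
CELERY_TASK_ALWAYS_EAGER = True      # execute tasks inline instead of sending them to the broker
CELERY_TASK_EAGER_PROPAGATES = True  # re-raise task exceptions in the caller
```

If the view works with these settings enabled but still times out without them, that would suggest the web process cannot read results back from the db+sqlite backend.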