I have some "heavy" database requests that I'm going to execute using Celery. Since they are "heavy", I want to execute them sequentially (one by one). One possible solution is to pass --concurrency=1 on the Celery command line (I show the exact worker command below), and it works. But there is a problem: while a task is being executed, all of the following inspect calls return None:
from celery.task.control import inspect
# Inspect all nodes.
i = inspect()
print(i.scheduled()) # None
print(i.active()) # None
print(i.reserved()) # None
print(i.registered()) # None
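For reference, this is roughly how I start the worker (-A tasks refers to the tasks.py module shown below):
celery -A tasks worker --loglevel=info --concurrency=1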
Also, running celery inspect ping returns Error: No nodes replied within time constraint. So I can't get any information about the state of the Celery queue at all.
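That is, while one of the tasks is running (again assuming -A tasks):
$ celery -A tasks inspect ping
Error: No nodes replied within time constraint.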
Here are my test Python modules:
celeryconfig.py
#BROKER_URL = 'redis://localhost:6379/0'
BROKER_URL = 'amqp://'
#CELERY_RESULT_BACKEND = "redis"
CELERY_RESULT_BACKEND = "amqp://"
# for php
CELERY_TASK_RESULT_EXPIRES = None
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACKS_LATE = True
tasks.py
from celery import Celery
from time import sleep
app = Celery('hello')
app.config_from_object('celeryconfig')
@app.task
def add(x, y):
    sleep(30)  # stands in for one of the "heavy" database requests
    return x + y
client.py
from tasks import add
# Queue six tasks; each call overwrites the previous result reference.
result = add.delay(4, 4)
result = add.delay(4, 4)
result = add.delay(4, 4)
result = add.delay(4, 4)
result = add.delay(4, 4)
result = add.delay(4, 4)
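For what it's worth, I can check an individual task through its AsyncResult, but that only covers tasks whose result object I still hold, not the queue as a whole:
print(result.state)  # e.g. 'PENDING' while the last task is still waiting in the queue
print(result.get())  # blocks until the last task finishes, then prints 8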
So, the question is: how can I run the tasks one by one AND still be able to check the state of the queue?