
I have a Django application that uses Redis and Celery for some asynchronous tasks. I am using the Celery tasks to execute some stored procedures. Each SP takes from 5 to 30 minutes to complete (depending on the number of records). Everything works great, but I need to be able to run the tasks several times: right now, when I run a task and another user runs the same task, the two tasks execute at the same time. I need the second task to enter a queue and run only when the first task finishes. My settings.py:

BROKER_URL = 'redis://localhost:6379/0'
CELERY_IMPORTS = ("pc.tasks", )
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_RESULT_BACKEND = 'djcelery.backends.cache:CacheBackend'

tasks.py

from __future__ import absolute_import
from celery import task  # celery.decorators is deprecated; import task directly
from .models import Servicio, Proveedor, Lectura_FTP, Actualizar_Descarga
from .models import Lista_Archivos, Lista_Final, Buscar_Conci

@task
def carga_ftp():
    tabla = Proc_Carga()
    sp = tabla.carga()
    return None

@task
def conci(idprov,pfecha):
    conci = Buscar_Conci()
    spconc = conci.buscarcon(idprov,pfecha)

I call the tasks in my view in this way:

conci.delay(prov, DateV)

How can I create or set up a task queue so that each task is executed only when the previous one has finished?

Thanks in advance

2 Answers


You can limit the worker concurrency. In your case, I assume you need only one task running at a time, so start a single worker process when launching djcelery:

python manage.py celery worker -B --concurrency=1

1 Comment

I only have one doubt: I can execute different tasks at the same time by limiting the workers, but I need a queue only for the task "conci". For the other tasks it doesn't matter if they are called several times.
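If only conci must run serially, a common approach is to route that one task to its own queue and drain it with a dedicated single-process worker, while a second worker handles everything else in parallel. A sketch, where the queue name conci_queue is my assumption, not from the original project:

```python
# settings.py sketch: send only the conci task to a dedicated queue.
# "conci_queue" is an assumed name; other tasks keep the default queue.
CELERY_ROUTES = {
    'pc.tasks.conci': {'queue': 'conci_queue'},
}
```

Then start one worker per queue, e.g. `python manage.py celery worker -Q conci_queue --concurrency=1` for the serial queue and `python manage.py celery worker -Q celery` for the default one.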

You can use a lock. For example (from one of my projects):

import logging
import redis

def send_queued_emails(*args, **kwargs):
    from mailer.models import Message
    my_lock = redis.Redis().lock("send_mail")

    try:
        have_lock = my_lock.acquire(blocking=False)
        if have_lock:
            logging.info("send_mail lock ACQUIRED")
            from celery import group

            if Message.objects.non_deferred().all().count() > 0:
                # The last task in the chain releases the lock again.
                t = EmailSenderTask()
                g = (group(t.s(message=msg) for msg in Message.objects.non_deferred().all()[:200])
                     | release_redis_lock.s(lock_name="send_mail"))
                g()
            else:
                logging.info("send_mail lock RELEASED")
                my_lock.release()
        else:
            logging.info("send_mail lock NOT ACQUIRED")

    except redis.ResponseError as e:
        logging.error("Redis threw an exception: {}".format(e))
    except Exception:
        my_lock.release()
        logging.error("send_mail lock RELEASED because of exception")

