157

The Celery documentation mentions testing Celery within Django but doesn't explain how to test a Celery task if you are not using Django. How do you do this?


11 Answers

90

It is possible to test tasks synchronously using any unittest library out there. I normally run two different test sessions when working with celery tasks. The first one (as I'm suggesting below) is completely synchronous and should be the one that makes sure the algorithm does what it should do. The second session uses the whole system (including the broker) and makes sure I'm not having serialization issues or any other distribution or communication problems.

So:

from celery import Celery

celery = Celery()

@celery.task
def add(x, y):
    return x + y

And your test:

from nose.tools import eq_

def test_add_task():
    rst = add.apply(args=(4, 4)).get()
    eq_(rst, 8)

4 Comments

That works except on tasks which use a HttpDispatchTask - docs.celeryproject.org/en/latest/userguide/remote-tasks.html where I have to set celery.conf.CELERY_ALWAYS_EAGER = True but even with also setting celery.conf.CELERY_IMPORTS = ('celery.task.http') the test fails with NotRegistered: celery.task.http.HttpDispatchTask
Weird, are you sure you're not having some import issues? This test works (note that I'm faking the response so it returns what celery expects). Also, modules defined in CELERY_IMPORTS will be imported during worker initialization; to avoid this I suggest you call celery.loader.import_default_modules().
I would also suggest you take a look here. It mocks the http request. Don't know if it helps; I guess you want to test a service that is up and running, don't you?
What's the difference/advantage of using task.apply().get() versus using the eager flag? Thanks
73

Here is an update to my seven-year-old answer:

You can run a worker in a separate thread via a pytest fixture:

https://docs.celeryq.dev/en/v5.2.6/userguide/testing.html#celery-worker-embed-live-worker

According to the docs, you should not use "always_eager" (see the top of the page of the above link).


Old answer:

I use this:

from unittest import mock

with mock.patch('celeryconfig.CELERY_ALWAYS_EAGER', True, create=True):
    ...

Docs: https://docs.celeryq.dev/en/3.1/configuration.html#celery-always-eager

CELERY_ALWAYS_EAGER lets you run your task synchronously, and you don't need a celery server.

9 Comments

I think this is outdated - I get ImportError: No module named celeryconfig.
I believe the above assumes the module celeryconfig.py exists in one's package. See docs.celeryproject.org/en/latest/getting-started/….
I know it's old, but can you provide a full example of how to launch the add task from the OP's question within a TestCase class?
@miken32 thanks. As the most recent answer somehow tackles the problem I wanted to help with, I just left a comment that the official docs for 4.0 discourages use of CELERY_TASK_ALWAYS_EAGER for unit testing.
Why do the docs discourage the use of CELERY_TASK_ALWAYS_EAGER for tests? There is no explanation and I don't see the logic of this recommendation.
43

Depends on what exactly you want to be testing.

  • Test the task code directly: don't call "task.delay(...)", just call "task(...)" from your unit tests.
  • Use CELERY_ALWAYS_EAGER. This will cause your tasks to be called immediately at the point you say "task.delay(...)", so you can test the whole path (but not any asynchronous behavior).

Comments

42

For those on Celery 4 it's:

@override_settings(CELERY_TASK_ALWAYS_EAGER=True)

Because the settings names have been changed and need updating if you choose to upgrade, see

https://docs.celeryproject.org/en/latest/history/whatsnew-4.0.html?highlight=what%20is%20new#lowercase-setting-names

2 Comments

According to the official docs, use of "task_always_eager" (earlier "CELERY_ALWAYS_EAGER") is not suitable for unit testing. Instead they propose some other, great ways to unit test your Celery app.
I'll just add that the reason why you don't want eager tasks in your unit tests is because then you're not testing e.g. the serialization of parameters that will happen once you are using the code in production.
40

unittest

import unittest

from myproject.myapp import celeryapp

class TestMyCeleryWorker(unittest.TestCase):

  def setUp(self):
      celeryapp.conf.update(CELERY_ALWAYS_EAGER=True)

py.test fixtures

# conftest.py
import pytest

from myproject.myapp import celeryapp

@pytest.fixture(scope='module')
def celery_app(request):
    celeryapp.conf.update(CELERY_ALWAYS_EAGER=True)
    return celeryapp
    return celeryapp

# test_tasks.py
def test_some_task(celery_app):
    ...

Addendum: make send_task respect eager

from celery import current_app

def send_task(name, args=(), kwargs=None, **opts):
    # https://github.com/celery/celery/issues/581
    task = current_app.tasks[name]
    return task.apply(args, kwargs or {}, **opts)

current_app.send_task = send_task

Comments

19

As of Celery 3.0, one way to set CELERY_ALWAYS_EAGER in Django is:

from django.test import TestCase, override_settings

from .foo import foo_celery_task

class MyTest(TestCase):

    @override_settings(CELERY_ALWAYS_EAGER=True)
    def test_foo(self):
        self.assertTrue(foo_celery_task.delay())

1 Comment

it seems it does not work when the celery task is inside a function.
15

Since Celery v4.0, py.test fixtures are provided to start a celery worker just for the test and are shut down when done:

def test_myfunc_is_executed(celery_session_worker):
    # celery_session_worker: <Worker: [email protected] (running)>
    assert myfunc.delay().wait(3)

Among other fixtures described on http://docs.celeryproject.org/en/latest/userguide/testing.html#py-test, you can change the celery default options by redefining the celery_config fixture this way:

@pytest.fixture(scope='session')
def celery_config():
    return {
        'accept_content': ['json', 'pickle'],
        'result_serializer': 'pickle',
    }

By default, the test worker uses an in-memory broker and result backend. No need to use a local Redis or RabbitMQ if not testing specific features.

3 Comments

Dear downvoter, would you like to share why is this a bad answer? Sincerely thanks.
Didn't work for me, the test suite just hangs. Could you provide some more context? (I didn't vote yet though ;) ).
In my case I had to explicitly set the celery_config fixture to use the memory broker and the cache+memory backend
14

Reference: using pytest.

def test_add(celery_worker):
    mytask.delay()

if you use flask, set the app config

    CELERY_BROKER_URL = 'memory://'
    CELERY_RESULT_BACKEND = 'cache+memory://'

and in conftest.py

import pytest

@pytest.fixture
def app():
    yield flask_app   # placeholder: your actual Flask application object

@pytest.fixture
def celery_app(app):
    from celery.contrib.testing import tasks   # needed: registers the ping task used by the test worker
    yield flask_celery_app   # placeholder: your actual Flask-Celery application

3 Comments

Any idea this app config with memory and cache would also work with django.test ?
I read everything on the internet but this is the only thing which shed some light in the tunnel for me to call a chain of tasks with the Unittest library. I also used task_always_eager (and I know it is not the best) but I only wanted to test the error handling in a chain.
2022, using celery==5.2.3, and this was the only thing that actually worked for me, in addition to setting CELERY_SETTINGS = {"broker_url": "memory://", "result_backend": "cache+memory://", "task_always_eager": True}
6

In my case (and I assume many others), all I wanted was to test the inner logic of a task using pytest.

TL;DR: I ended up mocking everything away (OPTION 2)


Example Use Case:

proj/tasks.py

@shared_task(bind=True)
def add_task(self, a, b):
    return a + b

tests/test_tasks.py

from proj.tasks import add_task

def test_add():
    assert add_task(1, 2) == 3, '1 + 2 should equal 3'

but, since the shared_task decorator does a lot of celery internal logic, it isn't really a unit test.

So, for me, there were 2 options:

OPTION 1: Separate internal logic

proj/tasks_logic.py

def internal_add(a, b):
    return a + b

proj/tasks.py

from .tasks_logic import internal_add

@shared_task(bind=True)
def add_task(self, a, b):
    return internal_add(a, b)

This looks very odd, and besides making it less readable, it requires manually extracting and passing attributes that are part of the request (for instance the task_id, in case you need it), which makes the logic less pure.

OPTION 2: mocks
mocking away celery internals

tests/__init__.py

# noinspection PyUnresolvedReferences
from celery import shared_task

from unittest.mock import patch  # `from mock import patch` on older Pythons


def mock_signature(**kwargs):
    return {}


def mocked_shared_task(*decorator_args, **decorator_kwargs):
    def mocked_shared_decorator(func):
        func.signature = func.si = func.s = mock_signature
        return func

    return mocked_shared_decorator

patch('celery.shared_task', mocked_shared_task).start()

which then allows me to mock the request object (again, in case you need things from the request, like the id or the retries counter).

tests/test_tasks.py

from proj.tasks import add_task

class MockedRequest:
    def __init__(self, id=None):
        self.id = id or 1


class MockedTask:
    def __init__(self, id=None):
        self.request = MockedRequest(id=id)


def test_add():
    mocked_task = MockedTask(id=3)
    assert add_task(mocked_task, 1, 2) == 3, '1 + 2 should equal 3'

This solution is much more manual, but, it gives me the control I need to actually unit test, without repeating myself, and without losing the celery scope.

Comments

2

I see a lot of CELERY_ALWAYS_EAGER = True in unit-test methods as a solution, but since version 5.0.5 there have been many changes that make most of the old answers deprecated (and, for me, time-consuming dead ends). For everyone here searching for a solution, go to the docs and read the well-documented unit-test examples for the new version:

https://docs.celeryproject.org/en/stable/userguide/testing.html

And to the Eager Mode with Unit Tests, here a quote from the actual docs:

Eager mode

The eager mode enabled by the task_always_eager setting is by definition not suitable for unit tests.

When testing with eager mode you are only testing an emulation of what happens in a worker, and there are many discrepancies between the emulation and what happens in reality.

1 Comment

The docs seem to be just for pytest, not unittest, which is the default for Django. It would be cool if they had some example of using the standard Django testing setup.
-1

Another option is to mock the task if you do not need the side effects of running it.

from unittest import mock


@mock.patch('module.module.task')
def test_name(self, mock_task): ...
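A fuller stdlib-only sketch of this idea (the place_order function and the task it schedules are hypothetical): hand the calling code a Mock in place of the task, then assert that it scheduled the task correctly:

```python
from unittest import mock

def place_order(order_id, send_receipt_task):
    # production code would normally call a real Celery task's .delay() here
    send_receipt_task.delay(order_id)
    return order_id

def test_place_order_schedules_receipt():
    fake_task = mock.Mock()           # stands in for the Celery task object
    assert place_order(42, fake_task) == 42
    fake_task.delay.assert_called_once_with(42)

test_place_order_schedules_receipt()
```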

Comments
