I have the following setup:
- Django-Celery project A registers a task, foo.
- Project B uses Celery's send_task to call foo.
- Project A and project B have the same configuration: SQS, msgpack for serialization, gzip compression, etc. (a rough sketch of the shared settings follows this list).
- Each project lives in a different GitHub repository.
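For concreteness, the shared settings look roughly like this (the broker URL is a placeholder, and the setting names assume the old django-celery / Celery 3 style):

```python
# settings.py (sketch) -- the Celery configuration both projects share.
# The broker URL is a placeholder; AWS credentials are picked up from the
# environment in the real projects.
BROKER_URL = "sqs://"
CELERY_TASK_SERIALIZER = "msgpack"
CELERY_RESULT_SERIALIZER = "msgpack"
CELERY_ACCEPT_CONTENT = ["msgpack"]
CELERY_MESSAGE_COMPRESSION = "gzip"
```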
I've unit-tested calls to foo in project A without using Celery at all: just call foo(1, 2, 3) and assert the result. I know that it works.
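That test is essentially just this (the module path, signature, and expected value of foo are made up):

```python
# project A: tests/test_tasks.py (sketch)
from myapp.tasks import foo  # hypothetical module path


def test_foo_direct_call():
    # Call the task function directly; no broker, no worker, no Celery.
    assert foo(1, 2, 3) == 6  # placeholder expectation
```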
I've unit-tested that send_task in project B sends the right parameters.
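That test looks roughly like this (the caller module, task name, and wrapper function are made up; I just patch send_task and inspect the call):

```python
# project B: tests/test_client.py (sketch)
from unittest import mock

from myclient.jobs import trigger_foo  # hypothetical wrapper around send_task


def test_trigger_foo_sends_right_parameters():
    # Patch send_task on the Celery app as the caller module sees it.
    with mock.patch("myclient.jobs.app.send_task") as send_task:
        trigger_foo(1, 2, 3)
    send_task.assert_called_once_with("foo", args=(1, 2, 3))
```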
What I'm not testing, and need your advice on, is the integration between the two projects. I would like a unit test that would:
- Start a worker in the context of project A
- Send a task using the code of project B
- Assert that the worker started in the first step receives the task with the parameters I sent in the second step, and that the foo function returned the expected result (see the sketch after this list).
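The closest I've come to sketching the test I want is something like the following, but I don't know whether this is the right approach or whether it even applies to my django-celery setup. It assumes Celery's celery.contrib.testing helpers, that both projects are importable from one test environment, and that both apps point at the same broker and a result backend in their test settings; all import paths are made up.

```python
# test_integration.py -- sketch of the test I'd like to have, not working code.
from celery.contrib.testing.worker import start_worker

from project_a.celery_app import app as worker_app  # the app that registers foo
from project_b.client import send_foo                # calls app.send_task("foo", ...)


def test_foo_end_to_end():
    # Step 1: start an in-process worker using project A's app.
    with start_worker(worker_app, perform_ping_check=False):
        # Step 2: send the task through project B's code.
        result = send_foo(1, 2, 3)
        # Step 3: the worker should have run foo; the expected value is a placeholder.
        assert result.get(timeout=10) == 6
```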
It seems possible to hack this together using Python's subprocess module and parsing the worker's output, but that's ugly. What's the recommended approach to unit testing in cases like this? Any code snippet you could share? Thanks!