I am integrating the Celery 4 task queue into my Pyramid web server. The documentation seems a little sparse, though. I found a somewhat dated pyramid_celery module that is supposed to parse Pyramid’s .ini files and make their settings usable for configuring Celery. There is also an even older blog entry that uses a similar idea but a different Celery signal.
After some experimenting I found a solution that seems much simpler (i.e. it doesn’t require any of the Celery signals): it works, and it even spins up the Celery worker, inspired by this SO question.
I am curious to hear people’s thoughts on this implementation, and in particular whether spawning the Celery worker as a sub-process of Pyramid is considered “good” or “bad” practice.
The Pyramid server’s development.ini file contains:
pyramid.includes =
    pyramid_tm
    srv.celery
where srv.celery is a module that is part of the Pyramid server. This module’s __init__.py file contains all the magic:
import multiprocessing

import celery


def includeme(config):
    pass


# Configuration settings for the Celery instance.
_celery_config = dict(
    broker_url="redis://localhost:6379/0",
    result_backend="redis://localhost:6379/0",
    imports="srv.celery.tasks",
    # More configuration can go here.
)

# Instantiate Celery.
celery_app = celery.Celery("my-websrv")
celery_app.conf.update(**_celery_config)


# Define a class that wraps the Celery worker process.
class CeleryWorkerProcess(multiprocessing.Process):

    def __init__(self):
        super().__init__(name='celery_worker_process')

    # This function is called when the process is started, thus
    # creating a new child-process of the web-server that runs
    # the Celery worker.
    def run(self):
        argv = [
            'worker',
            '--loglevel=info',
            '--quiet',
        ]
        celery_app.worker_main(argv)


# Create the Celery worker process and run it.
_celery_worker = CeleryWorkerProcess()
_celery_worker.start()

print("Started Celery", celery_app, "and worker in PID", _celery_worker.pid)
This solution does not consider Pyramid’s .ini file configuration, but maybe that is just fine. It also does not work in a daemonic setup.
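If .ini-based configuration turns out to be needed after all, includeme() already receives Pyramid’s Configurator, so the settings could be pulled from there. A rough sketch, assuming hypothetical celery.* keys in the [app:main] section of development.ini:

def includeme(config):
    # Sketch only: override the hard-coded defaults with values from
    # the .ini, falling back to _celery_config where a key is absent.
    settings = config.get_settings()
    celery_app.conf.update(
        broker_url=settings.get('celery.broker_url', _celery_config['broker_url']),
        result_backend=settings.get('celery.result_backend', _celery_config['result_backend']),
    )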
When I pserve the project, both the web server and the Celery worker process spin up and respond. For now, the only task is the add() from the basic example (a sketch of the tasks module follows the log below):
Started Celery <Celery my-websrv:0x10f6f20f0> and worker in PID 51926
Starting server in PID 51923.
Serving on http://0.0.0.0:6543
[2016-12-04 10:00:26,796: INFO/celery_worker_process] Connected to redis://localhost:6379/0
[2016-12-04 10:00:26,805: INFO/celery_worker_process] mingle: searching for neighbors
[2016-12-04 10:00:27,825: INFO/celery_worker_process] mingle: all alone
[2016-12-04 10:00:27,840: INFO/celery_worker_process] [email protected] ready.
[2016-12-04 10:01:00,067: INFO/celery_worker_process] Received task: srv.celery.tasks.add[ddf286c9-f34c-4d70-8c49-2570a5830843]
[2016-12-04 10:01:00,075: INFO/PoolWorker-1] Task srv.celery.tasks.add[ddf286c9-f34c-4d70-8c49-2570a5830843] succeeded in 0.005526618973817676s: 4
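For reference, a minimal srv/celery/tasks.py matching this output could look like the following sketch; add() is just the task from Celery’s basic example:

from srv.celery import celery_app

@celery_app.task
def add(x, y):
    return x + y

A Pyramid view (or any other caller) can then enqueue it with add.delay(2, 2).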