I’m trying to run Apache Airflow 3.1 with the CeleryExecutor using Docker Compose (together with Redis and Postgres).
My problem: when I trigger a DAG (either from the command line or, as intended, via the FastAPI-based API server), the DAG run is created in the metadata DB in state `queued`, but no TaskInstances ever leave the queued state.
The Celery workers are running and sending heartbeats to Redis, but no tasks are dispatched to them.
Here’s the relevant part of my docker-compose.yml (shortened to the Airflow/Redis/Postgres services; it is essentially https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml):
```yaml
x-airflow-common:
  &airflow-common
  build: .
  environment:
    &airflow-common-env
    AIRFLOW__API__AUTH_BACKENDS: "airflow.api.auth.backend.session"
    AIRFLOW__API__WORKERS: 1
    AIRFLOW__API__ACCESS_LOGFILE: '-'
    AIRFLOW__API__ERROR_LOGFILE: '-'
    AIRFLOW__LOGGING__LOGGING_LEVEL: DEBUG
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__AUTH_MANAGER: airflow.providers.fab.auth_manager.fab_auth_manager.FabAuthManager
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__CORE__EXECUTION_API_SERVER_URL: 'http://airflow-apiserver:8080/execution/'
    AIRFLOW_CONN_SUPABASE_CONN: 'postgresql://postgres:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}'
    AIRFLOW__LOGGING__REMOTE_LOGGING: "False"
    AIRFLOW__SCHEDULER__ENABLE_HEALTH_CHECK: 'true'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
    AIRFLOW_CONFIG: '/opt/airflow/config/airflow.cfg'
  volumes:
    - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
    - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs
    - ${AIRFLOW_PROJ_DIR:-.}/config:/opt/airflow/config
    - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins
    - ${AIRFLOW_PROJ_DIR:-.}/outputs:/opt/airflow/outputs
    - ${AIRFLOW_PROJ_DIR:-.}/inputs:/opt/airflow/inputs

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  redis:
    image: redis:7.2-bookworm
    expose:
      - 6379

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler

  airflow-worker:
    <<: *airflow-common
    command: celery worker

  airflow-apiserver:
    <<: *airflow-common
    command: api-server

  airflow-init:
    <<: *airflow-common
    entrypoint: /bin/bash
    command: -c "airflow db init"
```
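One sanity check I run first is to confirm that the scheduler, the worker, and the API server all resolve the same executor and broker settings (the service names below are the ones from my compose file; `airflow config get-value` is the stock Airflow CLI):

```shell
# Print the effective executor and broker URL inside each Airflow container.
# All three should report CeleryExecutor and redis://redis:6379/0.
for svc in airflow-scheduler airflow-worker airflow-apiserver; do
  echo "== $svc =="
  docker compose exec "$svc" airflow config get-value core executor
  docker compose exec "$svc" airflow config get-value celery broker_url
done
```

In my case all three containers print the same values, so the env vars seem to be applied everywhere.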
What I’ve checked so far:

- **Scheduler & worker processes:** `docker exec -it airflow-scheduler ps aux` → the scheduler process is running; `docker exec -it airflow-worker ps aux` → the Celery worker process is running.
- **Redis activity:** `docker exec -it redis redis-cli monitor` → I see continuous worker heartbeats (`PUBLISH ... worker.heartbeat`) and `BRPOP` calls, but no `LPUSH` with actual tasks.
- **Triggering a DAG:** `docker exec -it airflow-scheduler airflow dags trigger process_fmu_and_save_local_dag` → a DagRun is created in state `queued`, but its tasks remain stuck and are never executed.
- **Postgres confirms it:** `SELECT dag_id, run_id, state FROM dag_run ORDER BY run_id DESC LIMIT 5;` → `state = queued`; `SELECT dag_id, run_id, task_id, state FROM task_instance ORDER BY run_id DESC LIMIT 5;` → all tasks have `state = NULL` or `queued`, never `scheduled`/`running`.
- **Scheduler logs:** they keep showing:

```
... DEBUG - No tasks to consider for execution. ...
```
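One thing I still want to rule out: my compose file sets `AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'`, and as far as I understand, a run triggered on a paused DAG just sits in `queued` until the DAG is unpaused. So I plan to try (DAG and service names as above):

```shell
# If the DAG is paused, triggered runs stay "queued" indefinitely.
# Unpause it, re-trigger, and watch whether the TaskInstances start moving:
docker exec -it airflow-scheduler airflow dags unpause process_fmu_and_save_local_dag
docker exec -it airflow-scheduler airflow dags trigger process_fmu_and_save_local_dag
```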
Why does the scheduler never push tasks into Redis/Celery, even though:

- `CeleryExecutor` is configured,
- Redis is receiving heartbeats,
- the workers are connected and polling?

Is there anything wrong with my docker-compose.yml or environment variables (`AIRFLOW__CELERY__RESULT_BACKEND`, `AIRFLOW__CELERY__BROKER_URL`, etc.)?
What else can I check to debug why the scheduler is not enqueuing tasks?
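On the broker side, I can also look at the queue list directly instead of eyeballing `redis-cli monitor`. Celery with a Redis broker keeps each queue as a Redis list keyed by the queue name; Airflow’s default queue name is `default` (the `operators.default_queue` setting), though I’m assuming nothing here overrides it:

```shell
# Length of the Celery queue list in Redis. A nonzero value would mean tasks
# are enqueued but not consumed; zero matches "never enqueued at all".
docker exec -it redis redis-cli llen default

# List all keys to spot the actual queue name in case it differs:
docker exec -it redis redis-cli keys '*'
```

For me `llen default` returns 0, which fits the "scheduler never enqueues" symptom.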
I’m also worried that my server may simply not be powerful enough:

```
free -m
               total        used        free      shared  buff/cache   available
Mem:            5921        4053         450         100        1417        1468
Swap:           8191        1275        6916
```
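(The Airflow Docker Compose quickstart recommends at least ~4 GiB of memory for Docker, so ~1.4 GiB available with swap already in use does look tight. A quick check I use, with 2048 MiB as an arbitrary warning threshold of my own choosing:)

```shell
# Warn if "available" memory (column 7 of `free -m`) is below 2048 MiB.
avail_mb=$(free -m | awk '/^Mem:/ {print $7}')
if [ "${avail_mb}" -lt 2048 ]; then
  echo "low memory: ${avail_mb} MiB available"
else
  echo "memory ok: ${avail_mb} MiB available"
fi
```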