
I'm on Airflow 2.10.2. I'm using a DockerOperator to generate a list of configurations (a list of dicts). My goal is to pass those configurations to a TriggerDagRunOperator task, expanding over the list so that the external DAG is triggered once for each configuration I need. Something like:

task_generate_conf = DockerOperator(
   task_id='gen_conf',
   ...
)

task_process_configs = TriggerDagRunOperator.partial(
    task_id='process_configs',
    trigger_dag_id='External.DAG_ID',
    ...,
).expand(conf=task_generate_conf.return_value)

Airflow's documentation says that most operators push data to XCom via their return value, and that the returned value must be serializable (dict, str, list). The DockerOperator documentation, however, says nothing specific about this. Looking at the DockerOperator parameters, it seems the operator pushes either the content of stdout or a file path, but there isn't much detail on how that works. How do I achieve my goal? Should I write a JSON file and add a further task that reads it and pushes the configurations to XCom so that I can trigger the external DAG? Or is there a more direct route from the DockerOperator to the external DAG?

So far, when I try to return the list of configurations I get airflow.exceptions.UnmappableXComTypePushed: unmappable return type 'str', which is odd given that str is listed as a possible output in the documentation. Also, I'm pushing the entire list of dicts, so I don't see where the string comes from; I believe it refers to stdout, which is actually used by the logger.
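
For reference, here is roughly what I have in mind for that extra parsing step. This is only a sketch: the image name and command are placeholders, and it assumes the container prints the JSON-encoded list as its last line on stdout (which, as far as I understand, is what the DockerOperator pushes to XCom when do_xcom_push is enabled):

import json

from airflow.decorators import task
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.providers.docker.operators.docker import DockerOperator

task_generate_conf = DockerOperator(
    task_id='gen_conf',
    image='my-conf-image',                # placeholder image
    command='python generate_conf.py',    # assumed to print the JSON list as its last stdout line
    do_xcom_push=True,                    # push the last stdout line to XCom
)

@task
def parse_conf(raw: str) -> list[dict]:
    # The DockerOperator pushes a string, so convert it back into a list of dicts
    return json.loads(raw)

task_process_configs = TriggerDagRunOperator.partial(
    task_id='process_configs',
    trigger_dag_id='External.DAG_ID',
).expand(conf=parse_conf(task_generate_conf.output))

Is an intermediate parsing task like this really necessary, or does the DockerOperator offer something that avoids it?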
