The sample.sql file

INSERT INTO db.table
VALUES (
    '{{ ds }}',
    '${item1}',
    '{item2}'
)
;

The Airflow operator

        task = DatabricksSqlOperator(
            task_id=task_id,
            sql_endpoint_name="my-serverless-endpoint",
            sql="sample.sql",
            parameters={
                "item1": "abc",
                "item2": "xyz"
            },
        )

The resulting db.table

Column A     Column B    Column C
2022-05-05   ${item1}    {item2}

The macro {{ ds }} works just fine, but I can't figure out how to get the other parameters to work. DatabricksSqlOperator extends SQLExecuteQueryOperator: https://github.com/apache/airflow/blob/main/airflow/providers/databricks/operators/databricks_sql.py

I tried

  1. '${item1}',
  2. '{item1}',
  3. '{params["item1"]}',
  4. '{param.item1}',
  5. '{parameters["item1"]}',
  6. '{parameters.item1}',

Hoping to get

Column A     Column B    Column C
2022-05-05   abc         xyz

1 Answer

You need to use it as a Jinja template: {{ params.item1 }}
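Note that in Airflow, the `parameters` argument is forwarded to the database driver as bound query parameters, while the separate `params` argument is what gets injected into the Jinja context. A sketch of the corrected setup, assuming the same file names and values as in the question:

```sql
-- sample.sql: all placeholders use Jinja syntax
INSERT INTO db.table
VALUES (
    '{{ ds }}',
    '{{ params.item1 }}',
    '{{ params.item2 }}'
)
;
```

```python
from airflow.providers.databricks.operators.databricks_sql import DatabricksSqlOperator

# The values are passed via `params`, which Airflow injects into the
# Jinja context used to render sample.sql before execution.
task = DatabricksSqlOperator(
    task_id=task_id,
    sql_endpoint_name="my-serverless-endpoint",
    sql="sample.sql",
    params={
        "item1": "abc",
        "item2": "xyz",
    },
)
```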

DatabricksSqlOperator inherits from SQLExecuteQueryOperator, which parses the SQL that way, through Airflow's Jinja templating.
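That rendering step can be sketched directly with jinja2 (a library Airflow itself depends on); the `ds` and `params` values below mirror the question's example and are illustrative only:

```python
from jinja2 import Template

# The templated SQL, as it would appear in sample.sql.
sql = """INSERT INTO db.table
VALUES (
    '{{ ds }}',
    '{{ params.item1 }}',
    '{{ params.item2 }}'
)
;"""

# Airflow renders the file with built-in macros such as `ds` plus the
# task's `params` dict before sending the statement to the database.
rendered = Template(sql).render(ds="2022-05-05", params={"item1": "abc", "item2": "xyz"})
print(rendered)
```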
