I have an Airflow DAG that calls a Databricks job with a task-level parameter defined as `job_run_id` (`job.run_id`) and a task type of `python_script`. When I try to access the parameters via `sys.argv` in the `spark_python_task`, it only prints the JSON that was passed through from the Airflow job. I want `sys.argv` to pick up both the parameters passed by the DAG and the ones defined on the Databricks job.
We have a use case where we don't want to use anything related to `dbutils`. It's a plain Python script, so we want it to stay independent of `dbutils`.
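For reference, the script on the Databricks side currently just dumps its arguments, roughly like this minimal sketch (the file name and structure are illustrative, not my exact code):

```python
import sys

# main.py - entry point of the Databricks python_script task.
# Simply prints whatever arrives on the command line.
if __name__ == "__main__":
    for i, arg in enumerate(sys.argv):
        print(f"sys.argv[{i}] = {arg}")
```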
If I don't pass any parameters from the DAG, i.e.

`{ "spark_python_task": [] }`

then the job run ID does show up in `sys.argv`. But when I pass a JSON-formatted parameter from the Airflow DAG, i.e.

`{ "spark_python_task": [{any test json}] }`

then `sys.argv` only contains that JSON and no job run ID.
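To illustrate the trigger side, here is a simplified sketch of how the DAG runs the job, assuming `DatabricksRunNowOperator` with `python_params` (the DAG id, connection id, job id, and payload values are placeholders and may not match my exact setup):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="trigger_databricks_python_job",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    run_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",  # placeholder connection id
        job_id=12345,                             # placeholder job id
        # python_params is forwarded to the python_script task as sys.argv;
        # when it is set, only these values arrive and the task-level
        # job_run_id parameter no longer shows up (the behaviour described above).
        python_params=['{"some_key": "some_value"}'],
    )
```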
I want the Python script to print both the task-level parameters and the parameters passed by the Airflow DAG.