
From the link below, I understand that we can create and run a pipeline using the Python SDK.

https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-python

But when I pass data flow parameters in the `parameters={}` argument of the `create_run` function, it does not work. (It looks like `create_run` accepts pipeline parameters, not data flow parameters.)

Any example or details would help!

  • Can you show us your code? Commented Feb 9, 2021 at 1:27
  • Commented Feb 16, 2021 at 22:57

    ```python
    credentials = UserPassCredentials('userid', 'pwd')
    adf_client = DataFactoryManagementClient(credentials, subscription_id)
    df = adf_client.factories.get(rg_name, df_name)
    p_name = 'DATA_STAGE'
    run_response = adf_client.pipelines.create_run(
        rg_name, df_name, p_name,
        parameters={
            "Table_name": "FILE_INFO",
            "Target_Query": "select * from schema1.FILE_INFO where cast(adf_create_date as date) in ('2020-12-17','2020-12-16') ",
            "Application_name": "ACAL",
        },
    )
    run_response.run_id
    pipeline_run = adf_client.pipeline_runs.get(rg_name, df_name, run_response.run_id)
    print(pipeline_run.status)
    ```

1 Answer


As far as I know, there is no direct way to do this. As a workaround, you can create pipeline parameters and then pass them to the Data Flow. Something like this:

[Screenshot: pipeline parameters mapped to the Data Flow activity's parameters in the pipeline designer.]
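With this workaround, the `create_run` call from the question stays the same (it passes pipeline parameters); the wiring to the data flow happens in the pipeline definition, where each data flow parameter is bound to a pipeline parameter via an expression. A minimal sketch of what that Execute Data Flow activity fragment looks like, assuming a data flow named `MyDataFlow` (hypothetical) and the parameter names from the question; the exact JSON shape and expression quoting can vary with the parameter type and API version:

```python
import json

# Pipeline parameters supplied at run time via create_run(..., parameters=...).
# Names come from the question; values are illustrative.
run_parameters = {
    "Table_name": "FILE_INFO",
    "Application_name": "ACAL",
}

# Illustrative fragment of an Execute Data Flow activity: each data flow
# parameter is bound to the pipeline parameter of the same name through an
# expression. This is a sketch, not a complete pipeline definition.
dataflow_activity = {
    "name": "RunDataFlow",
    "type": "ExecuteDataFlow",
    "typeProperties": {
        "dataflow": {
            "referenceName": "MyDataFlow",  # hypothetical data flow name
            "type": "DataFlowReference",
            "parameters": {
                name: {"value": f"@pipeline().parameters.{name}", "type": "Expression"}
                for name in run_parameters
            },
        }
    },
}

print(json.dumps(dataflow_activity, indent=2))
```

In the ADF authoring UI, the same binding is done on the Execute Data Flow activity's Parameters tab, where each data flow parameter gets a pipeline expression as its value.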


1 Comment

Thanks Steve. Yes, that worked. I created a pipeline parameter and passed it to the data flow.
