
I need help. I've created a pipeline for data processing that imports a CSV file and copies the data to a DB. I've also configured a Blob storage trigger that fires the pipeline (with a dataflow) when a specific file is uploaded to a container. At the moment this trigger monitors one container, but I would like to make it more universal: monitor all containers in the desired Storage Account, so that whenever someone uploads files, the pipeline is triggered. For that I need to pass the container name to the pipeline, to be used in the data source file path. So far I've created something like this:

In the pipeline, I've added this parameter:

@pipeline().parameters.sourceFolder

[screenshot: pipeline parameter definition]
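For reference, a minimal sketch of what that parameter block might look like in the pipeline JSON (the pipeline name is hypothetical):

    {
        "name": "ProcessCsvPipeline",
        "properties": {
            "parameters": {
                "sourceFolder": { "type": "string" }
            },
            "activities": [ ... ]
        }
    }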

Next, in the trigger, I've set this:

[screenshot: trigger configuration]

Now, what should I set here to pass the folder path?

[screenshot: trigger parameter fields]

1 Comment
  • You can use dataset parameters for the same. Commented Jan 20, 2023 at 10:18

3 Answers


You need to use dataset parameters for this.

Like the folderPath parameter in the pipeline, create another pipeline parameter for the file name as well, and assign @triggerBody().folderPath and @triggerBody().fileName to them when creating the trigger.

Pipeline parameters:

[screenshot: pipeline parameters]
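In the pipeline JSON these appear as a parameters block; a minimal sketch (the names folderPath and fileName are assumed to match the trigger assignments below):

    "parameters": {
        "folderPath": { "type": "string" },
        "fileName": { "type": "string" }
    }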

Make sure you select all containers in the storage event trigger while creating it.

Assigning trigger parameters to pipeline parameters:

[screenshot: trigger parameter assignment]
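In trigger JSON terms, those assignments sit on the pipeline reference. A sketch with placeholder names (the scope is elided to placeholders, the .csv filter is only an example, and the container selection itself is made in the trigger UI):

    {
        "name": "StorageEventTrigger",
        "properties": {
            "type": "BlobEventsTrigger",
            "typeProperties": {
                "blobPathEndsWith": ".csv",
                "ignoreEmptyBlobs": true,
                "scope": "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
                "events": [ "Microsoft.Storage.BlobCreated" ]
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "ProcessCsvPipeline",
                        "type": "PipelineReference"
                    },
                    "parameters": {
                        "folderPath": "@triggerBody().folderPath",
                        "fileName": "@triggerBody().fileName"
                    }
                }
            ]
        }
    }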

Now, create two dataset parameters for the folder and file name like below.

Source dataset parameters:

[screenshot: source dataset parameters]

Use these in the dataset's file path via dynamic content.

[screenshot: dataset file path dynamic content]
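As a sketch, a parameterized DelimitedText dataset on blob storage might look like this (the linked service and container names are hypothetical; the two parameters fill the folder and file boxes):

    {
        "name": "SourceCsvDataset",
        "properties": {
            "type": "DelimitedText",
            "linkedServiceName": {
                "referenceName": "AzureBlobStorageLS",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "folderPath": { "type": "string" },
                "fileName": { "type": "string" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "mycontainer",
                    "folderPath": {
                        "value": "@dataset().folderPath",
                        "type": "Expression"
                    },
                    "fileName": {
                        "value": "@dataset().fileName",
                        "type": "Expression"
                    }
                }
            }
        }
    }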

If you use a copy activity with this dataset, then assign the pipeline parameter values (which we get from the trigger parameters) to the dataset parameters like below.

[screenshot: copy activity dataset parameter assignment]
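In the copy activity JSON, that assignment is roughly this fragment (dataset name as in the sketch above):

    "inputs": [
        {
            "referenceName": "SourceCsvDataset",
            "type": "DatasetReference",
            "parameters": {
                "folderPath": "@pipeline().parameters.folderPath",
                "fileName": "@pipeline().parameters.fileName"
            }
        }
    ]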

If you use dataflows with the dataset, you can assign these in the dataflow activity itself, like below, after setting the dataset as the source in the dataflow.

[screenshot: dataflow activity settings]
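In the Execute Data Flow activity JSON, the same assignment goes through the data flow reference's datasetParameters object, keyed by the source name; a sketch (the dataflow and source names are assumptions):

    "typeProperties": {
        "dataFlow": {
            "referenceName": "dataflow1",
            "type": "DataFlowReference",
            "datasetParameters": {
                "source1": {
                    "folderPath": "@pipeline().parameters.folderPath",
                    "fileName": "@pipeline().parameters.fileName"
                }
            }
        }
    }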


3 Comments

Thanks, that works. However, now the pipeline ends with this error: source1': abfss://container/[email protected]/ has invalid authority
folderPath gives container/folder and fileName gives mycsv.csv, which means the file path is container/folder/mycsv.csv. Can you share the error screenshot, the dataset, and the copy activity dataset parameters assignment in ADF?
Hello Rakesh. I see my mistake. From @dataset().folder in the dataset configuration for the file path (or from @pipeline().parameters.nameFolder in the pipeline configuration) I need to extract just the container name and the folder path. So, e.g., in the dataset configuration I need to extract the container name from @dataset().folder and set it in the first box, then extract the folder path from the same @dataset().folder and put it into the second box. But I need guidance on how to use those functions, because I don't know what to do now. Maybe substring will help, but the folder path will change.
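To address the split asked about in the last comment: ADF's expression language has split, first, skip, and join functions that can separate the container from the rest of the path, whatever its depth. A sketch, assuming the pipeline parameter is named folderPath:

    Container:   @first(split(pipeline().parameters.folderPath, '/'))
    Folder path: @join(skip(split(pipeline().parameters.folderPath, '/'), 1), '/')

For folderPath = container/folder/subfolder, the first expression yields container and the second yields folder/subfolder.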

Thank you Rakesh

I need to process a few specific files from a package that will be sent to a container. Each time, the user/application will send the same set of files, so in the trigger I'm checking whether a new drive.xml file was sent to any container. This file defines the type of the data that was sent, so when it arrives I know that new data files have been sent as well and that they will be present in a lower folder.

For example, if drive.xml was found at /container/data/somefolder/2022-01-22/drive.xml, then I know that the 3 files I need to process are located in /container/data/somefolder/2022-01-22/datafiles/.

Therefore, in the parameters I only need to pass the folder path; the file names will always be the same.
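A sketch of how the datafiles folder could be derived from the trigger's folder path with concat (the parameter name folderPath is an assumption):

    @concat(pipeline().parameters.folderPath, '/datafiles')

With folderPath = container/data/somefolder/2022-01-22, this yields container/data/somefolder/2022-01-22/datafiles.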

The Dataset configuration looks like this:

[screenshots: dataset configuration]

And the event trigger like this:

[screenshot: event trigger configuration]
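In trigger JSON terms, the drive.xml filter would be roughly this fragment (scope and container selection omitted):

    "typeProperties": {
        "blobPathEndsWith": "/drive.xml",
        "events": [ "Microsoft.Storage.BlobCreated" ]
    }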



I had to add the parameters directly to the JSON file for the storage trigger. FYI: if anyone can't find these options, enter them manually. An example JSON trigger file with parameters is at the bottom of this article:

https://learn.microsoft.com/en-us/azure/data-factory/how-to-use-trigger-parameterization
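Per that article, the parameters go under the trigger's pipelines section; a minimal hand-edited sketch (names are placeholders):

    "pipelines": [
        {
            "pipelineReference": {
                "referenceName": "pipeline1",
                "type": "PipelineReference"
            },
            "parameters": {
                "folderPath": "@triggerBody().folderPath",
                "fileName": "@triggerBody().fileName"
            }
        }
    ]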

1 Comment

As it’s currently written, your answer is unclear. Please edit to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers in the help center.
