
I am using a Data Flow in my Azure Data Factory pipeline to copy data from one Cosmos DB collection to another. I am using the Cosmos DB SQL API for both the source and sink datasets.

The problem is that, when copying the documents from one collection to the other, I would like to add an additional column whose value is the same as one of the existing keys in the JSON. I am trying the Additional columns option in the source settings, but I cannot figure out how to reference an existing column's value there. Can anyone help with this?

1 Answer


In the case of a Copy activity, you can assign an existing column's value to a new column under Additional columns by setting the value to $$COLUMN and specifying the name of the source column to be duplicated.
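As a sketch, the resulting Copy activity source settings in the pipeline JSON might look like the following; the names `existingKey` and `duplicatedKey` are placeholders, not taken from the original post:

```json
{
    "type": "Copy",
    "typeProperties": {
        "source": {
            "type": "CosmosDbSqlApiSource",
            "additionalColumns": [
                {
                    "name": "duplicatedKey",
                    "value": "$$COLUMN:existingKey"
                }
            ]
        },
        "sink": {
            "type": "CosmosDbSqlApiSink"
        }
    }
}
```

The `$$COLUMN:<source column>` dynamic content tells the Copy activity to duplicate the named source column into the new additional column.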


If you are adding the new column in a Data Flow instead, you can achieve this with a Derived Column transformation.


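A minimal sketch of the equivalent mapping Data Flow script, assuming a source column named `existingKey` that should be copied into a new column `duplicatedKey` (the stream names are placeholders):

```
source(output(
        existingKey as string
    ),
    allowSchemaDrift: true) ~> cosmosSource
cosmosSource derive(duplicatedKey = existingKey) ~> addColumn
addColumn sink(allowSchemaDrift: true) ~> cosmosSink
```

The `derive` step simply maps the existing column's value into the new column name; the rest of the document passes through unchanged.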


5 Comments

I did this, but the copy activity is super slow and I get a timeout error. What could be the reason?
Try increasing the writeBatchSize; this improves performance.
What value should I put? Right now it's blank.
And if I use a Data Flow it's super fast, but I can't use a Data Flow because it changes my schema in the output.
That depends on your data; this link might help.
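For reference, `writeBatchSize` is set on the Cosmos DB sink of the Copy activity. A sketch of the sink settings in the pipeline JSON, assuming the value 10000 (shown only as an illustration; tune it for your document size and throughput):

```json
"sink": {
    "type": "CosmosDbSqlApiSink",
    "writeBatchSize": 10000,
    "writeBehavior": "insert"
}
```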
