I am new to Azure Data Lakes. I am currently using ADLS Gen2. I have configured pipelines from single source datasets to destination datasets, and this works fine. I then configured a pipeline to ingest an entire database from on-prem SQL Server to ADLS Gen2.
Here is the process: Data Factory Studio -> Ingest -> Built-in copy task -> SQL Server as the data store, with a configured integration runtime -> Select all (existing) tables -> and so on.
When I trigger this activity from the pipeline, it successfully uploads all tables to containers as separate files (named after the source tables).
When I update data in one of the source tables, it successfully updates the data in the destination file.
The problem is that when I add a new table to the database and then trigger the activity, this new table is not uploaded. Is there a way to update the source dataset to include this new table?
I have looked through all the properties of the source dataset and of the activity in the pipeline, and I have searched for a solution, but I am stuck on this scenario.
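For illustration, what I think I need is something like the common Lookup + ForEach pattern, where the table list is queried from the database at run time instead of being fixed in the dataset. The sketch below is an assumption, not my working configuration: the dataset names (SqlServerGenericDataset, SqlServerTableDataset, AdlsCsvDataset) and parameters are placeholders, and the exact JSON may differ from what the portal generates.

```json
{
  "name": "CopyAllTablesDynamic",
  "properties": {
    "activities": [
      {
        "name": "ListTables",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
          },
          "dataset": {
            "referenceName": "SqlServerGenericDataset",
            "type": "DatasetReference"
          },
          "firstRowOnly": false
        }
      },
      {
        "name": "CopyEachTable",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "ListTables", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('ListTables').output.value",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "CopyTableToADLS",
              "type": "Copy",
              "inputs": [
                {
                  "referenceName": "SqlServerTableDataset",
                  "type": "DatasetReference",
                  "parameters": {
                    "schemaName": { "value": "@item().TABLE_SCHEMA", "type": "Expression" },
                    "tableName": { "value": "@item().TABLE_NAME", "type": "Expression" }
                  }
                }
              ],
              "outputs": [
                {
                  "referenceName": "AdlsCsvDataset",
                  "type": "DatasetReference",
                  "parameters": {
                    "fileName": { "value": "@concat(item().TABLE_NAME, '.csv')", "type": "Expression" }
                  }
                }
              ],
              "typeProperties": {
                "source": { "type": "SqlServerSource" },
                "sink": { "type": "DelimitedTextSink" }
              }
            }
          ]
        }
      }
    ]
  }
}
```

If something like this is the right direction, a newly added table would be picked up automatically on the next run because the Lookup re-queries INFORMATION_SCHEMA.TABLES each time.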

Regarding "when I update data in one of the tables in sources it successfully updates the data in destination file": when you update the tables, are you running the pipeline again?