I am currently building an ingest pipeline in Azure Data Factory (ADF) to copy data from a Delta table to a PostgreSQL table. When I select the sink, I am asked to enable staging, with this message:
"Direct copying data from Azure Databricks Delta Lake is only supported when sink dataset is DelimitedText, Parquet or Avro with Azure Blob Storage linked service or Azure Data Lake Storage Gen2, for other dataset or linked service, please enable staging"
Enabling staging turns my pipeline into a two-step process: the Delta table data is first copied to a staging location, and from there it is inserted into PostgreSQL. How can I load the Delta table data directly into PostgreSQL with an ADF ingest pipeline, without staging? Is this possible?
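For reference, this is roughly the staged copy activity the UI is steering me toward (the dataset names, linked service name, and staging path below are placeholders, and I'm assuming an Azure Database for PostgreSQL sink):

```json
{
  "name": "CopyDeltaToPostgres",
  "type": "Copy",
  "inputs":  [ { "referenceName": "DeltaLakeDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "PostgresDataset",  "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "AzureDatabricksDeltaLakeSource" },
    "sink":   { "type": "AzurePostgreSqlSink" },

    "enableStaging": true,
    "stagingSettings": {
      "linkedServiceName": { "referenceName": "StagingBlobStorage", "type": "LinkedServiceReference" },
      "path": "staging-container/delta-to-postgres"
    }
  }
}
```

Ideally I would like to drop `enableStaging` and `stagingSettings` entirely and have the copy activity write straight from the Delta source to the PostgreSQL sink.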