I am working with code inside a PostgreSQL PL/pgSQL function. I have a MERGE statement with multiple elements that are only known at runtime and are supplied to the function as parameters, which requires me to build the merge statement as a dynamic SQL statement that I execute. These dynamic elements include the target table for the merge and the source data, which I have as an array of composite records.
How can I pass my array of composite records into the execute statement so that the merge works correctly? I know how to work with data values, column names, and table names in dynamic statements, but I haven't cracked how to pass in tables or arrays of values.
I can't seem to figure out how to use unnest to do it. A local temporary table doesn't seem to have scope within the execute transaction. A regular table, referred to directly in the merge, would create contention and problems with parallel execution of the function.
CREATE TEMPORARY TABLE bulk_data_table OF dbo."compositerecordtype";
INSERT INTO bulk_data_table (cola, colb, colc) SELECT BM.cola, BM.colb, BM.colc FROM UNNEST("user_bulk_data"::dbo."compositerecordtype"[]) AS BM;
dynamic_sqlStatement :=
'MERGE INTO ' || "TableName" || ' AS TRG
USING (SELECT cola, colb, colc FROM ' || bulk_data_table || ' AS SRC
ON (TRG.id = SRC.id)
WHEN NOT MATCHED THEN
INSERT (cola, colb, colc) VALUES (SRC.cola, SRC.colb, SRC.colc)
WHEN MATCHED THEN
UPDATE SET colb = SRC.colb, colc = SRC.colc;';
EXECUTE dynamic_sqlStatement;
... FROM ' || bulk_data_table || ' AS SRC ... is not going to work. You will either need to include the table name as a string or, better yet, use format as shown here: Dynamic commands.

"A local temporary table doesn't seem to have scope within the execute transaction" is not true: your temp table is visible for the whole of your current database connection, not just one transaction. Where should TableName and user_bulk_data come from, and what is their structure?
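Building on the format suggestion above, here is a minimal sketch of one way this could look, assuming "TableName" is a text parameter holding an unqualified table name and user_bulk_data is declared as dbo."compositerecordtype"[] with columns id, cola, colb, colc. The array is bound as a parameter via EXECUTE ... USING, so no temp table and no rendering of data values into the SQL string is needed:

```sql
-- Sketch only: assumes "TableName" is text and user_bulk_data is
-- declared as dbo."compositerecordtype"[] with columns id, cola, colb, colc.
dynamic_sqlStatement := format(
    'MERGE INTO %I AS TRG
     USING (SELECT * FROM unnest($1)) AS SRC
     ON (TRG.id = SRC.id)
     WHEN NOT MATCHED THEN
         INSERT (cola, colb, colc) VALUES (SRC.cola, SRC.colb, SRC.colc)
     WHEN MATCHED THEN
         UPDATE SET colb = SRC.colb, colc = SRC.colc',
    "TableName");

-- $1 is bound to the array itself, so its rows never have to be
-- quoted or concatenated into the statement text.
EXECUTE dynamic_sqlStatement USING user_bulk_data;
```

The %I placeholder quotes the table name as an identifier, while the data travels as a single bound array parameter. Note that MERGE requires PostgreSQL 15 or later.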