
I am working with code inside a PostgreSQL PL/pgSQL function. I have a MERGE statement with multiple elements that are only known at runtime and are supplied to the function as parameters, which requires me to build the merge as a dynamic SQL statement that I EXECUTE. These dynamic elements include the target table for the merge and the source data, which I have as an array of composite records.

How can I pass my array of composite records into the execute statement so that the merge works correctly? I know how to work with data values, column names, and table names in dynamic statements, but I haven't cracked how to pass in tables or arrays of values.

I can't seem to figure out how to use unnest to do it. A local temporary table doesn't seem to have scope within the execute transaction. A regular table, referred to directly in the merge, would create contention and problems with parallel execution of the function.

CREATE TEMPORARY TABLE bulk_data_table OF dbo."compositerecordtype";
INSERT INTO bulk_data_table (cola, colb, colc)
SELECT BM.cola, BM.colb, BM.colc
FROM UNNEST("user_bulk_data"::dbo."compositerecordtype"[]) AS BM;

dynamic_sqlStatement :=
'MERGE INTO ' || "TableName" || ' AS TRG
USING (SELECT cola, colb, colc FROM ' || bulk_data_table || ') AS SRC
ON (TRG.id = SRC.id)
WHEN NOT MATCHED THEN
    INSERT (cola, colb, colc) VALUES (SRC.cola, SRC.colb, SRC.colc)
WHEN MATCHED THEN
    UPDATE SET colb = SRC.colb, colc = SRC.colc;';

EXECUTE dynamic_sqlStatement;
  • 1) Update the question to include the complete function code. 2) This ... FROM ' || bulk_data_table || ' AS SRC ... is not going to work. You will either need to include the table name as a string or, better yet, use format() as shown in Dynamic commands. Commented Oct 2, 2024 at 16:08
  • I understand that what is posted doesn't work. I tried unnest(), format, and "USING" variations without success. The code does work with a regular schema table and not a temporary table. Commented Oct 2, 2024 at 16:15
  • You have not done what was asked in 1). You have also not defined what "without success" means. In other words: a) Do you get errors, and if so, what are they? Or b) does the command complete but produce an incorrect result, and if so, what is expected and what is actually output? Commented Oct 2, 2024 at 16:22
  • I appreciate the attention to this, but the rest of the function is not relevant in any way. This is a single-statement problem: a dynamic merge statement. The problem is how to dynamically specify a target table and provide a "FROM" table of data that cannot be shared with other sessions. Surely this isn't some type of unique requirement. Commented Oct 2, 2024 at 16:30
  • "A local temporary table doesn't seem to have scope within the execute transaction" - that's not true; your temp table is visible for the whole of your current database connection. Where should TableName and user_bulk_data come from, and what is their structure? Commented Oct 2, 2024 at 16:47

1 Answer

  1. Don't pass the target "TableName" into the function at all. Instead, have the caller cast the incoming array to the target table's row type.
  2. Inside the function, look up the target table in pg_type. Every regular type has an array version listed there, so based on pg_typeof() of the incoming array you can look up its element type (typelem), which is the target table.
  3. The function can accept anyarray, so you don't need to define multiple versions for different targets.
  4. Use a positional placeholder %1$I to splice the target table name into the query via format().
  5. Use $1 in the query to pass the incoming payload via EXECUTE ... USING rather than interpolating it into the statement text via format(). There's also no need to write the data anywhere, even temporarily - just hand it to the merge directly and let unnest() expand the argument.
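The lookup in step 2 can be sketched in isolation (a hedged fragment, assuming the dbo."compositerecordtype" table from the demo below exists and a live session to run it in):

```sql
-- Given any value of an array type, pg_type.typelem points back at the
-- element type; for an array of table row types, that's the table itself.
select typelem::regtype as target_table
from pg_type
where oid = pg_typeof(array[]::dbo."compositerecordtype"[])::oid;
-- typelem is the OID of the element type; cast to regtype it prints as a
-- (search_path-dependent) table name such as dbo."compositerecordtype".
```

The empty array literal here only stands in for the function's payload parameter; inside the function, pg_typeof(payload) plays the same role.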

demo at db<>fiddle

create or replace function merge_payload_into_table(payload anyarray)
returns void as $f$
declare
  -- Resolve the element type of the incoming array: that type is the target table.
  base_type text := (select typelem::regtype
                     from pg_type
                     where oid = pg_typeof(payload)::regtype);
begin
  execute format(
    $dsql$
    MERGE INTO %1$I AS TRG
    USING (SELECT id, cola, colb, colc FROM unnest($1::%1$I[])) AS SRC
    ON (TRG.id = SRC.id)
    WHEN NOT MATCHED THEN
      INSERT (cola, colb, colc) VALUES (SRC.cola, SRC.colb, SRC.colc)
    WHEN MATCHED THEN
      UPDATE SET colb = SRC.colb, colc = SRC.colc;
    $dsql$, base_type)  -- %1$I is filled with the table name, safely quoted
  using payload;        -- the array itself travels as $1, never as text
end $f$ language plpgsql;
  1. You can look up columns in information_schema.columns and construct the column list in your merge dynamically, leaving static only the id you're matching on. If you want to go fully dynamic, you can look up primary-key columns or unique column combinations in pg_index and construct the merge..on condition dynamically as well.
  2. Schema qualification could be improved - ideally, you should look up pg_type.typnamespace as well.
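Both improvements can be sketched from the system catalogs (a hedged, untested fragment; the catalog names are standard, but the exact query shape is one possible approach, and dbo.compositerecordtype is assumed from the demo below):

```sql
-- Derive a schema-qualified target name and its column list from the catalogs:
-- pg_type.typnamespace gives the schema, and the composite type's typrelid
-- links to pg_attribute for the live (non-dropped) columns.
select format('%I.%I', n.nspname, t.typname)                          as qualified_target,
       string_agg(quote_ident(a.attname), ', ' order by a.attnum)     as column_list
from pg_type t
join pg_namespace n on n.oid = t.typnamespace
join pg_attribute a on a.attrelid = t.typrelid
                   and a.attnum > 0
                   and not a.attisdropped
where t.oid = 'dbo.compositerecordtype'::regtype
group by n.nspname, t.typname;
```

The qualified_target string could then replace the bare %1$I in the merge, making the function independent of search_path.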

create schema dbo;
create table dbo."compositerecordtype"(
   id int generated by default as identity primary key
  ,cola int
  ,colb int
  ,colc int);
set search_path to dbo,public;

insert into "compositerecordtype"(cola, colb, colc)
values (1, 2, 3)
returning *;
 id | cola | colb | colc
----+------+------+------
  1 |    1 |    2 |    3
select merge_payload_into_table(array[ row(1,101,102,103)
                                      ,row(2,201,202,203)
                                      ,row(3,301,302,303)]::compositerecordtype[] );
select * from compositerecordtype;
 id | cola | colb | colc
----+------+------+------
  1 |    1 |  102 |  103
  2 |  201 |  202 |  203
  3 |  301 |  302 |  303