
I'm using SqlBulkCopy to insert records into a SQL Server database. I have a requirement to save the data into three tables, where the inserts into the second and third tables depend on the primary keys generated by the insert into the first table. Since SqlBulkCopy inserts the whole dataset at once without looping, I have no way to get the primary key of each inserted record.

Is there any way to get the primary keys of the inserted rows so I can use them for the subsequent inserts into the other tables?

I'm converting the JSON input to a DataSet and inserting the resulting DataTable into the SQL Server table by mapping the columns with SqlBulkCopy, roughly as in the sketch below.
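
For reference, this is approximately what that setup looks like; the table name `dbo.Recipients`, the connection handling, and the Newtonsoft.Json conversion are assumptions added only to make the sketch self-contained:

```csharp
using System.Data;
using Microsoft.Data.SqlClient;
using Newtonsoft.Json;

class BulkInsertSketch
{
    static void Insert(string json, string connectionString)
    {
        // JSON array -> DataTable (column names come from the JSON property names)
        DataTable table = JsonConvert.DeserializeObject<DataTable>(json);

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var bulkCopy = new SqlBulkCopy(connection)
        {
            DestinationTableName = "dbo.Recipients"   // hypothetical destination table
        };

        // Map source DataTable columns to destination columns by name
        foreach (DataColumn column in table.Columns)
            bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);

        bulkCopy.WriteToServer(table);
        // SqlBulkCopy does not return the IDENTITY values it generated,
        // which is exactly the limitation this question is about.
    }
}
```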

  • The data you are inserting should already have something that identifies the row - whether that is a single column or multiple columns. Assuming you get back this 'primary key' from the first table - how are you relating that to the row(s) in the second and third tables? Commented May 27, 2023 at 16:14
  • You can't get data back from a bulk copy transfer. It might be easier to just pass in the JSON and parse it out using OPENJSON etc. (see the sketch after these comments). Please show tables and sample JSON. Commented May 28, 2023 at 0:57
  • Is there any other efficient way to bulk insert data into a SQL table, rather than using SqlBulkCopy, since getting the primary keys back becomes a complex problem with it? Commented May 28, 2023 at 7:22
  • Either use a table-valued parameter or INSERT ... SELECT FROM OPENROWSET(BULK ...). Why do you think it's so difficult to get back a primary key when using OPENJSON? Commented May 28, 2023 at 10:31
  • @Charlieface Here is my sample JSON: [{ "FirstName": "David", "LastName": "Szymanski", "RecipientSsn": "662-62-6311", "IsForeignAddress": "No", "Address1": "362 Main St" }]. I'm converting this to a DataTable and inserting it using bulk copy. For the primary key I have an IDENTITY column in my database. Commented May 28, 2023 at 16:43
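
Building on the OPENJSON suggestion in the comments, here is a minimal sketch of passing the raw JSON straight to SQL Server and letting an OUTPUT clause return the generated IDENTITY values. The table name `dbo.Recipients`, the `RecipientId` identity column, the column definitions, and the use of `RecipientSsn` as the key for matching rows back are all assumptions for illustration, not a confirmed schema:

```csharp
using Microsoft.Data.SqlClient;

class OpenJsonSketch
{
    // Insert from the JSON and capture the new identity values alongside a
    // natural key (RecipientSsn here) so follow-up inserts can join back to them.
    const string Sql = @"
DECLARE @inserted TABLE (RecipientId int, RecipientSsn varchar(11));

INSERT INTO dbo.Recipients (FirstName, LastName, RecipientSsn, Address1)
OUTPUT inserted.RecipientId, inserted.RecipientSsn INTO @inserted
SELECT FirstName, LastName, RecipientSsn, Address1
FROM OPENJSON(@json)
WITH (
    FirstName    varchar(100),
    LastName     varchar(100),
    RecipientSsn varchar(11),
    Address1     varchar(200)
);

SELECT RecipientId, RecipientSsn FROM @inserted;";

    static void InsertAndReadKeys(string json, string connectionString)
    {
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var command = new SqlCommand(Sql, connection);
        command.Parameters.Add("@json", System.Data.SqlDbType.NVarChar, -1).Value = json;

        using var reader = command.ExecuteReader();
        while (reader.Read())
        {
            int recipientId = reader.GetInt32(0);   // generated identity value
            string ssn = reader.GetString(1);       // key to match the original row
            // ... use recipientId to build the rows for the second and third tables
        }
    }
}
```

The same idea also works with a table-valued parameter instead of raw JSON; the essential part is doing the first insert as a single INSERT ... SELECT with an OUTPUT clause so the new keys come back in one round trip.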
