
Is there any way to parameterize an SQL INSERT statement (in C#) that inserts multiple rows? Currently I can think of only one way, generating a statement that inserts multiple rows, but it is quite open to SQL injection:

string sql = " INSERT INTO  my_table"
           + " (a, b, c)"
           + " VALUES";

// Add each row of values to the statement
foreach (var item in collection) {
    sql = sql
        + String.Format(" ({0}, {1}, {2}),",
              aVal, bVal, cVal);
}

// Remove the trailing comma
sql = sql.Remove(sql.Length - 1);

What is the smarter/safer way to do this?

  • You can refer to this question (in which the OP is correctly using parametrization) and the corresponding answer. Commented Aug 8, 2017 at 8:11
  • Have a look at the SqlBulkCopy class: msdn.microsoft.com/en-us/library/…, it provides an efficient way of bulk loading data into sql. Commented Aug 8, 2017 at 8:13
  • @RigertaDemiri I cannot see how I would implement this in my exact case. I use parameterization elsewhere, but only when I work with single-row inserts. How do I use it in a multi-row insert? Commented Aug 8, 2017 at 8:15
  • @Riv unfortunately BulkCopy is not an option, as the rows are generated inside my program. Then I would have to save them in a file first, and then load them from there. But that hardly seems like the smartest way? Commented Aug 8, 2017 at 8:17
  • @Noceo, you don't have to save to a file first; you can create a DataTable for your data and use it as the source. The WriteToServer method of SqlBulkCopy has multiple overloads. Commented Aug 8, 2017 at 8:22
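
Following up on the SqlBulkCopy suggestion in the comments: a minimal sketch, assuming the generated rows are first copied into an in-memory DataTable (table and column names mirror the question; aVal, bVal and cVal stand for the values taken from each item):

// using System.Data; using System.Data.SqlClient;
var table = new DataTable();
table.Columns.Add("a", typeof(int));
table.Columns.Add("b", typeof(string));
table.Columns.Add("c", typeof(int));

// Fill the DataTable from the in-memory collection
foreach (var item in collection)
{
    table.Rows.Add(aVal, bVal, cVal);
}

using (var con = new SqlConnection("connectionString here"))
using (var bulk = new SqlBulkCopy(con))
{
    con.Open();
    bulk.DestinationTableName = "my_table";
    bulk.ColumnMappings.Add("a", "a");
    bulk.ColumnMappings.Add("b", "b");
    bulk.ColumnMappings.Add("c", "c");
    bulk.WriteToServer(table); // one bulk load instead of many INSERTs
}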

1 Answer


You could add parameters inside the loop, like:

using (var comm = new SqlCommand())
{
    var sql = "INSERT INTO my_table (a, b, c) VALUES";
    var counter = 0;
    foreach (var item in collection)
    {
        sql = sql + String.Format(" (@a{0}, @b{0}, @c{0}),", counter);

        comm.Parameters.AddWithValue("@a" + counter, aVal);
        comm.Parameters.AddWithValue("@b" + counter, bVal);
        comm.Parameters.AddWithValue("@c" + counter, cVal);
        counter++;
    }
    comm.CommandText = sql.Remove(sql.Length - 1); // drop the trailing comma
}

But I really wouldn't do a multi-row insert like this. IIRC the maximum number of parameters in a query is about 2100, and this could get very big very fast. As you're looping through a collection anyway, you could just send it to the database in your loop, something like:

using (var con = new SqlConnection("connectionString here"))
{
    con.Open();
    var sql = "INSERT INTO my_table (a, b, c) VALUES (@a, @b, @c);";

    using (var comm = new SqlCommand(sql, con))
    {
        comm.Parameters.Add("@a", SqlDbType.Int);
        comm.Parameters.Add("@b", SqlDbType.NVarChar);
        comm.Parameters.Add("@c", SqlDbType.Int);

        foreach (var item in collection)
        {
            comm.Parameters["@a"].Value = aVal;
            comm.Parameters["@b"].Value = bVal;
            comm.Parameters["@b"].Size = bVal.Length;
            comm.Parameters["@c"].Value = cVal;

            comm.ExecuteNonQuery();
        }
    }
}

The statement is prepared only once (and is faster than a huge statement with hundreds of parameters), and one failing record doesn't fail them all (add some exception handling for that). If you do want everything to fail when one record fails, you can wrap the whole thing in a transaction, as sketched below.
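
A minimal sketch of the transactional variant, assuming the same connection string, table and parameter names as above:

// using System.Data; using System.Data.SqlClient;
using (var con = new SqlConnection("connectionString here"))
{
    con.Open();
    using (var tran = con.BeginTransaction())
    using (var comm = new SqlCommand(
        "INSERT INTO my_table (a, b, c) VALUES (@a, @b, @c);", con, tran))
    {
        comm.Parameters.Add("@a", SqlDbType.Int);
        comm.Parameters.Add("@b", SqlDbType.NVarChar);
        comm.Parameters.Add("@c", SqlDbType.Int);
        try
        {
            foreach (var item in collection)
            {
                comm.Parameters["@a"].Value = aVal;
                comm.Parameters["@b"].Value = bVal;
                comm.Parameters["@c"].Value = cVal;
                comm.ExecuteNonQuery();
            }
            tran.Commit();   // all rows succeeded
        }
        catch
        {
            tran.Rollback(); // any failure undoes the whole batch
            throw;
        }
    }
}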

Edit: Of course, when you regularly have to insert thousands of rows, this approach isn't the most efficient either, and your DBA might start to complain. There are other approaches that take the strain off the database: for example, create a stored procedure in your database that inserts the data from an XML document, or use Table-Valued Parameters. NYCdotNet wrote two nice blog posts about these options, which I won't recreate here, but they're worth exploring (I'll paste some code below from the blog, as per guidelines, but credit where it's due: NYCdotNet): XML document approach, Table Valued Parameters.
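
To give an idea of the XML route, here is a rough sketch (the procedure, attribute and column names here are illustrative, not from the blog): all rows are sent as one xml parameter and shredded server-side with nodes()/value().

 CREATE PROC dbo.DoXmlDocumentInsert(@Rows XML)
 AS BEGIN
     INSERT INTO my_table (a, b, c)
     SELECT r.value('@a', 'INT'),
            r.value('@b', 'NVARCHAR(100)'),
            r.value('@c', 'INT')
     FROM @Rows.nodes('/rows/row') AS T(r);
 END

From C# you would then call the procedure once, passing a single parameter of type SqlDbType.Xml containing something like <rows><row a="1" b="x" c="2"/>...</rows>.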

The "meat" from the blog about TVP (in VB.NET but that shouldn't matter):

So I created this "generic" table-valued type:

 CREATE TYPE dbo.UniqueIntegerList AS TABLE
 (
     TheInteger INT NOT NULL
     PRIMARY KEY (TheInteger)
 );

Creating the Save Stored Procedure

Next, I created a new stored procedure which would accept my new Table-Valued Type as a parameter.

 CREATE PROC DoTableValuedParameterInsert(@ProductIDs dbo.UniqueIntegerList READONLY)
 AS BEGIN

     INSERT INTO ProductsAccess(ProductID)
     SELECT TheInteger AS [ProductID]
     FROM @ProductIDs;

 END

In this procedure, I am passing in a parameter called @ProductIDs. This is of type "dbo.UniqueIntegerList" which I just created in the previous step. SQL Server looks at this and says "oh I know what this is - this type is actually a table". Since it knows that the UniqueIntegerList type is a table, I can select from it just like I could select from any other table-valued variable. You have to mark the parameter as READONLY because SQL 2008 doesn't support updating and returning a passed table-valued parameter.

Creating the Save Routine

Then I had to create a new save routine on my business object that would call the new stored procedure. The way you prepare the Table-Valued parameter is to create a DataTable object with the same column signature as the Table-Valued type, populate it, and then pass it inside a SqlParameter object as SqlDbType.Structured.

 Public Sub SaveViaTableValuedParameter()

   'Prepare the Table-valued Parameter'
   Dim objUniqueIntegerList As New DataTable
   Dim objColumn As DataColumn = objUniqueIntegerList.Columns.Add("TheInteger", _
       System.Type.GetType("System.Int32"))
   objColumn.Unique = True

   'Populate the Table-valued Parameter with the data to save'
   For Each Item As Product In Me.Values
     objUniqueIntegerList.Rows.Add(Item.ProductID)
   Next

   'Connect to the DB and save it.'
   Using objConn As New SqlConnection(DBConnectionString())
     objConn.Open()
     Using objCmd As New SqlCommand("dbo.DoTableValuedParameterInsert")
       objCmd.CommandType = CommandType.StoredProcedure
       objCmd.Connection = objConn
       objCmd.Parameters.Add("ProductIDs", SqlDbType.Structured)
       objCmd.Parameters(0).Value = objUniqueIntegerList
       objCmd.ExecuteNonQuery()
     End Using
     objConn.Close()
   End Using

 End Sub
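
Since the question is in C#, roughly the same TVP call there would look like this (a sketch, reusing the blog's type and procedure names; item.ProductID is a hypothetical per-item value):

// using System.Data; using System.Data.SqlClient;
var productIds = new DataTable();
productIds.Columns.Add("TheInteger", typeof(int));
foreach (var item in collection)
{
    productIds.Rows.Add(item.ProductID); // hypothetical per-item value
}

using (var con = new SqlConnection("connectionString here"))
using (var comm = new SqlCommand("dbo.DoTableValuedParameterInsert", con))
{
    comm.CommandType = CommandType.StoredProcedure;
    var p = comm.Parameters.Add("@ProductIDs", SqlDbType.Structured);
    p.TypeName = "dbo.UniqueIntegerList"; // the table type created above
    p.Value = productIds;

    con.Open();
    comm.ExecuteNonQuery(); // one round trip for the whole batch
}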

9 Comments

That is an alternative suggestion, but maybe not a bad idea. I am not 100% sure how this would affect performance, but most likely it isn't an issue.
Added a variant for multiple records at once. I still wouldn't do that though :)
One question: If I use try, catch and finally, in where I have con.Close() in the finally statement. On what level do I then have to run con.Open()?
Depends on what you want to achieve, but you could use a try-catch block inside the loop, so the loop continues when one record fails.
This is actually exactly what we are trying to get away from. Our DBAs are complaining about the strain we were putting on our database server by doing tens of thousands of small queries in a loop rather than doing some kind of set based process.
