
Let's say I have a dataset in an ASP.NET website (.NET 3.5) with 5 tables, each with roughly 30,000 rows and an average of 12 columns. I want to insert all of the data from the dataset into 5 very-similar-but-not-quite-identical tables in SQL Server 2008. I also want to use LINQ (personal preference - trying to learn something new).

Is it as simple as iterating through the dataset and, for each row, creating a new instance of the associated class, initializing its data with the dataset's row, adding it to the data model, and then doing one giant SubmitChanges at the end?

Are there better ways of doing this with LINQ? Or is this the de facto standard?

  • I'm primarily interested in LINQ to SQL, but I didn't want to keep people from referencing LINQ to Datasets. When all is said and done, I'm merely seeking best practices and/or plumbing shortcuts that experienced people might be aware of when it comes to working with datasets, SQL, and the various flavors of LINQ. Commented Apr 19, 2010 at 13:33

3 Answers


Creating objects and inserting them is fine. But to avoid a gigantic commit at the end, you might want to perform a SubmitChanges() every 100 rows or so.
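For illustration, here's a rough sketch of the batching, assuming a hypothetical MyDataContext and Customer entity generated by LINQ to SQL (the table and column names are placeholders, not your actual schema):

    using System.Data;
    using System.Data.Linq;

    public static class ImportHelpers
    {
        // Rough sketch only: MyDataContext, Customer, and the table/column names
        // are placeholders for your own generated context and entity classes.
        public static void ImportCustomers(DataSet source)
        {
            using (var db = new MyDataContext())
            {
                int pending = 0;
                foreach (DataRow row in source.Tables["Customers"].Rows)
                {
                    db.Customers.InsertOnSubmit(new Customer
                    {
                        Name  = (string)row["Name"],
                        Email = (string)row["Email"]
                    });

                    // Flush every 100 rows instead of one giant commit at the end.
                    if (++pending % 100 == 0)
                        db.SubmitChanges();
                }

                db.SubmitChanges();   // flush whatever is left over
            }
        }
    }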

Alternatively, you could get a copy of Red Gate's "SQL Data Compare" utility if you have the cash. Then you never have to write one of these things again. :-)

Edit 2010-04-19: If you want to use a transaction, I think you should still use my approach instead of a single SubmitChanges(). In this case you'll want to explicitly manage your own transaction in L2S (see http://msdn.microsoft.com/en-us/library/bb386995.aspx). Run your queries in a try/catch and roll back the transaction if you get any failures.
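Something along these lines, again using the hypothetical MyDataContext from the sketch above:

    using (var db = new MyDataContext())
    {
        db.Connection.Open();
        db.Transaction = db.Connection.BeginTransaction();
        try
        {
            // ... InsertOnSubmit(...) plus a SubmitChanges() every 100 rows, as above ...
            db.SubmitChanges();        // final flush
            db.Transaction.Commit();   // everything succeeded
        }
        catch
        {
            db.Transaction.Rollback(); // undo all the batches on any failure
            throw;
        }
    }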

Two last bits of advice:

  1. Make sure your ASP.NET timeout is set high enough.
  2. Consider printing out some kind of progress indicator. It makes these kinds of long-running jobs much more palatable (see the sketch after this list).
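Here's a rough idea of both points, for a Web Forms page that drives the import; the 600-second timeout and the 1000-row reporting interval are arbitrary examples:

    using System.Data;
    using System.Web.UI;

    public partial class ImportPage : Page
    {
        // Sketch only: raise the per-request timeout and stream a crude progress
        // indicator to the browser. "source" is the DataSet being imported.
        protected void RunImport(DataSet source)
        {
            Server.ScriptTimeout = 600;   // seconds; the default is much lower

            int imported = 0;
            foreach (DataRow row in source.Tables["Customers"].Rows)
            {
                // ... create the entity and InsertOnSubmit / SubmitChanges as above ...

                if (++imported % 1000 == 0)
                {
                    Response.Write(string.Format("Imported {0} rows...<br/>", imported));
                    Response.Flush();     // push the progress text to the browser
                }
            }
        }
    }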

1 Comment

Assuming I want everything bundled into a transaction, would it be more appropriate to do commits every 100 records or a single huge commit at the end? Or have I already passed the "reasonable solution" line by importing this much data through a Web app? :)

LINQ to SQL doesn't natively have anything like the SqlBulkCopy class. I did a quick search and it looks like there is a third-party bulk-copy implementation for LINQ to SQL; no clue whether it's any good, but it can't hurt to check it out.
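For comparison, here's what plain ADO.NET SqlBulkCopy looks like when fed a DataTable directly; the connection string, destination table, and column mappings are placeholders for your own schema:

    using System.Data;
    using System.Data.SqlClient;

    public static class BulkHelpers
    {
        // Plain ADO.NET alternative: bulk-load one DataTable straight into SQL Server.
        public static void BulkLoad(DataTable source, string connectionString)
        {
            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = "dbo.Customers";
                bulk.BatchSize = 1000;

                // Map the dataset's columns to the similar-but-not-identical
                // destination columns.
                bulk.ColumnMappings.Add("Name", "CustomerName");
                bulk.ColumnMappings.Add("Email", "EmailAddress");

                bulk.WriteToServer(source);
            }
        }
    }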



DataContext.ExecuteCommand can run an arbitrary SQL statement, so you could do a set-based INSERT ... SELECT.
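For example, assuming the source rows are already reachable from SQL Server (say, in a staging table) and reusing a hypothetical MyDataContext; all names here are placeholders:

    using (var db = new MyDataContext())
    {
        // Set-based copy entirely on the server; no per-row round trips.
        db.ExecuteCommand(
            @"INSERT INTO dbo.Customers (CustomerName, EmailAddress)
              SELECT Name, Email
              FROM   dbo.CustomerStaging");
    }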

1 Comment

Interesting idea - I will try to spend some time with this today.
