
I have an Excel file which I need to parse, validate and then load into a SQL Server database using Excel Interop. The application works: I read the sheet, read each line (row and columns) and add that line to a List as an INSERT statement. When I reach the end of the worksheet, I execute all of the INSERT statements as one batch.

The problem is that this uses a lot of RAM when the worksheet is big (1000+ rows). Is there a better or more efficient strategy for larger data? Should I be committing more often and clearing the List?
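
What I have now looks roughly like this (simplified sketch only; MyTable, the column names and connectionString are placeholders, workbook is an already-open Interop workbook, and I've left out the validation step):

    // Current approach: build every INSERT as a string, keep them all in a List,
    // and run them as one batch at the end of the worksheet.
    // Assumes: using System; using System.Collections.Generic;
    //          using System.Data.SqlClient;
    //          using Excel = Microsoft.Office.Interop.Excel;
    var inserts = new List<string>();

    Excel.Worksheet sheet = (Excel.Worksheet)workbook.Worksheets[1];
    Excel.Range used = sheet.UsedRange;

    for (int row = 2; row <= used.Rows.Count; row++)   // row 1 is the header
    {
        string col1 = Convert.ToString(((Excel.Range)used.Cells[row, 1]).Value2);
        string col2 = Convert.ToString(((Excel.Range)used.Cells[row, 2]).Value2);

        inserts.Add(string.Format(
            "INSERT INTO MyTable (Col1, Col2) VALUES ('{0}', '{1}')", col1, col2));
    }

    // At the end of the worksheet, run everything as one batch.
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(string.Join(";", inserts), conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }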

  • Don't know if you save that much on memory, but a "better or more efficient strategy for larger data" would probably be using SqlBulkCopy (a minimal sketch follows after these comments). Commented Aug 23, 2013 at 10:35
  • For other suggestions it would be better if we could see your code; there may be other lines to improve for efficiency. Commented Aug 23, 2013 at 10:45
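
A minimal sketch of the SqlBulkCopy route suggested in the first comment, assuming the worksheet rows have already been copied into a DataTable and that dbo.MyTable, the column names and connectionString are placeholders:

    // SqlBulkCopy sketch -- assumes: using System.Data; using System.Data.SqlClient;
    var table = new DataTable();
    table.Columns.Add("Col1", typeof(string));
    table.Columns.Add("Col2", typeof(string));
    // ... add one DataRow per worksheet row instead of building INSERT strings ...

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.MyTable";
            bulk.BatchSize = 500;   // send rows to the server in chunks
            bulk.ColumnMappings.Add("Col1", "Col1");
            bulk.ColumnMappings.Add("Col2", "Col2");
            bulk.WriteToServer(table);
        }
    }

Setting BatchSize means rows are sent to the server in chunks; for very large sheets you can also fill and flush the DataTable in chunks rather than holding the whole worksheet in it at once.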

1 Answer


I don't think there is much you can do on the parsing side (unless you are coding it all yourself), but I would INSERT the data as soon as you have a row available. There is no need to store it in a List: in your current solution you are effectively holding all of the data twice, once in Excel's memory and once in the list of pending INSERT statements.
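
For example, something along these lines (a sketch only, not your exact code; the table, columns and connectionString are placeholders, and used stands for the worksheet's UsedRange from your reading loop):

    // Insert each row as soon as it has been read -- nothing accumulates in a List.
    // Assumes: using System; using System.Data; using System.Data.SqlClient;
    //          using Excel = Microsoft.Office.Interop.Excel;
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO MyTable (Col1, Col2) VALUES (@c1, @c2)", conn))
    {
        cmd.Parameters.Add("@c1", SqlDbType.NVarChar, 255);
        cmd.Parameters.Add("@c2", SqlDbType.NVarChar, 255);
        conn.Open();

        for (int row = 2; row <= used.Rows.Count; row++)   // row 1 is the header
        {
            cmd.Parameters["@c1"].Value = Convert.ToString(((Excel.Range)used.Cells[row, 1]).Value2);
            cmd.Parameters["@c2"].Value = Convert.ToString(((Excel.Range)used.Cells[row, 2]).Value2);
            cmd.ExecuteNonQuery();   // the row is released from memory once it is inserted
        }
    }

A parameterized command also avoids the quoting problems of building INSERT strings by hand, and if the per-row round trips become a bottleneck you can wrap a few hundred rows in a SqlTransaction and commit periodically; memory stays flat either way.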
