I have an Excel file that I need to parse, validate, and then load into a SQL Server database, reading the file with Interop. The application works: I open a worksheet, read each row cell by cell, build an INSERT statement for that row, and add it to a List. When I reach the end of the worksheet, I execute all of the INSERT statements as one batch.
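For reference, this is roughly what I am doing today (simplified; `MyTable`, the column names, and the connection string are placeholders, and the real code validates the cell values first):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Excel = Microsoft.Office.Interop.Excel;

static void LoadWorksheet(Excel.Worksheet sheet, string connectionString)
{
    var inserts = new List<string>();
    Excel.Range used = sheet.UsedRange;
    int rows = used.Rows.Count;

    for (int row = 2; row <= rows; row++)   // row 1 is the header
    {
        var col1 = (used.Cells[row, 1] as Excel.Range)?.Value2?.ToString();
        var col2 = (used.Cells[row, 2] as Excel.Range)?.Value2?.ToString();

        // Validation of the cell values happens before this in the real app.
        inserts.Add($"INSERT INTO MyTable (Col1, Col2) VALUES ('{col1}', '{col2}')");
    }

    // One big batch once the whole sheet has been read.
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var cmd = new SqlCommand(string.Join(";", inserts), conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
```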
The problem is that this uses a lot of RAM when the worksheet is large (1000+ rows). Is there a better or more efficient strategy for larger data sets? Should I be committing more often and clearing the List as I go?
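To make that last question concrete, this is the kind of change I have in mind: flush every N rows instead of holding everything until the end. Here `batchSize` is an arbitrary guess (not tuned) and `Execute` is a hypothetical helper, not something I have written yet:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Excel = Microsoft.Office.Interop.Excel;

static void LoadWorksheetInChunks(Excel.Worksheet sheet, string connectionString)
{
    const int batchSize = 100;              // arbitrary guess; no idea what a good value is
    var inserts = new List<string>(batchSize);
    Excel.Range used = sheet.UsedRange;
    int rows = used.Rows.Count;

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        for (int row = 2; row <= rows; row++)
        {
            var col1 = (used.Cells[row, 1] as Excel.Range)?.Value2?.ToString();
            inserts.Add($"INSERT INTO MyTable (Col1) VALUES ('{col1}')");

            // Flush and clear every batchSize rows so the list never grows unbounded.
            if (inserts.Count >= batchSize)
            {
                Execute(inserts, conn);
                inserts.Clear();
            }
        }

        // Flush whatever is left after the loop.
        Execute(inserts, conn);
    }
}

// Hypothetical helper: run the accumulated statements as one command.
static void Execute(List<string> pending, SqlConnection conn)
{
    if (pending.Count == 0) return;
    using (var cmd = new SqlCommand(string.Join(";", pending), conn))
    {
        cmd.ExecuteNonQuery();
    }
}
```

Is this chunked approach the right direction, or is there a fundamentally better way to bulk-load this data?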