
I have been really struggling with this issue. I am writing a program that handles large amounts of data housed in a SQL database on a server. There are millions of records in a table that contains 50 columns.

The first thing my program does is perform a query to retrieve a set of this data. The set size ranges from 500 records to 1.5 million; 1.5 million is rare, but it can happen. Currently I retrieve the data, store it in a DataTable, and then use EPPlus to export it to Excel. I start running out of memory around 150,000 records, and the whole task takes about 2-3 minutes. I believe the memory runs out while populating the DataTable.

The data is in a SQL database on a server, and we must end up with an exported Excel file on our local machine. These criteria must continue to be met.

How on earth do I do this?

Edit: Here is some sample code. I should also mention that I do not care about formatting.

// Parameterized query; the original concatenated clientName into the SQL
// string, which is an injection risk.
string query = "SELECT * FROM DB.dbo.Table WHERE [Client Name] = @clientName";
System.Data.DataTable dt = new System.Data.DataTable();
using (SqlConnection sqlConn = new SqlConnection(connString))
using (SqlCommand cmd = new SqlCommand(query, sqlConn))
{
    cmd.Parameters.AddWithValue("@clientName", clientName);
    sqlConn.Open();
    using (SqlDataReader myReader = cmd.ExecuteReader())
    {
        // dt.Load pulls the entire result set into memory at once;
        // this is where the out-of-memory condition originates.
        dt.Load(myReader);
    }
    // No explicit Close() needed; the using block disposes the connection.
}

I think I need to make an adjustment to the bigger picture. As many of you have said, it is not possible to handle that many rows in Excel. I am going to look at another approach to the problem as a whole. Thank you for your help, everyone!

3 Comments
  • There are row limits on Excel spreadsheets: 65,536 rows for Excel 2003 and about 1 million for 2010; not sure about others. Commented Dec 3, 2013 at 22:30
  • It will be impossible to ever export more than about 1 million records to Excel. That is the most rows that Excel will allow in a single worksheet. Commented Dec 3, 2013 at 22:32
  • Can you include some code so we can see what you are currently doing? Commented Dec 3, 2013 at 23:46

3 Answers


The DataTable is apparently not helping here. You could use a StreamWriter directly, without the DataTable, and write to a CSV file instead of an Excel file (your question does not show that you need formulas, formatting, etc.).

Also, it helps to put yourself in the place of the end user: a user is probably going to have trouble dealing with an Excel file of 1 million rows.
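To illustrate, here is a minimal sketch of the streaming approach, one row in memory at a time. It reuses the query and connection string from the question; the EscapeCsv helper is a hypothetical name introduced here for illustration, not an existing API.

using System;
using System.Data.SqlClient;
using System.IO;

static void ExportToCsv(string connString, string clientName, string csvPath)
{
    string query = "SELECT * FROM DB.dbo.Table WHERE [Client Name] = @clientName";
    using (var conn = new SqlConnection(connString))
    using (var cmd = new SqlCommand(query, conn))
    using (var writer = new StreamWriter(csvPath))
    {
        cmd.Parameters.AddWithValue("@clientName", clientName);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // Header row built from the result-set schema.
            var names = new string[reader.FieldCount];
            for (int i = 0; i < reader.FieldCount; i++)
                names[i] = EscapeCsv(reader.GetName(i));
            writer.WriteLine(string.Join(",", names));

            // Stream each row straight to disk; nothing accumulates in memory.
            var fields = new string[reader.FieldCount];
            while (reader.Read())
            {
                for (int i = 0; i < reader.FieldCount; i++)
                    fields[i] = EscapeCsv(reader.IsDBNull(i) ? "" : reader.GetValue(i).ToString());
                writer.WriteLine(string.Join(",", fields));
            }
        }
    }
}

// Quote fields that contain commas, quotes, or newlines (hypothetical helper).
static string EscapeCsv(string value)
{
    if (value.IndexOfAny(new[] { ',', '"', '\n', '\r' }) >= 0)
        return "\"" + value.Replace("\"", "\"\"") + "\"";
    return value;
}

Because nothing accumulates in memory, the row count only affects running time, not memory use.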


1 Comment

I think I will try this since I do not care about formulas or formatting. Thank you

First of all, you are already using a SqlDataReader, which is meant for record-by-record processing. If you create a

while (reader.Read())
{
   // Export row
}

loop, you should not run into memory issues; it may be slow, however.

In the above // Export row part, you may want to write to an Excel sheet using ADO, which basically comes down to opening a database connection to the Excel file and INSERTing as you would insert into a normal database. There should be code on SO to show you how to do this (for example this one).

[ Side note on how I'd do it:

I'd create an Excel template file in my application resources. I'd save that to disk to create a fresh file for each export, construct a connection string to access that fresh file, and then INSERT using normal OleDbCommands.

]
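As a hedged sketch of that side note: assume a template workbook has already been copied to exportPath, with a sheet named "Export" and columns [Client Name] and [Amount]. The sheet name and the Amount column are assumptions for illustration, not anything from the original post.

using System.Data.OleDb;

// Open the freshly copied template once and INSERT inside the reader loop.
string oleConnStr =
    "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + exportPath + ";" +
    "Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";

using (var excelConn = new OleDbConnection(oleConnStr))
using (var insert = new OleDbCommand(
    "INSERT INTO [Export$] ([Client Name], [Amount]) VALUES (?, ?)", excelConn))
{
    // OleDb parameters are positional; the names are only placeholders.
    var pName = insert.Parameters.Add("@p1", OleDbType.VarWChar);
    var pAmount = insert.Parameters.Add("@p2", OleDbType.Double);
    excelConn.Open();

    while (reader.Read())            // the SqlDataReader from the loop above
    {
        pName.Value = reader["Client Name"];
        pAmount.Value = reader["Amount"];
        insert.ExecuteNonQuery();    // one INSERT per exported row
    }
}

Opening the connection once and reusing a prepared command keeps the per-row cost down; it will still be slow for very large exports, but memory stays flat.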

However, as others have noted already, 1.5 million rows is not an amount of data Excel can handle.



If you are setting values and formatting cell by cell for every row, it is going to take time.

You should set a whole range at once from an array rather than setting cells one at a time. For example, build an array of 50 values and assign it to the range "A1:A50" in a single operation; see the sketch below.

I do not know the details of EPPlus disposal, but if you are creating a lot of objects while populating your DataTable and not disposing of them, you can hit an out-of-memory error, since some of those objects may hold non-CLR (unmanaged) resources.

You may also try splitting your data across multiple sheets for user readability.
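Here is a minimal sketch of the load-a-block-at-once idea using EPPlus. This assumes EPPlus is still producing the file and that the data already fits in the DataTable dt from the question; the sheet name and output path are illustrative.

using System.IO;
using OfficeOpenXml; // EPPlus

using (var package = new ExcelPackage())
{
    ExcelWorksheet ws = package.Workbook.Worksheets.Add("Export");

    // One call fills the whole region starting at A1 (headers included)
    // instead of millions of individual cell assignments.
    ws.Cells["A1"].LoadFromDataTable(dt, true);

    package.SaveAs(new FileInfo(@"C:\temp\export.xlsx"));
}

This removes the per-cell overhead, though it does not address the memory ceiling of the DataTable itself.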

Comments
