I have been really struggling with this issue. I am writing a program that handles a large amount of data housed in a SQL database on a server. There are millions of records in a table that contains 50 columns.
The first thing my program does is perform a query to retrieve a subset of this data. The result set can range from 500 records to 1.5 million; 1.5 million is rare, but it can happen. Currently I retrieve the data, store it in a DataTable, and then use EPPlus to export it to Excel. I start running out of memory at around 150,000 records, and the whole task takes about 2-3 minutes to complete. I believe the memory runs out while populating the DataTable.
The data is in a SQL database on a server, and we must end up with an exported Excel file on our local machine. These criteria must continue to be met.
How on earth do I do this?
Edit: Here is some sample code. I should also add that I do not care about formatting.
string query = "SELECT * FROM DB.dbo.Table WHERE [Client Name] = @clientName";
System.Data.DataTable dt = new System.Data.DataTable();
using (SqlConnection sqlConn = new SqlConnection(connString))
{
    using (SqlCommand cmd = new SqlCommand(query, sqlConn))
    {
        // Parameterized to avoid SQL injection from the client name.
        cmd.Parameters.AddWithValue("@clientName", clientName);
        sqlConn.Open();
        using (SqlDataReader myReader = cmd.ExecuteReader())
        {
            // The entire result set is buffered into the DataTable here,
            // which is where the memory runs out.
            dt.Load(myReader);
        }
        // The using blocks dispose the reader and close the connection.
    }
}
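One thing I am considering is skipping the DataTable entirely and streaming the reader straight into the worksheet. If I understand EPPlus correctly, newer versions have a LoadFromDataReader method on a cell range. This is only a sketch of what I mean (it assumes the same connString, clientName, and query variables as above, and the output path is just a placeholder); EPPlus still builds the whole workbook in memory, so I am not sure it actually gets me to 1.5 million rows, which is more than a single sheet can hold anyway:

using (var package = new OfficeOpenXml.ExcelPackage())
using (var sqlConn = new SqlConnection(connString))
using (var cmd = new SqlCommand(query, sqlConn))
{
    cmd.Parameters.AddWithValue("@clientName", clientName);
    sqlConn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        var ws = package.Workbook.Worksheets.Add("Export");
        // Writes a header row plus every row from the reader, starting at A1,
        // without the intermediate DataTable copy.
        ws.Cells["A1"].LoadFromDataReader(reader, true);
    }
    // Placeholder output path.
    package.SaveAs(new System.IO.FileInfo(@"C:\temp\export.xlsx"));
}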
I think I need to make an adjustment to the bigger picture here. As many of you have said, it is not possible to handle that many rows this way. I am going to look at another approach to the problem as a whole. Thank you for your help, everyone!