
I'm experiencing OutOfMemory exceptions in my application when fetching from the database. It is a C# .NET application using Linq2Sql.

I have tried using GC.GetTotalMemory() to see how much memory is taken up before and after the call to the database. This gives a nice, although not quite accurate, picture of what is going on. When I look in the Windows Task Manager, I can see that the Peak Working Set is no smaller when fetching the data in a paged manner using the following code:

public static void PreloadPaged()
{
    int NoPoints = PointRepository.Count();
    int pagesize = 50000;
    int fetchedRows = 0;

    while (fetchedRows < NoPoints)
    {
        PreloadPointEntity.Points.AddRange(PointRepository.ReadPaged(pagesize, fetchedRows));
        PointRepository.ReadPointCollections();
        PreloadPointEntity.PointCollections.Count();
        fetchedRows += pagesize;
    }
}


private static List<PointEntity> ReadPaged(int pagesize, int fetchedRows)
{
    DataModel dataContext = InstantiateDataModel();
    var Points = (from p in dataContext.PointDatas
                select p.ToEntity());

    return Points.Skip(fetchedRows).Take(pagesize).ToList();
}

I guess it's the Linq2Sql code that is using up the memory and not reusing or freeing it afterwards, but what can I do to get the memory footprint down?

I have observed that fetching the data uses 10 times as much memory as storing it in my list of entities. I have considered invoking the garbage collector manually, but I would rather avoid it.
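
One thing I'm considering trying is disposing the DataContext after each page and turning off object tracking, since the entities are only read. A rough sketch of what I mean (assuming DataModel is the generated Linq2Sql DataContext and that there is a key column such as PointId to page on):

private static List<PointEntity> ReadPagedNoTracking(int pagesize, int fetchedRows)
{
    using (DataModel dataContext = InstantiateDataModel())
    {
        // Read-only access: no change tracking means far less per-row overhead.
        dataContext.ObjectTrackingEnabled = false;

        return dataContext.PointDatas
                          .OrderBy(p => p.PointId)   // assumed key column; gives Skip/Take a stable order
                          .Skip(fetchedRows)
                          .Take(pagesize)
                          .AsEnumerable()            // paging runs in SQL, the entity mapping runs in memory
                          .Select(p => p.ToEntity())
                          .ToList();
    }
}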

3 Comments

  • Why do you need 50000 rows to be retrieved at once? Commented Jun 21, 2012 at 10:45
  • What do you intend to do with those entities afterwards? Commented Jun 21, 2012 at 10:51
  • 50000 is just my page size chosen for experimental purposes. I'm prefetching a grid of coordinates which all have to be processed, so I just store them in a list of entities and access them from there instead of having to make a new DB connection every time I need another coordinate. Commented Jun 21, 2012 at 10:56

1 Answer


You are retrieving way too much data and storing it in memory; that's why you are getting an OOM exception.

One of two things is occurring:

  1. You are loading an excessive amount of data when the user will only view a subset of the results, and/or this is a first attempt at "caching" data.
  2. You do need all this data, but are using the wrong technology (Linq2Sql) to access it.

If it's the first, you need to either:

  1. Load smaller chunks of data (20-50 records, not 50K or everything).
  2. If this is only for display purposes, query a projection of just what's needed rather than the entity itself (see the projection sketch after this list).
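
For the projection case, select only the columns that are actually displayed, so each row costs a fraction of the full entity. A rough sketch (the column names are illustrative, not taken from the question):

var page = (from p in dataContext.PointDatas
            orderby p.PointId
            select new { p.PointId, p.X, p.Y })   // narrow SELECT instead of the full entity
           .Skip(fetchedRows)
           .Take(50)                              // small, display-sized page
           .ToList();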

If it's the second, then use an ETL tool designed to manage large amounts of data. I prefer Rhino.ETL, but SSIS also works.
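
If the full data set really has to flow through the process, the key is to stream it rather than materialize it in a list; the ETL tools above handle that for you. Purely to illustrate the streaming idea (plain ADO.NET here, not Rhino.ETL or SSIS, and the connection string and column names are assumptions):

// Requires System.Data.SqlClient. Rows are processed one at a time, so memory stays flat.
using (var connection = new SqlConnection(@"Data Source=.;Initial Catalog=Points;Integrated Security=True"))
using (var command = new SqlCommand("SELECT PointId, X, Y FROM PointData", connection))
{
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            int id = reader.GetInt32(0);
            double x = reader.GetDouble(1);
            double y = reader.GetDouble(2);
            // process each point here instead of adding it to an in-memory list
        }
    }
}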


1 Comment

Maybe Linq2Sql is not the best framework to use for these amounts of data. I'll look into the ETL tools you suggest here, thanks.
