Say we have a table with a large text field containing JPG files' binary data. The task is to get those files out of the database and onto disk. At first I decided to do the following:

MyDataContext dc = new MyDataContext();
foreach(ImageTable t in dc.ImageTable.OrderBy(i=>i.Id))
{
    using (StreamWriter writer = new StreamWriter(new FileStream(string.Concat(t.Name,".jpg"), FileMode.CreateNew), Encoding.GetEncoding(1251)))
    {
          writer.Write(t.Data);
          writer.Close();
    }
}

But since the table had about 20 thousand rows, after a while I got an OutOfMemoryException.

In the end, to avoid loading all the rows into one data context, I did the following:

MyDataContext dc = new MyDataContext();
foreach(int id in dc.ImageTable.OrderBy(i=>i.Id).Select(i=>i.Id))
{
     using (MyDataContext _dc = new MyDataContext())
     {
           ImageTable t = _dc.ImageTable.FirstOrDefault(i => i.Id == id);
           using (StreamWriter writer = new StreamWriter(new FileStream(string.Concat(t.Name,".jpg"), FileMode.CreateNew), Encoding.GetEncoding(1251)))
           {
                writer.Write(t.Data);
                writer.Close();
           }
      }
}    

So each row is loaded by a separate data context, and the memory problem is gone! But surely this isn't the best way to do the task.

Could anyone suggest something?

  • Creating a new data context is a lightweight operation, so it's not a very big deal. That being said, turning off object tracking as spender mentioned is the right solution. Commented Aug 15, 2012 at 23:41
  • Firstly, why do you need 20,000 rows of data at once? Secondly, why do you store JPGs in a database? The fact that you can doesn't mean you should. Commented Aug 15, 2012 at 23:43
  • I don't store the pictures there; some corporate software does. My task was to get them out of the database and import them into another system, so obviously all the rows had to be exported to disk. Commented Aug 15, 2012 at 23:49

2 Answers

You could try switching off object tracking:

_dc.ObjectTrackingEnabled = false;
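
For illustration, a minimal sketch of that applied to the loop from the question (the MyDataContext/ImageTable/Name/Data names come from the question, not from this answer). The flag has to be set before the first query executes; with tracking off, the context no longer caches the materialized entities, so they can be collected as the enumeration streams:

using (MyDataContext dc = new MyDataContext())
{
    // Must be set before any query runs, otherwise LINQ to SQL throws.
    dc.ObjectTrackingEnabled = false;

    foreach (ImageTable t in dc.ImageTable.OrderBy(i => i.Id))
    {
        // Same write logic as in the question: the text data is written
        // back out through code page 1251.
        using (StreamWriter writer = new StreamWriter(
            new FileStream(string.Concat(t.Name, ".jpg"), FileMode.CreateNew),
            Encoding.GetEncoding(1251)))
        {
            writer.Write(t.Data);
        }
    }
}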

  1. If it already works, resolving the memory issue with performance that fits the needs of your application, it's a good solution.

  2. If you're still not satisfied with the results, you may think about leaving LINQ to SQL and using raw SQL with a SqlDataReader, a read-only, forward-only cursor, to get maximum efficiency in the read operation (a rough sketch follows below).
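
For reference, a rough sketch of that idea rather than code from this answer: a plain SqlDataReader with SequentialAccess, assuming the ImageTable/Id/Name/Data names from the question, a connectionString placeholder, and that the data sits in a text column written out the same way the question does (code page 1251). ExportImages is just a placeholder name.

using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Text;

static void ExportImages(string connectionString)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT [Name], [Data] FROM [ImageTable] ORDER BY [Id]", conn))
    {
        conn.Open();

        // SequentialAccess streams large columns as they are read instead of
        // buffering each whole row; columns must then be read in order.
        using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            while (reader.Read())
            {
                string name = reader.GetString(0);
                string data = reader.GetString(1);

                using (StreamWriter writer = new StreamWriter(
                    new FileStream(name + ".jpg", FileMode.CreateNew),
                    Encoding.GetEncoding(1251)))
                {
                    writer.Write(data);
                }
            }
        }
    }
}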

Hope this helps.

2 Comments

Linq-To-Sql has performance ramifications, but when used properly it has little bearing on out-of-memory errors.
@Tigran Though it was a one-time task and it's already done, I tested it with ObjectTrackingEnabled = false, and that was it! While solving my memory problem I completely forgot about object tracking; I understood that a simple DataReader would do the trick, but it was quicker for me to do it with L2S. Thank you!
