
My current project includes a vegetation simulation, capable of rendering a large number of instanced tree models that grow and reproduce over time.

I'm currently getting consistent OutOfMemory exceptions on this line of code:

if (treeInstances.Length <= currentIndex)
    Array.Resize(ref treeInstances, currentIndex + 500);

This code runs when the simulation exceeds the usual bounds of the treeInstances array, and causes it to allocate a new array with an additional 500 slots for trees.
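For context, a common alternative to a fixed +500 increment is geometric growth, which is what List&lt;T&gt; does internally. A minimal generic sketch (EnsureCapacity is a hypothetical helper, not code from the question):

```csharp
using System;

static class ArrayGrowth
{
    // Sketch: grow geometrically (doubling) instead of adding a fixed 500
    // slots. Doubling amortises the copy cost and creates far fewer
    // intermediate arrays, each of which would otherwise leave a hole in
    // the heap when it is discarded.
    public static void EnsureCapacity<T>(ref T[] array, int requiredIndex)
    {
        if (array.Length <= requiredIndex)
        {
            int newSize = Math.Max(array.Length * 2, requiredIndex + 1);
            Array.Resize(ref array, newSize);
        }
    }
}
```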

Given that I can see the size of the array when it fails (usually somewhere between 3000 and 5000 instances) and the size of the TreeInstance struct (20 floats), I'm sure my problem lies not in the raw size of the array. Even considering that the memory has to be temporarily doubled during the resize (since Array.Resize() allocates a new array before the old one is released), that's still less than half a MB, assuming my math is right.
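The arithmetic does check out; a quick sketch of the numbers (the 85,000-byte figure is the CLR's Large Object Heap threshold, which turns out to matter here):

```csharp
using System;

class SizeCheck
{
    static void Main()
    {
        const int floatsPerInstance = 20;        // from the question
        const int bytesPerFloat = sizeof(float); // 4 bytes
        const int instances = 5000;              // upper end of the failing range

        int arrayBytes = floatsPerInstance * bytesPerFloat * instances;
        Console.WriteLine(arrayBytes);           // 400000 bytes: under half a MB
                                                 // even when doubled during a resize
        Console.WriteLine(arrayBytes > 85000);   // True: well over the ~85,000-byte
                                                 // threshold for the Large Object Heap
    }
}
```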

I assume therefore that there must be something I'm missing. Is there some reason the old arrays might not be removed by the garbage collector?

Further Details:

  • TreeInstance is a simple struct, with the transform matrix and colour of each tree.
  • treeInstances is a TreeInstance[] array. It is only used directly here, in the lines of code above.
  • treeInstances is also exposed through a property, TreeInstances, with plain get and set accessors.
  • TreeInstances is used to set the transform matrix and colour of each tree as it grows, and is fed into the Instancing methods as part of the Draw routine.
  • The Instancing methods I'm less familiar with, but they perform a variety of functions with TreeInstances without modifying its contents (including using it as the source in a DynamicVertexBuffer.SetData operation).
3 Comments

  • Why are you allocating your own arrays? Have you considered using List<T>? The +500 allocation might not be the most efficient way to grow your array depending on the array sizes involved. They may end up on the Large Object Heap and leave holes. My advice is to use List<T> if possible. If not, use a memory profiler and see what's taking up memory. Commented Oct 25, 2013 at 1:40
  • Considering you have 20 floats in your struct and between 3000 and 5000 elements in the array, you're definitely ending up on the Large Object Heap. One strategy would be to pre-allocate the arrays with a large enough size. Commented Oct 25, 2013 at 1:44
  • I'm using my own arrays because the instancing draw methods take arrays as parameters, and it seemed a better approach to allocate the arrays directly than to call List<T>.ToArray() 60 times a second. I can't be certain at this stage, though. Commented Oct 25, 2013 at 5:29

2 Answers


C# is designed to handle small memory allocations much better than large ones. When you call Array.Resize, you force a new memory block to be allocated, the data to be copied, and the old block to be invalidated. It is a very effective algorithm for fragmenting your heap :-)

If you know how large your array needs to be at the start, make your array that size. If you don't, I suggest you use List<T> or a similar class. That class allocates on a per-item basis.

I stand corrected; thanks, guys, for keeping me honest. I'm too used to dealing with classes rather than structs. I should have been more awake.

If TreeInstance is changed to a class, then the List becomes an array of references, and the TreeInstance objects can/will be allocated in smaller chunks. Some code changes will be required to new up all the TreeInstance objects.


2 Comments

Wrong; List<T> is backed by a single array.
You do realize List<T> uses an array as its underlying storage mechanism? It also does not allocate space per item; it allocates a new array double the size of the current underlying array, and copies the current array into it, when it runs out of space.
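The doubling behaviour this comment describes can be observed directly through List&lt;T&gt;.Capacity; a minimal sketch:

```csharp
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var list = new List<int>();
        int lastCapacity = -1;
        for (int i = 0; i < 100; i++)
        {
            list.Add(i);
            if (list.Capacity != lastCapacity)
            {
                // Capacity grows 4, 8, 16, 32, 64, 128: the single backing
                // array is doubled (and copied) each time it fills up.
                Console.WriteLine(list.Capacity);
                lastCapacity = list.Capacity;
            }
        }
    }
}
```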

Looks like I found the answer to my question, following up Dweeberly's description of my Array.Resize() system being "a very effective algorithm for fragmenting your heap". It was a conceptualization issue on my part: I didn't understand that Out Of Memory Exceptions could be caused by not having enough contiguous memory, instead assuming that I was hitting some sort of limit due to Garbage Collection not catching the arrays.

This blog post by Eric Lippert set me straight:

http://blogs.msdn.com/b/ericlippert/archive/2009/06/08/out-of-memory-does-not-refer-to-physical-memory.aspx

Well worth a read for anyone dealing with Out Of Memory Exceptions, or as general knowledge for anyone in games programming.

The short answer is this: in a program compiled for 32-bit Windows, repeatedly allocating and then discarding large objects, as I was doing via Array.Resize(), can fragment your address space into 'holes' of empty space roughly as large as the objects you were allocating. Subsequently, trying to allocate an object larger than any of these free blocks will throw an Out Of Memory Exception, even if your cumulative free memory is far larger.
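A toy model of that failure mode (the block sizes are illustrative, not measured): the free space is plentiful in total, but no single contiguous hole is big enough for the next, slightly larger allocation.

```csharp
using System;
using System.Linq;

class FragmentationModel
{
    static void Main()
    {
        // Holes left behind by repeatedly resizing (allocate new, free old):
        // each is individually small, even though together they add up to
        // plenty of free memory.
        int[] freeBlockSizes = { 400, 420, 380, 450, 410 }; // KB, illustrative
        int request = 500; // KB: the next, slightly larger array

        int totalFree = freeBlockSizes.Sum();
        bool canAllocate = freeBlockSizes.Any(b => b >= request);

        Console.WriteLine(totalFree);    // 2060 KB free in total
        Console.WriteLine(canAllocate);  // False: no single hole fits 500 KB,
                                         // so the allocation fails anyway
    }
}
```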

The appropriate response, as suggested above, is simply to avoid repeatedly resizing arrays; I just needed to understand why. In my case, this meant rewriting the Instanced Model methods to draw only a subset of a larger array, rather than the entire array. After that it was a simple matter of allocating, during initialisation, a significantly larger array than I could ever need.
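A sketch of that approach, with hypothetical names (TreePool, MaxTrees, Add are not from the question): allocate one array at initialisation, far larger than the simulation should ever need, and track a live count; the draw code is handed the whole array plus the count and renders only that subset, so the array itself is never resized.

```csharp
using System;

// Stand-in for the question's 20-float struct; the real type holds a
// transform matrix and a colour per tree.
struct TreeInstance { public float X; }

static class TreePool
{
    // One allocation, made once at startup; never resized afterwards.
    public const int MaxTrees = 10000;
    public static readonly TreeInstance[] Instances = new TreeInstance[MaxTrees];
    public static int LiveCount;

    public static void Add(TreeInstance t)
    {
        if (LiveCount < MaxTrees)
            Instances[LiveCount++] = t;
    }
}
```

In XNA the live subset can then be uploaded with an overload such as DynamicVertexBuffer.SetData(Instances, 0, LiveCount, SetDataOptions.Discard), which copies only the first LiveCount elements.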

