
All, I have the following Append which I am performing when producing a single line for a fixed-width text file:

formattedLine.Append(this.reversePadding ?
                     strData.PadLeft(this.maximumLength) :
                     strData.PadRight(this.maximumLength)); 

This particular exception happens on the PadLeft() where this.maximumLength = 1,073,741,823 [the field length of an NVARCHAR(MAX) column gathered from SQL Server]. formattedLine = "101102AA-1" at the time of the exception, so why is this happening? I thought I should have a maximum allowed length of 2,147,483,647.

I am wondering if https://stackoverflow.com/a/1769472/626442 might be the answer here - however, I am managing memory with the appropriate Dispose() calls on any disposable objects, and using blocks where possible.

Note: this fixed-width text export is being done on a background thread.

Thanks for your time.

  • Your title and your body are out of sync - you claim that it's Append which is throwing in the title, but then in the body you say it's PadLeft. I strongly suspect that Append is irrelevant here. Commented Nov 27, 2012 at 19:06
  • Agreed. I will change this now... Commented Nov 27, 2012 at 19:08
  • Calling Dispose on objects does not invoke the garbage collector, FYI. Commented Nov 27, 2012 at 19:11
  • @recursive I know. I understand this, and nowhere here have I suggested it does. Thanks for the clarification though. Commented Nov 28, 2012 at 9:13

3 Answers


This particular exception happens on the PadLeft() where this.maximumLength = 1,073,741,823

Right. So you're trying to create a string with over a billion characters in.

That's not going to work, and I very much doubt that it's what you really want to do.

Note that each char in .NET is two bytes, and also strings in .NET are null-terminated... and have some other fields beyond the data (the length, for one). That means you'd need at least 2147483652 bytes + object overhead, which pushes you over the 2GB-per-object limit.
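The arithmetic above can be sketched as a rough estimate (this ignores the additional object header overhead, which only pushes the total higher):

```csharp
using System;

static class StringSize
{
    // Rough per-object byte estimate for a .NET string of 'chars' characters:
    // 2 bytes per UTF-16 char, plus a 2-byte null terminator and a 4-byte
    // Int32 length field. Object header overhead comes on top of this.
    public static long EstimateBytes(long chars) => chars * 2 + 2 + 4;

    static void Main()
    {
        // The NVARCHAR(MAX) length reported by SQL Server:
        Console.WriteLine(StringSize.EstimateBytes(1_073_741_823));  // 2147483652
    }
}
```

2,147,483,652 bytes already exceeds the 2,147,483,647-byte (2 GB) per-object ceiling before any header overhead is counted, which is why the allocation fails.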

If you're running on a 64-bit version of Windows, in .NET 4.5, there's a special app.config setting of <gcAllowVeryLargeObjects> that allows arrays bigger than 2GB. However, I don't believe that will change your particular use case:

Using this element in your application configuration file enables arrays that are larger than 2 GB in size, but does not change other limits on object size or array size:

  • The maximum number of elements in an array is UInt32.MaxValue.

  • The maximum index in any single dimension is 2,147,483,591 (0x7FFFFFC7) for byte arrays and arrays of single-byte structures, and 2,146,435,071 (0X7FEFFFFF) for other types.

  • The maximum size for strings and other non-array objects is unchanged.
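For completeness, enabling the setting looks like this in app.config (though, per the quoted limits above, it would not help here, since string size is unchanged):

```xml
<configuration>
  <runtime>
    <!-- .NET 4.5+, 64-bit only: permits arrays larger than 2 GB -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```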

What would you want to do with such a string after creating it, anyway?


7 Comments

That should probably be a comment, not an answer. Also, he states a belief that he should have a 2G address space, so dealing with 1G is quite possibly what he has in mind.
Regarding your edit... why do you require a special app.config setting to allocate such large arrays under 64-bit .NET implementations?
@EricJ.: Not sure why it's only an option, to be honest.
Hi Jon. You are right that I don't particularly want to pad the fixed-width field to that length, but I am currently writing a utility to export ANY database table. This field is NTEXT and its max length is gathered automatically. Perhaps I need to look at restricting this max length. Thanks for your time.
@EricJ.: I don't see anything about a 2G address space - have I missed a comment somewhere here? Anyway, I hope you now think my answer is more than a comment :)

In order to allocate memory for this operation, the OS must find contiguous memory that is large enough to perform the operation.

Memory fragmentation can cause that to be impossible, especially when using a 32-bit .NET implementation.

2 Comments

Thanks for your answer. I have thought about this cause, but I am managing any disposable objects seemingly well. Is calling the GC a viable option here, or is this murky water? Thanks for your time...
@Killercam: Try doing this with Visual Studio's memory profiler. You will see what is happening with memory allocation.

I think there might be a better approach to what you are trying to accomplish. Presumably, this StringBuilder is going to be written to a file (that's what it sounds like from your description), and apparently you are also potentially dealing with huge database records.

You might consider a streaming approach that won't require allocating such a huge block of memory. To accomplish this, you might investigate the following:

The SqlDataReader class exposes a GetChars() method that allows you to read a chunk of a single large record. Then, instead of using a StringBuilder, perhaps use a StreamWriter (or some other TextWriter-derived class) to write each chunk to the output. This will only require having one buffer's worth of the record in your application's memory space at a time. Good luck!
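A minimal sketch of that chunked-copy loop. The buffer size is an assumption, and the demo fakes GetChars with an in-memory string so the sketch is self-contained; with a real SqlDataReader you would pass `(off, buf, len) => reader.GetChars(ordinal, off, buf, 0, len)`:

```csharp
using System;
using System.IO;

static class StreamedExport
{
    const int ChunkChars = 8192;   // chunk size is an assumption; tune as needed

    // Copies one large character field to the writer in fixed-size chunks.
    // 'readChars' mirrors SqlDataReader.GetChars(ordinal, fieldOffset, buffer, 0, length):
    // it fills 'buffer' starting from 'fieldOffset' and returns the count copied.
    public static void CopyField(Func<long, char[], int, long> readChars, TextWriter writer)
    {
        var buffer = new char[ChunkChars];
        long offset = 0;
        long read;
        while ((read = readChars(offset, buffer, buffer.Length)) > 0)
        {
            writer.Write(buffer, 0, (int)read);
            offset += read;
        }
    }

    static void Main()
    {
        // Demo with an in-memory "field" standing in for a large NTEXT value.
        string field = new string('x', 20000);
        Func<long, char[], int, long> fakeGetChars = (off, buf, len) =>
        {
            int n = (int)Math.Min(len, field.Length - off);
            if (n <= 0) return 0;
            field.CopyTo((int)off, buf, 0, n);
            return n;
        };

        using (var writer = new StringWriter())
        {
            CopyField(fakeGetChars, writer);
            Console.WriteLine(writer.ToString().Length);   // 20000
        }
    }
}
```

Note that for GetChars to actually stream rather than buffer the whole value, the command should be executed with CommandBehavior.SequentialAccess.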

1 Comment

This is a good answer and is essentially what I am doing. I am using a SqlDataReader to read a single row (I could trim this down to a single field in a row). I have found that the problem comes with the default padding applied to fields such as NTEXT, which can be large (1,073,741,823). I think I just need to handle the default padding in a more reserved fashion... Thanks for your time.
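The "more reserved" handling of the default padding mentioned in the comment above could look something like this; the cap value (4000) and the helper name are assumptions, not the asker's actual code:

```csharp
using System;
using System.Text;

static class PaddingCap
{
    // Hypothetical cap -- pick whatever maximum column width the export format supports.
    const int MaxPadWidth = 4000;

    // Clamps the declared column length so NVARCHAR(MAX)/NTEXT declarations
    // (1,073,741,823) never ask PadLeft/PadRight for a multi-gigabyte string,
    // while never truncating data that is longer than the cap.
    public static string PadField(string data, int declaredLength, bool reversePadding)
    {
        int width = Math.Min(declaredLength, Math.Max(MaxPadWidth, data.Length));
        return reversePadding ? data.PadLeft(width) : data.PadRight(width);
    }

    static void Main()
    {
        var formattedLine = new StringBuilder();
        formattedLine.Append(PadField("101102AA-1", 1_073_741_823, reversePadding: true));
        Console.WriteLine(formattedLine.Length);   // 4000, not 1,073,741,823
    }
}
```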
