
Please show me the best/fastest methods for:

1) Loading very small binary files (for example, icons) into memory;

2) Loading/reading very big binary files of 512 MB or more;

3) Your usual choice when you do not want to think about size/speed and just need to do one thing: read all bytes into memory?

Thank you!!!

P.S. Sorry if this is a trivial question. Please do not close it ;)

P.S.2. This mirrors an analogous question for Java.


4 Answers


1: For very small files, File.ReadAllBytes will be fine.

2: For very big files on .NET 4.0, you can make use of memory-mapped files (see the sketch after this list).

3: If you are not using .NET 4.0, then reading chunks of data would be a good choice.
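
A minimal sketch of option 2, assuming .NET 4.0+; the file path and offsets below are placeholders, and error handling is omitted:

    using System;
    using System.IO;
    using System.IO.MemoryMappedFiles;

    class MemoryMappedExample
    {
        static void Main()
        {
            // Map the file into the address space instead of copying it all into a byte[].
            // Pages are brought in on demand by the OS as you touch them.
            using (var mmf = MemoryMappedFile.CreateFromFile(@"C:\data\huge.bin", FileMode.Open))
            using (var view = mmf.CreateViewAccessor(0, 0, MemoryMappedFileAccess.Read))
            {
                byte firstByte = view.ReadByte(0);      // random access at any offset
                int someValue = view.ReadInt32(1024);   // hypothetical offset
                Console.WriteLine("{0} {1}", firstByte, someValue);
            }
        }
    }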


1 Comment

All answers have reasons to be useful, but for big files I prefer memory-mapped files - my good old friends from the native Windows API ;)

1) I'd use a resource file rather than storing the icons as lots of separate files (see the embedded-resource sketch after the code below).

2) You probably want to stream the data rather than read it all at once, in which case you can use a FileStream.

3) Use ReadAllBytes:

byte[] bytes = File.ReadAllBytes(path);
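
For point 1, a sketch of loading an icon that has been embedded as a resource; the resource name "MyApp.Icons.app.ico" is a placeholder that depends on your project's default namespace, and Stream.CopyTo assumes .NET 4.0+:

    using System.IO;
    using System.Reflection;

    static class EmbeddedResourceExample
    {
        // Reads an embedded resource into a byte[]; the resource name is hypothetical.
        public static byte[] ReadEmbeddedIcon()
        {
            Assembly assembly = Assembly.GetExecutingAssembly();
            using (Stream resource = assembly.GetManifestResourceStream("MyApp.Icons.app.ico"))
            using (var copy = new MemoryStream())
            {
                resource.CopyTo(copy);   // Stream.CopyTo is available from .NET 4.0
                return copy.ToArray();
            }
        }
    }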



1: For small files, File.ReadAllBytes.

2: For big files, a Stream (FileStream) or a BinaryReader on a Stream - the purpose being to remove the need to allocate a massive buffer, by changing the code to read small chunks consecutively (see the sketch after this list).

3: Go back and find the expected size; default to the worst case (#2).

Also note that I'd try to minimise the size in the first place, perhaps via the choice of data format, or compression.
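
A sketch of the chunked approach from point 2; the path is a placeholder, and ProcessChunk is a hypothetical callback standing in for whatever you do with each block:

    using System;
    using System.IO;

    class ChunkedReadExample
    {
        const int BufferSize = 64 * 1024;   // 64 KB at a time instead of one 512 MB array

        static void Main()
        {
            using (var stream = new FileStream(@"C:\data\huge.bin", FileMode.Open, FileAccess.Read))
            {
                byte[] buffer = new byte[BufferSize];
                int read;
                // Read returns 0 at end of stream; only 'read' bytes of the buffer are valid each pass.
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ProcessChunk(buffer, read);
                }
            }
        }

        // Placeholder for per-chunk work (hashing, parsing, copying elsewhere, ...).
        static void ProcessChunk(byte[] buffer, int count)
        {
            Console.WriteLine("read {0} bytes", count);
        }
    }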



This sample is good for both - for large files you need buffered reads.

    public static byte[] ReadFile(string filePath)
    {
        byte[] buffer;
        FileStream fileStream = new FileStream(filePath, FileMode.Open, FileAccess.Read);
        try
        {
            int length = (int)fileStream.Length;  // get file length
            buffer = new byte[length];            // create a buffer the size of the whole file
            int count;                            // actual number of bytes read
            int sum = 0;                          // total number of bytes read

            // read until Read returns 0 (end of the stream has been reached)
            while ((count = fileStream.Read(buffer, sum, length - sum)) > 0)
                sum += count;  // sum is the buffer offset for the next read
        }
        finally
        {
            fileStream.Close();
        }
        return buffer;
    }

7 Comments

That is no better ("good for both") than File.ReadAllBytes; why invent work?
(the main problem here being the need to allocate a huge buffer for large files)
Why do you need a huge buffer? A buffer size of 1024 would significantly improve the speed of reads.
@Max Malygin: Because in your code, you're creating a buffer the same size as the file - in example 2, you'd end up with a buffer 536,870,912 bytes in length, not 1,024. You're basically just saying "Read everything into this huge buffer I've just created", rather than saying "OK, the format of this binary is as follows: the first 2 bytes tell me what sort of file I've got; the next 50 bytes are the header, with the following details: 2 bytes: number of elements; 10 bytes: element length; etc.", where you can chunk up your read to just those bytes you're interested in at that point.
@Max - a 1024-byte buffer still doesn't work with this code: you'll end up with an exception, "Offset and length were out of bounds for the array...". You're asking the Read method to try and put the entire file into the 1024-byte array (length - sum). If you were to change this to just read 1024 bytes, you'd still have an issue the second time round, as you're asking Read to start filling the byte array after the end of the array (from position 1024). Your call to Read should probably look like Read(buffer, 0, 1024), and then you need to explicitly deal with the contents of the buffer. bit.ly/hxVLbH
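
As an illustration of what the last two comments describe - reading only the structured pieces you care about rather than the whole file - here is a minimal sketch using the made-up layout from the comment above (2-byte file type, 2-byte element count, 10-byte element length field); the format, field sizes, and method name are all hypothetical:

    using System.IO;

    class HeaderParsingExample
    {
        // Reads just the header fields of the hypothetical format, then stops;
        // the rest of the (possibly huge) file is never pulled into memory.
        public static void ReadHeader(string filePath)
        {
            using (var stream = new FileStream(filePath, FileMode.Open, FileAccess.Read))
            using (var reader = new BinaryReader(stream))
            {
                ushort fileType = reader.ReadUInt16();        // first 2 bytes: file type
                ushort elementCount = reader.ReadUInt16();    // 2 bytes: number of elements
                byte[] elementLength = reader.ReadBytes(10);  // 10 bytes: element length field
                // ...then seek or read forward to only the elements you actually need.
            }
        }
    }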
