I have this code:
const int N = 32;
const int B = 27;

using (FileStream fs = File.Open(path, FileMode.OpenOrCreate, FileAccess.ReadWrite))
{
    using (BinaryReader br = new BinaryReader(fs))
    {
        using (BinaryWriter bw = new BinaryWriter(fs))
        {
            for (int k = B; k < N; ++k)
            {
                Console.WriteLine(k);
                long pt = 0;          // index of the first element of the current pair
                long j = 1L << k;     // distance between the two paired elements
                for (long i = 0; i < (1L << (N - 1)); ++i)
                {
                    // Read the pair (pt, pt + j). After ReadInt64 the position is
                    // 8 bytes past pt, so seeking 8 * (j - 1) forward lands on pt + j.
                    br.BaseStream.Seek(8 * pt, SeekOrigin.Begin);
                    long b1 = br.ReadInt64();
                    br.BaseStream.Seek(8 * (j - 1), SeekOrigin.Current);
                    long b2 = br.ReadInt64();

                    // Replace the pair with its sum and difference.
                    long t1 = b1 + b2;
                    long t2 = b1 - b2;
                    bw.BaseStream.Seek(8 * pt, SeekOrigin.Begin);
                    bw.Write(t1);
                    bw.BaseStream.Seek(8 * (j - 1), SeekOrigin.Current);
                    bw.Write(t2);

                    // Advance pt, skipping the block of indices whose bit k is set
                    // (those elements are reached as pt + j).
                    pt += 1;
                    if ((pt & (j - 1L)) == 0)
                    {
                        pt += j;
                    }

                    if ((i % 100000) == 0) Console.WriteLine(i); // progress output
                }
            }
        }
    }
}
The program reads two longs from different positions in a very large (17 GB) file, computes their sum and difference, and writes the results back to the same two positions.
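
To make the access pattern concrete, here is a standalone sketch that prints the element pairs (pt, pt + j) each pass touches. The toy values N = 4 and B = 2 are mine, chosen only so the output stays short:

using System;

class PairPattern
{
    static void Main()
    {
        const int N = 4; // toy size for illustration; the real data uses N = 32
        const int B = 2;

        for (int k = B; k < N; ++k)
        {
            long pt = 0;
            long j = 1L << k; // distance between paired elements
            for (long i = 0; i < (1L << (N - 1)); ++i)
            {
                Console.WriteLine($"k={k}: pair ({pt}, {pt + j})");
                pt += 1;
                if ((pt & (j - 1L)) == 0) // crossed a block boundary,
                    pt += j;              // so skip the partner block
            }
        }
    }
}

With the real parameters, the same pattern means the two reads in one iteration can be gigabytes apart.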
From what I can gather, the most efficient way to read data is to read a large chunk into a buffer and then work with that. That approach doesn't seem to apply here, though: depending on the values of pt and j, the loop can touch the beginning and the end of the file in the same iteration, and of course I can't hold all 17 GB in memory.
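
For concreteness, this is the kind of chunked read I mean (a minimal sketch: ReadChunk, the buffer handling, and the use of Buffer.BlockCopy are my own illustration, not anything from the paper):

using System;
using System.IO;

static class ChunkIO
{
    // Read `count` longs starting at element index `start` with one large
    // read instead of one 8-byte read per element.
    public static long[] ReadChunk(FileStream fs, long start, int count)
    {
        byte[] raw = new byte[count * 8];
        fs.Seek(8L * start, SeekOrigin.Begin);

        int read = 0;
        while (read < raw.Length)
        {
            // FileStream.Read may return fewer bytes than requested, so loop.
            int n = fs.Read(raw, read, raw.Length - read);
            if (n == 0) throw new EndOfStreamException();
            read += n;
        }

        long[] values = new long[count];
        Buffer.BlockCopy(raw, 0, values, 0, raw.Length);
        return values;
    }
}

A chunk like that works for a sequential scan; my problem is that each iteration needs one element near pt and one near pt + j.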
The line

if ((i % 100000) == 0) Console.WriteLine(i);

is for debugging; on my computer, about 2 seconds pass between consecutive prints. I need this to be much faster: the paper I'm following says their implementation took less than 30 minutes for this loop. Is there a faster way to read and write large amounts of numerical data?
