I've hit a wall trying to write some code that averages 10x10 subarrays of a 2D multidimensional array.
Given a multidimensional array:
var myArray = new byte[100, 100];
how should I go about creating 100 subarrays of 100 bytes (10x10) each?
Here are some examples of the index ranges from the original array that the subarrays would contain.
[x1,y1][x2,y2]
Subarray1[0,0][9,9]
Subarray2[10,10][19,19]
Subarray3[20,20][29,29]
Given these subarrays, I would then need to average each subarray's values to produce a byte[10,10] from the original byte[100,100].
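For what it's worth, here is a rough, untested nested-loop sketch of what I think should work (the names averaged, blockX, and blockY are just placeholders I made up), in case it makes the goal clearer. I'm not sure this is the right approach:

var averaged = new byte[10, 10];

for (int blockX = 0; blockX < 10; blockX++)
{
    for (int blockY = 0; blockY < 10; blockY++)
    {
        int sum = 0;

        // Sum the 100 values in this 10x10 block of the source array.
        for (int x = 0; x < 10; x++)
        {
            for (int y = 0; y < 10; y++)
            {
                sum += myArray[blockX * 10 + x, blockY * 10 + y];
            }
        }

        // 100 values per block, so integer-divide by 100 for the average.
        averaged[blockX, blockY] = (byte)(sum / 100);
    }
}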
I realize this is not unbelievably difficult, but after spending 4 days debugging very low-level code and now getting stuck on this, I would appreciate some fresh eyes.