I know versions of this question appear frequently on here, but I was not able to find a solution that works for me. For some background on the problem, I have an RGB image that is divided into chunks of NxN pixels. I want to compute the average of each color channel separately for each chunk. I know numpy is best used by leveraging vectorized operations, but the level of higher-dimensional slicing and indexing required here is beyond me. Essentially I need the following functionality:

    for row in tiles:
        for col in row:          # col is one 100x100x4 tile
            rsum = 0
            gsum = 0
            bsum = 0
            for n in col:        # n is one row of pixels in the tile
                for vec in n:    # vec is one RGBA pixel
                    rsum += vec[0]
                    gsum += vec[1]
                    bsum += vec[2]
            npix = col.shape[0] * col.shape[1]
            col[..., 0] = rsum / npix
            col[..., 1] = gsum / npix
            col[..., 2] = bsum / npix

Where the shape of my ndarray is:

tiles.shape = (138, 84, 100, 100, 4)

That is, a 138x84 grid of 100x100 tiles, where each pixel is a length-4 (RGBA) vector. Is there a way to do this without any loops? Should I reshape my ndarray? Any guidance is much appreciated.

1 Answer

Simply pass the axes you want to average over to np.mean:

avg = np.mean(tiles, axis=(2, 3))
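For example, on a smaller array with the same layout (a toy shape stands in for the real (138, 84, 100, 100, 4) array), the two pixel axes collapse and the channel axis is preserved:

```python
import numpy as np

# Small stand-in for the real (138, 84, 100, 100, 4) array.
tiles = np.random.rand(3, 2, 5, 5, 4)

# Average over each tile's two pixel axes; channels stay separate.
avg = np.mean(tiles, axis=(2, 3))
print(avg.shape)  # (3, 2, 4)
```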

1 Comment

I don't think that's quite right yet. He's leaving the 4th channel alone.
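If the goal is to write the per-tile averages back over each tile's pixels while leaving the fourth (alpha) channel untouched, as the question's loop does, one sketch is to average only the first three channels with keepdims=True so the result broadcasts back over the tile (toy shape again stands in for the real array):

```python
import numpy as np

tiles = np.random.rand(3, 2, 5, 5, 4)

# Average only the RGB channels over each tile's pixel axes, keeping
# those axes so the (.., 1, 1, 3) result broadcasts back over the tile.
rgb_avg = tiles[..., :3].mean(axis=(2, 3), keepdims=True)
tiles[..., :3] = rgb_avg  # alpha channel (index 3) is left untouched
```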
