
I'm new to tensors and having a headache over this problem:

I have an index tensor of size k with values between 0 and k-1:

tensor([0,1,2,0])

and the following matrix:

tensor([[[0, 9],
         [1, 8],
         [2, 3],
         [4, 9]]])

I want to create a new tensor which contains the rows specified in index, in that order. So I want:

tensor([[[0, 9],
         [1, 8],
         [2, 3],
         [0, 9]]])

Outside of tensors, I'd do this operation more or less like this:

new_matrix = [matrix[i] for i in index]

How do I do something similar in PyTorch on tensors?

1 Answer

You can use fancy indexing:

from torch import tensor

index = tensor([0,1,2,0])
t = tensor([[[0, 9],
             [1, 8],
             [2, 3],
             [4, 9]]])

result = t[:, index, :]

to get

tensor([[[0, 9],
         [1, 8],
         [2, 3],
         [0, 9]]])

Note that t.shape == (1, 4, 2) and you want to index along the second axis, so index goes in the second position while the other dimensions are kept unchanged with : slices, i.e. [:, index, :].
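If you'd rather not spell out a `:` for every other axis, the same row selection can also be written with `torch.index_select`, which takes the dimension explicitly (this is just an equivalent spelling of the snippet above):

```python
import torch

index = torch.tensor([0, 1, 2, 0])
t = torch.tensor([[[0, 9],
                   [1, 8],
                   [2, 3],
                   [4, 9]]])

# Select rows along dim=1 (the second axis); same as t[:, index, :].
result = torch.index_select(t, dim=1, index=index)
# tensor([[[0, 9],
#          [1, 8],
#          [2, 3],
#          [0, 9]]])
```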


6 Comments

@SebastianT.Vincent Glad it worked! If you wish, consider accepting the answer when time permits, to signal to others that the issue is resolved.
A follow up question: can we still use fancy indexing if the index tensor is 2- or n-dimensional? I have a data tensor [32,4,1000] and an index tensor [32,4] and doing data[index, :] yields a [32,4,4,1000] tensor!
@SebastianT.Vincent Yes fancy indexing is applicable to any n-dimensional array but I'm not quite sure about what the result should be in your case. Can you please elaborate on what should be the shape of the output?
I would like the shape of the output to be the same as input (i.e. [32,4,1000]). Let's call dim0 batch (32), dim1 beam (4). Within each batch, I would like the beams to be replaced according to the index matrix for that batch. So within batch 2, if col[2,:] is [0,1,2,0], then if beams for batch 2 are ABCD then I'd like them to be replaced with ABCA. Does that make sense?
@SebastianT.Vincent I see, here is I think a solution: data[np.arange(len(data)), index.T, :].permute(1, 0, 2). Since we need to apply a different index set for each batch, we index the batch dimension with a np.arange (which is equal to 0..31 for example). Then, since the beams are stacked in the columns of each batch, we need to select there in a [indices, :] fashion. But indices are (batch_size, beam_dim) so we need to transpose to align them. Last dimension is free, hence the :. At the end we permute the first and second dimension because of the transpose we had to do.
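The batched solution from the last comment can be sketched end to end; the shapes below stand in for the commenter's (32, 4, 1000) data and (32, 4) index, and torch.arange is used in place of np.arange. As a cross-check, torch.gather along dim=1 with the index expanded over the feature dimension gives the same result:

```python
import torch

batch, beams, feat = 32, 4, 1000
data = torch.randn(batch, beams, feat)
index = torch.randint(0, beams, (batch, beams))

# Per-batch fancy indexing, as in the comment: arange picks each batch,
# index.T aligns the per-batch beam indices, permute undoes the transpose.
out1 = data[torch.arange(batch), index.T, :].permute(1, 0, 2)

# Equivalent with torch.gather: expand index to (batch, beams, feat)
# so each gathered beam carries its full feature vector.
out2 = data.gather(1, index.unsqueeze(-1).expand(-1, -1, feat))

assert out1.shape == (batch, beams, feat)
assert torch.equal(out1, out2)
```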
