
I'm currently working with the PyTorch framework and trying to understand someone else's code. I ran into an indexing issue and wanted to print the shape of a list.
The only way of doing so (as far as Google tells me) is to convert the list into a numpy array and then read the shape via numpy.ndarray.shape.

But when trying to convert my list into an array, I got a ValueError: only one element tensors can be converted to Python scalars.

My list is a converted PyTorch tensor (list(pytorchTensor)) and looks somewhat like this:

[
tensor([[-0.2781, -0.2567, -0.2353,  ..., -0.9640, -0.9855, -1.0069],  
        [-0.2781, -0.2567, -0.2353,  ..., -1.0069, -1.0283, -1.0927],  
        [-0.2567, -0.2567, -0.2138,  ..., -1.0712, -1.1141, -1.1784],  
        ...,  
        [-0.6640, -0.6425, -0.6211,  ..., -1.0712, -1.1141, -1.0927],  
        [-0.6640, -0.6425, -0.5997,  ..., -0.9426, -0.9640, -0.9640],  
        [-0.6640, -0.6425, -0.5997,  ..., -0.9640, -0.9426, -0.9426]]),

tensor([[-0.0769, -0.0980, -0.0769,  ..., -0.9388, -0.9598, -0.9808],  
        [-0.0559, -0.0769, -0.0980,  ..., -0.9598, -1.0018, -1.0228],    
        [-0.0559, -0.0769, -0.0769,  ..., -1.0228, -1.0439, -1.0859],  
        ...,  
        [-0.4973, -0.4973, -0.4973,  ..., -1.0018, -1.0439, -1.0228],  
        [-0.4973, -0.4973, -0.4973,  ..., -0.8757, -0.9177, -0.9177],  
        [-0.4973, -0.4973, -0.4973,  ..., -0.9177, -0.8967, -0.8967]]),

tensor([[-0.1313, -0.1313, -0.1100,  ..., -0.8115, -0.8328, -0.8753],  
        [-0.1313, -0.1525, -0.1313,  ..., -0.8541, -0.8966, -0.9391],  
        [-0.1100, -0.1313, -0.1100,  ..., -0.9391, -0.9816, -1.0666],  
        ...,  
        [-0.4502, -0.4714, -0.4502,  ..., -0.8966, -0.8966, -0.8966],  
        [-0.4502, -0.4714, -0.4502,  ..., -0.8115, -0.8115, -0.7903],  
        [-0.4502, -0.4714, -0.4502,  ..., -0.8115, -0.7690, -0.7690]]),
] 

Is there a way of getting the shape of that list without converting it into a numpy array?

  • Not sure if you had the same issue, but I was trying to convert a nested list of torch tensors into a bigger tensor that respected the nesting by having more indices/dimensions in the final tensor. I found that incrementally growing the list of torch tensors and then calling torch.stack on it helped (see the sketch after these comments). I wrote a non-recursive example, but I think it should be easy to extend. Hope it helps: stackoverflow.com/questions/54307225/… Commented Feb 4, 2021 at 17:33
  • @CharlieParker I was trying to do the same thing. It should work, but it doesn't! Commented Apr 30, 2021 at 4:11
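
For illustration, a minimal sketch of the torch.stack approach mentioned in the first comment (the shapes are made up; torch.stack assumes all tensors share the same shape, which is the case for the list in the question):

import torch

# Illustrative list of equally-shaped 2D tensors, like the one in the question.
my_list = [torch.randn(375, 500) for _ in range(3)]

# Stack them into a single tensor with one extra leading dimension...
stacked = torch.stack(my_list)
print(stacked.shape)              # torch.Size([3, 375, 500])

# ...or simply inspect each element's shape without any conversion.
print([t.shape for t in my_list])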

3 Answers


It seems like you have a list of tensors. For each tensor you can check its size() (no need to convert to a list or a numpy array). If you insist, you can convert a tensor to a numpy array using numpy():

Returns a list of tensor shapes:

>> [t.size() for t in my_list_of_tensors]

Returns a list of numpy arrays:

>> [t.numpy() for t in my_list_of_tensors]

In terms of performance, it is always best to avoid casting tensors into numpy arrays, as it may incur a sync of device/host memory. If you only need to check the shape of a tensor, use the size() function.
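
For illustration, a minimal sketch of that point (the list, shapes, and device fallback are made up):

import torch

# Illustrative tensors; they may live on the GPU if one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
my_list_of_tensors = [torch.randn(375, 500, device=device) for _ in range(3)]

# Shapes are metadata, so no device/host copy is needed.
shapes = [t.size() for t in my_list_of_tensors]    # [torch.Size([375, 500]), ...]

# Converting to numpy requires moving the data to the CPU first.
arrays = [t.cpu().numpy() for t in my_list_of_tensors]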



The simplest way to convert a PyTorch tensor to a numpy array is:

nparray = tensor.numpy()

Also, for size and shape:

tensor_size = tensor.size()
tensor_shape = tensor.shape    # .shape is an attribute, not a method
tensor_size
>>> torch.Size([32, 3, 128, 128])
tensor_shape
>>> torch.Size([32, 3, 128, 128])
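
Both return the same torch.Size object. A quick sketch of a couple of related one-liners (the tensor and its shape here are made up):

import torch

tensor = torch.randn(32, 3, 128, 128)   # illustrative tensor

tensor.size(0)      # 32 -- size of a single dimension
tensor.shape[0]     # 32 -- same thing via indexing
tensor.numel()      # 1572864 -- total number of elements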


A real-world example would also require handling the torch no-grad issue (a tensor that requires grad cannot be converted to numpy directly):

with torch.no_grad():
    probs = [t.numpy() for t in my_tensors]

or

probs = [t.detach().numpy() for t in my_tensors]
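
If the tensors might also be on the GPU, a commonly used variant (a sketch reusing the names from above) chains detach() and cpu() before numpy():

# detach() drops the autograd graph and cpu() moves the data to host memory
# before the numpy conversion.
probs = [t.detach().cpu().numpy() for t in my_tensors]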
