
I am trying to get CUDA to work, but first I need to change my training input into a tensor. When I tried to do that, I got an error while stacking a list of tensors into one tensor.

Code

import torch

for epoch in range(num_epochs):
    alst = []
    for x, y in loader:
        x = torch.stack(x)
        # x = torch.Tensor(x)
        # x = torch.stack(x).to(device, dtype=float)

Shape of x: List of tensors

[tensor([[[0.325],
     [ 0.1257],
     [ 0.1149],
     ...,
     [-1.572],
     [-1.265],
     [-3.574]],
]), tensor([1, 2, 3, 4, 5]), tensor([6, 5, 4, 3, 2])]

Error I got

     22             alst = []
     23             for x, y in loader:
---> 24                 x_list = torch.stack(x)
     25 #                 x = torch.Tensor(x)
     26 #                 x = torch.stack(x).to(device,dtype=float)

RuntimeError: Expected object of scalar type Float but got scalar type Long for sequence element 1 in sequence argument at position #1 'tensors'

I'm not sure what I am doing wrong. I tried x = torch.stack(x).to(device, dtype=float) as well, but it still didn't work.

1 Answer

The first tensor in your output is of float type and holds the values you feed into your network; the second one looks like labels (of type long).

Furthermore, the first element is a multi-dimensional tensor, while the second and third elements are 1-D vectors (each with 5 elements in your printed output).

You cannot stack tensors of different shapes, so this won't work no matter the types.
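
As a minimal illustration (with made-up tensors, not your data): torch.stack only succeeds when every tensor in the sequence has the same shape, and on older PyTorch versions it also demands a single common dtype, which is exactly the mismatch your error reports.

import torch

a = torch.randn(4, 1)               # float32, shape (4, 1)
b = torch.randn(4, 1)               # float32, shape (4, 1)
print(torch.stack([a, b]).shape)    # torch.Size([2, 4, 1]) - same shape, stacking works

c = torch.tensor([1, 2, 3, 4, 5])   # int64 (long), shape (5,)
try:
    torch.stack([a, c])             # different shape (and dtype) -> RuntimeError
except RuntimeError as err:
    print(err)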

Unpack your x via

matrix, vector1, vector2 = x

To avoid dtype mismatches, cast vector1 and vector2 to float via

vector1 = vector1.float()

Check their shapes via the .shape attribute and act accordingly. You probably already have batches of data, though, since you are using a loader. See the DataLoader documentation for more information and check whether that is what you are using.
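
Putting that together, here is a rough sketch of the suggested handling (it assumes x really unpacks into a matrix and two vectors as in your printed output, and that loader and a CUDA device are set up as in your training loop):

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

for x, y in loader:
    matrix, vector1, vector2 = x                       # unpack instead of stacking
    vector1 = vector1.float()                          # cast the long tensors to float
    vector2 = vector2.float()
    print(matrix.shape, vector1.shape, vector2.shape)  # inspect and act accordingly
    matrix = matrix.to(device)                         # move to the GPU once shapes/dtypes are sorted
    vector1 = vector1.to(device)
    vector2 = vector2.to(device)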
