
What I want to do is something like this:

import torch
a = torch.arange(120).reshape(2, 3, 4, 5)  # shape (2, 3, 4, 5)
b = torch.cat(list(a), dim=2)              # shape (3, 4, 10)

I want to know:

  1. I have to convert the tensor to a list; will this hurt performance?
  2. Even if the performance is acceptable, can I do this with tensor operations alone?

1 Answer

You want to:

  1. Reduce the number of copies: in this specific scenario, copies need to be made since we are rearranging the layout of our underlying data (see the sketch right after this list).

  2. Reduce or remove any torch.Tensor -> non-torch.Tensor conversions: this will be a pain point when working with a GPU since you're transferring data in and out of the device.
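
For instance, with the permute-and-flatten approach shown below, the permute step by itself only creates a view of a (no data is copied yet); it is the flatten that materialises the copy. A minimal sketch, reusing a from the question:

>>> p = a.permute(1, 2, 0, 3)        # a view: same storage, rearranged strides
>>> p.data_ptr() == a.data_ptr()
True
>>> p.is_contiguous()
False
>>> f = p.flatten(-2)                # non-contiguous layout, so the copy happens here
>>> f.data_ptr() == a.data_ptr()
False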

You can perform the same operation by permuting the axes such that axis=0 moves to axis=-2 (the second-to-last axis), then flattening the last two axes:

>>> a.permute(1,2,0,3).flatten(-2)
tensor([[[  0,   1,   2,   3,   4,  60,  61,  62,  63,  64],
         [  5,   6,   7,   8,   9,  65,  66,  67,  68,  69],
         [ 10,  11,  12,  13,  14,  70,  71,  72,  73,  74],
         [ 15,  16,  17,  18,  19,  75,  76,  77,  78,  79]],

        [[ 20,  21,  22,  23,  24,  80,  81,  82,  83,  84],
         [ 25,  26,  27,  28,  29,  85,  86,  87,  88,  89],
         [ 30,  31,  32,  33,  34,  90,  91,  92,  93,  94],
         [ 35,  36,  37,  38,  39,  95,  96,  97,  98,  99]],

        [[ 40,  41,  42,  43,  44, 100, 101, 102, 103, 104],
         [ 45,  46,  47,  48,  49, 105, 106, 107, 108, 109],
         [ 50,  51,  52,  53,  54, 110, 111, 112, 113, 114],
         [ 55,  56,  57,  58,  59, 115, 116, 117, 118, 119]]])
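
As a quick sanity check, this gives exactly the same result as the cat-based version from the question:

>>> torch.equal(torch.cat(list(a), dim=2), a.permute(1, 2, 0, 3).flatten(-2))
True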