I'm attempting to load a sequence of frames from a GIF into a Texture2DArray using stbi_load_gif_from_memory, which returns a pointer to the decoded frame data. I'm stuck on how to offset into that buffer so I can split it into one D3D11_SUBRESOURCE_DATA per frame:
```
Image img = Image(filepath); //Custom class that loads a .gif using stbi_load_gif_from_memory.
D3D11_TEXTURE2D_DESC tex_desc{};
tex_desc.Width = img.GetDimensions().x; //width...
tex_desc.Height = img.GetDimensions().y; //...and height of image in pixels.
tex_desc.ArraySize = img.GetDimensions().z; //...Depth, i.e. number of frames
tex_desc.MipLevels = 1;
//...Snip setting up D3D11_TEXTURE2D_DESC
// Setup Initial Data
std::vector<D3D11_SUBRESOURCE_DATA> subresource_data{};
subresource_data.resize(img.GetDimensions().z);
const auto width = img.GetDimensions().x;
const auto height = img.GetDimensions().y;
for(int i = 0; i < img.GetDimensions().z; ++i) {
subresource_data[i].pSysMem = img.GetData(); //Return a pointer to the front of the texture data array.
//The above is INCORRECT. It sets *every* frame to the first!
subresource_data[i].SysMemPitch = width * sizeof(unsigned int); //pitch is the byte size of a single row
subresource_data[i].SysMemSlicePitch = static_cast<unsigned long long>(width) * static_cast<unsigned long long>(height) * sizeof(unsigned int); //ignored for 2D texture arrays, but set anyway
}
```