(I should say up front that I am not a programmer myself, so some of what follows may be imprecise.)
My friend and I are building a small "game" engine. We are currently implementing some VFX that involve a 3D model with a specific UV map plus a sprite texture.
(A couple of images that should help illustrate what we mean.)
To get a scrolling animation of the sprite texture across the 3D model (the VFX), we decided to move the UV map by modifying the vertex attributes at initialization. That did work, but then we ran into a problem supporting shaders.

After trying to figure out the problem a little, we came to this conclusion: shaders work on each pixel individually, so a shader only uses two numbers (a UV coordinate) per pixel, while through attributes we are passing the entire UV array (as I understand it, every effect does this per-pixel work, and every effect needs the UV map and modifies it as it sees fit). Digging further, we started to think that the UV map is not actually transmitted in real time at all, and that inside the shaders the UV coordinates are effectively constants. As a result, this approach still does not let us use shaders correctly.
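To make this more concrete, here is a minimal sketch of roughly what we mean by "moving the UV map through attributes" (this is a simplification, not our real engine code; the type and function names, and whether the re-upload happens once or every frame, are placeholders):

```cpp
// Sketch (simplified, hypothetical names) of the attribute-based approach:
// the UV coordinates are shifted on the CPU and the whole array is re-sent
// to the GPU as a vertex attribute.
#include <vector>

struct UV { float u, v; };

// Shift every UV coordinate by the current scroll offset.
// 'uvs' is the mesh's original UV map; the result is what gets uploaded.
std::vector<UV> scrollUVs(const std::vector<UV>& uvs, float time,
                          float speedU, float speedV)
{
    std::vector<UV> shifted;
    shifted.reserve(uvs.size());
    for (const UV& uv : uvs) {
        // With a repeating texture wrap mode the offset wraps around,
        // so no manual modulo is needed here.
        shifted.push_back({ uv.u + time * speedU, uv.v + time * speedV });
    }
    return shifted;
}

// Then the shifted array replaces the UV vertex attribute of the mesh.
// 'uploadUVAttribute' stands in for whatever the engine actually does
// (e.g. rewriting the UV buffer on the GPU):
//
//   auto shifted = scrollUVs(mesh.baseUVs, time, 0.5f, 0.0f);
//   uploadUVAttribute(mesh, shifted);
```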
I know this kind of thing has long been implemented in engines such as UE or Unity, but we could not find information on how to do it in our own engine, and it is possible that our whole approach to this system is simply wrong. Could you tell us how to solve this problem?

