The gist of my problem is best described by this image:
This is the beginning of a 2D, top-down game. The track is randomly generated and each segment is drawn as a quad, with the green edge color offset by random noise in the fragment shader. It looks perfect as a static image.
The main character is always centered on the screen. When you move it, the noise changes, which is not what I intend. Moving the character should simply slide the image across the screen; instead, the noise calculation changes and the edges shimmer as the view moves.
I should mention now that the track's vertices are defined at integer points (e.g., (750, -20)). The vertex shader passes the original vertices through to the fragment shader, unmodified by the projection or camera offset. The fragment shader uses those values to color the segments, producing the image above.
Here's the fragment shader with a portion purposefully commented out:
#version 430 core

layout(location = 7) in vec4 fragPos;
layout(location = 8) in flat vec4 insideColor;
layout(location = 9) in flat vec4 edgeColor;
layout(location = 10) in flat vec4 activeEdges; // bitmask in [0]: 0 = no edges; 1 = north; 2 = south; 4 = east; 8 = west
layout(location = 11) in flat vec4 edges;       // [0] = north; [1] = south; [2] = east; [3] = west
layout(location = 12) in flat float aspectRatio;

layout(location = 1) uniform vec4 offset;

out vec4 diffuse;

vec2 hash2(vec2 p)
{
    return fract(sin(vec2(dot(p, vec2(127.1, 311.7)), dot(p, vec2(269.5, 183.3)))) * 43758.5453);
}

void main()
{
    vec2 r = hash2(fragPos.xy) * 30.0f;
    /*vec2 offset2 = floor(vec2(offset.x / 2.001, -offset.y / 1.99999));
    vec2 r = hash2(gl_FragCoord.xy - offset2.xy) * 10;*/

    float border = 10.f;

    float hasNorth = float((int(activeEdges[0]) & 1) == 1);
    float hasSouth = float((int(activeEdges[0]) & 2) == 2);
    float hasEast  = float((int(activeEdges[0]) & 4) == 4);
    float hasWest  = float((int(activeEdges[0]) & 8) == 8);

    float east  = float(fragPos.x >= edges[2] - border - r.x);
    float west  = float(fragPos.x <= edges[3] + border + r.x);
    float north = float(fragPos.y <= edges[0] + border + r.y);
    float south = float(fragPos.y >= edges[1] - border - r.y);

    vec4 c = (east * edgeColor) + (west * edgeColor) + (north * edgeColor) + (south * edgeColor);
    diffuse = (c.a == 0 ? vec4(1, 0, 0, 1) : c);
}
The "offset" uniform is the camera position; it is also used by the vertex shader.
The vertex shader has nothing special going on: it applies the projection to the camera-offset vertex and passes the original, unmodified vertex through to fragPos.
void main()
{
    fragPos = vPosition;
    insideColor = vInsideColor;
    edgeColor = vEdgeColor;
    activeEdges = vActiveEdges;
    edges = vEdges;
    aspectRatio = vAspectRatio;

    gl_Position = camera.projection * (vPosition + offset);
}
With this setup, the noisy edges move a lot while moving the camera around.
When I switch to the commented-out portion of the fragment shader, which computes the noise from gl_FragCoord minus a massaged camera position, the north/south edges are perfectly stable when moving the camera left/right. When moving up/down, the east/west edges shift ever so slightly and the north/south edges are rather unstable.
Questions:
- I don't understand why the noise is unstable when using the fragment-shader-interpolated vertex positions. ([720, 0] - [0, 0]) * 0.85 should not change depending on the camera, and yet it does.
- Using the viewport-based fragment position, corrected by the camera position, stabilizes the image, but not completely, and I'm at a loss to see why.
This is undoubtedly something stupid but I can't see it. (Also, feel free to critique the code.)
