
The gist of my problem is best described by this image:

[Image: Crono and his funny-looking level]

This is the beginnings of a 2D, top-down game. The track is randomly generated and each segment drawn as a quad, with the green edge color being offset by random noise in the fragment shader. It looks perfect as a static image.

The main character is always centered on the screen. When you move it, the noise changes; that's not what I intend to happen. Moving the character should just slide the image across the screen, but instead the noise calculation changes and the edges shift as the camera moves.

I should mention now that the track's vertices are defined by integer points (e.g., (750, -20)). The vertex shader passes the original vertices to the fragment shader, unmodified by the projection or camera offset. The fragment shader uses these values to color the segments, producing the image above.

Here's the fragment shader with a portion purposefully commented out:

#version 430 core

layout(location = 7) in vec4 fragPos;
layout(location = 8) in flat vec4 insideColor;
layout(location = 9) in flat vec4 edgeColor;
layout(location = 10) in flat vec4 activeEdges; // 0 = no edges; 1 = north; 2 = south; 4 = east, 8 = west
layout(location = 11) in flat vec4 edges; // [0] = north; [1] = south; [2] = east, [3] = west
layout(location = 12) in flat float aspectRatio;

layout(location = 1) uniform vec4 offset;

out vec4 diffuse;

vec2 hash2( vec2 p )
{
    return fract(sin(vec2(dot(p,vec2(127.1,311.7)),dot(p,vec2(269.5,183.3))))*43758.5453);
}

void main()
{
    vec2 r = hash2(fragPos.xy) * 30.0f;
    /*vec2 offset2 = floor(vec2(offset.x/2.001, -offset.y/1.99999));
    vec2 r = hash2(gl_FragCoord.xy - offset2.xy) * 10;*/
    float border = 10.f;

    float hasNorth = float((int(activeEdges[0]) & 1) == 1);
    float hasSouth = float((int(activeEdges[0]) & 2) == 2);
    float hasEast = float((int(activeEdges[0]) & 4) == 4);
    float hasWest = float((int(activeEdges[0]) & 8) == 8);

    float east = float(fragPos.x >= edges[2] - border - r.x);
    float west = float(fragPos.x <= (edges[3] + border + r.x));
    float north = float(fragPos.y <= edges[0] + border + r.y);
    float south = float(fragPos.y >= (edges[1] - border - r.y));

    vec4 c = (east * edgeColor) + (west * edgeColor) + (north * edgeColor) + (south * edgeColor);
    diffuse = (c.a == 0 ? (vec4(1, 0, 0, 1)) : c);
}

The "offset" uniform is the camera position. It is also used by the vertex shader.

The vertex shader has nothing special going on. It does a simple projection on the original vertex and passes the original, unmodified vertex through as fragPos.

void main()
{
    fragPos = vPosition;
    insideColor = vInsideColor;
    edgeColor = vEdgeColor;
    activeEdges = vActiveEdges;
    edges = vEdges;
    aspectRatio = vAspectRatio;
    gl_Position = camera.projection * (vPosition + offset);
}

With this setup, the noisy edges jump around a lot as the camera moves.

When I switch to the commented-out portion of the fragment shader, which computes the noise from the screen-space fragment position with the (massaged) camera position subtracted, the north/south edges are perfectly stable when moving the camera left/right. When moving up/down, the east/west edges move ever so slightly and the north/south edges are rather unstable.

Questions:

  1. I don't understand why the noise is unstable when using the interpolated vertex positions in the fragment shader. ([720,0] - [0,0]) * 0.85 should not change depending on the camera, and yet it does.
  2. Using the viewport-based fragment position adjusted by the camera position stabilizes the image, but not completely, and I'm at a loss as to why.

This is undoubtedly something stupid but I can't see it. (Also, feel free to critique the code.)

1 Answer

You're probably running into floating-point precision issues. The noise function you're using is rather sensitive to rounding errors, and rounding errors do happen during all those coordinate transformations. One way to mitigate that is to use integer arithmetic: the built-in variable gl_FragCoord contains the (half-)integer window coordinates of the pixel in its first two components. Cast those to integers and subtract the integer camera offset to get exact integer values you can feed into the noise function.
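A minimal sketch of that idea (the integer camera offset uniform `cameraOffsetInt` and the specific hash constants are illustrative, not from the original code):

```glsl
// gl_FragCoord.xy holds pixel centers at half-integer positions,
// so flooring it yields exact integer pixel coordinates.
ivec2 pixel = ivec2(floor(gl_FragCoord.xy));
ivec2 world = pixel + cameraOffsetInt; // assumed integer camera offset uniform

// Integer hash (one common bit-mixing style); exact arithmetic, so the
// same world pixel always hashes to the same value, camera or no camera.
uint h = uint(world.x) * 0x9E3779B9u ^ uint(world.y) * 0x85EBCA6Bu;
h ^= h >> 16u; h *= 0x7FEB352Du; h ^= h >> 15u;
vec2 r = vec2(float(h & 0xFFFFu), float(h >> 16u)) / 65535.0 * 30.0;
```

Since every operation here is exact integer arithmetic, there is no rounding error left to drift as the camera moves.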

But this particular noise function does not seem like a good idea here at all: it's clunky and hard to use, and most importantly it doesn't look good at different scales. What if you want to zoom in or out? When that happens, all sorts of wacky things will happen to the border of your track.

Instead of trying to fix that noise function with integer arithmetic, you could simply use a fixed noise texture, and sample from that texture appropriately instead of writing a noise function. That way your noise won't be sensitive to rounding errors, and it'll stay identical when zooming in or out.
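A sketch of that approach (the `noiseTex` sampler and `noiseScale` uniform are hypothetical names; the texture is assumed to be a tiling noise texture with GL_REPEAT wrapping):

```glsl
layout(binding = 0) uniform sampler2D noiseTex; // tiling noise texture, GL_REPEAT
uniform float noiseScale;                        // world units per texture repeat

// Sample in *world* coordinates so the pattern is anchored to the track
// rather than the screen; GL_REPEAT wrapping makes it tile seamlessly.
vec2 r = texture(noiseTex, fragPos.xy / noiseScale).rg * 30.0;
```

Because the lookup depends only on fragPos, the pattern stays glued to the geometry, and mipmapped texture filtering handles zooming gracefully.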

But I should note that I don't think what you're doing is an appropriate use of procedural shaders at all. Why not just construct a tileset of track piece tiles, and use that tileset to lay down some track using a vertex buffer filled with simple textured rectangles? It has several advantages over procedural fragment shaders:

  • You or your artist(s) get to decide exactly how the track and its border look: proper blades of grass (or whatever the border is supposed to be), pebbles, cobblestones, the sort of details you would expect to see on such a track. You can also freely pick the color scheme, all without any advanced shader wizardry.

  • Your track is not limited to rectangular shapes, and can connect easily.

  • Your track does not require a special-purpose shader to be enabled. Seriously, don't underestimate the value of keeping your shaders nonspecific.

  • The tileset can trivially be extended to other pieces of scenery, while the custom shader cannot.


3 Comments

I'm keeping the art simple until I decide it no longer works. One of my design goals is simplicity where I can get away with it. I'm the only person working on this, and that will probably stay true. This project has existed in a few other versions (at one point in Unity). That said, I did consider using textures instead of a shader for this problem, but that would make the fragment shader more complicated. The segments are of arbitrary width and length, but there is a finite "block size" that the length and width are multiples of. I'll have to think about that one.
That's not simplicity; you're really, truly overcomplicating things with your shaders. Here's a complete fragment shader for tilesets: in vec2 fragUV; uniform sampler2D tileset; out vec4 diffuse; void main() { diffuse = texture(tileset, fragUV); }
I tried a Perlin noise texture (versus something purely random like the original) so I could get a better view of how it changes. It's a lot more stable, so it probably is floating-point precision. That said, I'll have to think about using textures. They'd probably be procedurally generated; I'm trying to see whether a certain two- or three-color aesthetic works.
