Monday, February 9, 2015

Using the depth buffer to sort tiles in OpenGL (Objective-C/iOS)

So I've made a breakthrough in my custom tile engine... previously I was pathologically quicksorting the draw order of hundreds of onscreen tiles. I managed to figure out how NOT to do that and use the depth buffer instead. Here's what I did, to save others the pain-in-the-ass research, trial, and error.

First, I set the GLKView's drawable depth format:

    GLKView *view = (GLKView *)self.view;
    ...
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
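
For context, here's roughly where that lives in a stock GLKViewController setup (a sketch; the `context` property is the usual GLKit template boilerplate, nothing engine-specific):

    - (void)viewDidLoad {
        [super viewDidLoad];

        // Standard GLKit setup: create an ES 2.0 context, hand it to the view.
        self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

        GLKView *view = (GLKView *)self.view;
        view.context = self.context;
        // Without a drawable depth format there's no depth buffer at all,
        // and the depth test silently does nothing.
        view.drawableDepthFormat = GLKViewDrawableDepthFormat24;
    }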

Prior to drawing anything, we clear the screen, setting the depth buffer bit as well:
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
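
Worth noting: the depth clear value defaults to 1.0 (the far plane), which is what lets the GL_LEQUAL test below pass on the first write to any pixel. If you've changed it somewhere, this (during setup) puts it back:

    glClearDepthf(1.0f); // 1.0 = farthest; this is the GL default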

Then, for all map layers that have depth (e.g. the sprite layer), I enable depth testing and set the depth function to GL_LEQUAL:

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LEQUAL);

For layers that don't have depth (e.g. the floor), I disable depth testing:
    glDisable(GL_DEPTH_TEST);
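
Putting those together, the per-layer toggle in the draw loop ends up looking something like this sketch (TileLayer and its hasDepth flag are made-up names, not actual classes from my engine):

    for (TileLayer *layer in self.layers) {
        if (layer.hasDepth) {
            // Sprites and the like: respect each other's z values.
            glEnable(GL_DEPTH_TEST);
            glDepthFunc(GL_LEQUAL);
        } else {
            // Flat layers like the floor: always paint over what's there.
            glDisable(GL_DEPTH_TEST);
        }
        [layer draw];
    }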

Then, in my fragment shader, I discard any fragment whose alpha is 0.5 or less. This matters because the depth test knows nothing about blending: a transparent fragment that still wrote to the depth buffer would wrongly occlude tiles behind it.

    void main(void) {
        lowp vec4 tex = texture2D(Texture, TexCoordOut);
        // Drop mostly-transparent fragments entirely, so they never
        // write to the depth buffer.
        if (tex.a <= 0.5) {
            discard;
        }
        gl_FragColor = tex;
    }
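
For completeness, the matching vertex shader is the stock textured-quad affair; a sketch, with attribute/uniform names assumed to mirror the fragment shader above:

    attribute vec4 Position;   // xyz position; z carries the tile's depth
    attribute vec2 TexCoordIn;
    uniform mat4 Projection;
    uniform mat4 Modelview;
    varying vec2 TexCoordOut;

    void main(void) {
        TexCoordOut = TexCoordIn;
        gl_Position = Projection * Modelview * Position;
    }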

Finally, when submitting the vertices that describe my triangle strips, I use a three-element vector:
    GLKVector3 geometryVertex;

I set the .z value of each vertex to the 'depth' of that particular tile: farther-away tiles get lower values; closer tiles get higher values. The closer tiles then properly occlude the tiles farther away, and thanks to the fragment shader, transparent pixels never occlude non-transparent ones (we simply don't draw them).
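
To make that concrete, here's a minimal sketch of deriving z from a tile's map row. The helper and its parameters are hypothetical, and it assumes an orthographic projection (e.g. GLKMatrix4MakeOrtho(0, width, 0, height, -1, 1)) under which a larger vertex z ends up nearer the camera:

    // Hypothetical helper: row 0 is the farthest row, the last row the nearest.
    static GLfloat DepthForRow(NSUInteger row, NSUInteger mapRows) {
        return (GLfloat)row / (GLfloat)mapRows; // maps into [0, 1)
    }

    // When building a tile's quad (x, y = the corner's position):
    GLKVector3 geometryVertex = GLKVector3Make(x, y, DepthForRow(row, mapRows));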
