mh. stairstep effects in shadow mapping are just a symptom of inaccurate z-buffer rasterization. the z-value chosen per pixel should be the maximum depth of the intersection of the pixel's sub-frustum and the triangle, not the average or center value. then the stairstep effects vanish. applying a bias instead is a terrible solution.
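to make the failure concrete, here's a toy 1-D version of it with made-up numbers (a sloped receiver under a directional light; a sketch, not a real renderer):

```python
# toy 1-D shadow map texel over a sloped receiver: z(x) = slope * x,
# with the texel covering x in [0, 1)

slope = 0.5

def surface_depth(x):
    return slope * x

# what the rasterizer stores today: the depth at the texel center
center_depth = surface_depth(0.5)

# what the post argues it should store: the max depth over the texel footprint
max_depth = max(surface_depth(0.0), surface_depth(1.0))

def in_shadow(point_x, stored_depth):
    # standard shadow comparison, no bias
    return surface_depth(point_x) > stored_depth

# a point on the lit surface near the texel's far edge:
print(in_shadow(0.9, center_depth))  # True  -> false self-shadowing (stairsteps)
print(in_shadow(0.9, max_depth))     # False -> correctly lit
```

with the center depth, the lit surface "shadows itself" across half of every texel, which is exactly the stairstep pattern.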

you can see below how the error is caused by taking the midpoint as depth; if the max depth were taken (red lines) we'd have no issue.

if we had normal information for each pixel in the shadow map, then taking the center depth value would probably be sufficient: the plane at that point can be reconstructed from the normal, and points can be tested against per-pixel planes. a much smaller bias would then suffice.
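a minimal sketch of that idea, assuming a hypothetical shadow-map layout that stores a normal alongside each depth sample:

```python
# shadow test against a per-texel plane reconstructed from the stored
# center depth plus a stored normal (hypothetical shadow-map layout)

def plane_depth(center_xy, center_z, normal, query_xy):
    # plane through (cx, cy, center_z) with normal (nx, ny, nz):
    # nx*(x - cx) + ny*(y - cy) + nz*(z - center_z) = 0, solved for z
    cx, cy = center_xy
    nx, ny, nz = normal
    x, y = query_xy
    return center_z - (nx * (x - cx) + ny * (y - cy)) / nz

def in_shadow_plane(query_xy, query_z, center_xy, center_z, normal, eps=1e-4):
    # eps is the "much smaller bias"
    return query_z > plane_depth(center_xy, center_z, normal, query_xy) + eps

# receiver sloped at dz/dx = 0.5 in light space; its unit normal:
normal = (-0.4472, 0.0, 0.8944)
center, center_z = (0.5, 0.5), 0.25  # depth sampled at the texel center

# a point on the same surface near the texel edge (z = 0.5 * 0.9 = 0.45)
# is no longer misclassified as shadowed:
print(in_shadow_plane((0.9, 0.5), 0.45, center, center_z, normal))  # False (lit)
```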

also, from the same plane, the max depth value can be reconstructed.
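for instance (same hypothetical per-texel plane; assumes the normal faces the light, i.e. nz > 0):

```python
# recover the max depth over a texel from its plane: the plane's depth
# z = center_z - (nx*dx + ny*dy)/nz over dx, dy in [-h, h] is maximized
# at the corner where dx, dy have the same signs as nx, ny

def texel_max_depth(center_z, normal, texel_size):
    nx, ny, nz = normal
    h = texel_size / 2.0
    return center_z + (abs(nx) + abs(ny)) * h / nz

# receiver sloped at dz/dx = 0.5, unit texel: the max sits at the far edge
normal = (-0.4472, 0.0, 0.8944)
print(texel_max_depth(0.25, normal, 1.0))  # 0.5
```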

@lritter But that's what normal biasing is supposed to be for. Of course it's usually applied at the per-vertex level, so it's not super accurate. PolygonOffset / DepthBias also exists, whose slope factor could theoretically be abused to fix this, if implementations were consistent. Or calculate it manually in the fragment shader and use depth writes. But then you're giving up a ton of performance by adding pixel shaders to the shadow map generation.
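for reference, the spec-level formula behind PolygonOffset / DepthBias is o = m * factor + r * units, with m the polygon's max depth slope and r the minimum resolvable depth difference; a sketch with illustrative numbers:

```python
# slope-scaled depth bias as specified for glPolygonOffset:
# o = m * factor + r * units, where m is approximated per polygon as
# max(|dz/dx|, |dz/dy|) and r is the smallest resolvable depth step
# (r = 2**-24 is typical of a 24-bit depth buffer)

def depth_offset(dzdx, dzdy, factor, units, r=2**-24):
    m = max(abs(dzdx), abs(dzdy))
    return factor * m + units * r

# for a receiver with dz/dx = 0.5, a slope factor of 1 offsets the stored
# depth by one pixel's worth of slope -- enough to cover the center-to-edge
# error, but only if the implementation computes m consistently
print(depth_offset(0.5, 0.0, factor=1.0, units=0.0))  # 0.5
```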
@bgolus yeah. all this made me realize i just want to get rid of shadow maps for my game and instead focus on more plausible superdiffuse GI
@lritter The world of shadow maps is littered with "fix your shadow mapping problems with this one neat trick", where that "trick" adds 5 new problems.
@lritter And everyone just goes back to PCF.