I wonder if anybody tried to use FXAA before doing display mapping BUT by still applying and reverting a curve ? 🤔

Aka:
1 - Switch from HDR to SDR
2 - Apply FXAA
3 - Switch back from SDR to HDR
4 - Apply regular post-process stuff

Given that FXAA is meant to work in non-linear space and focuses on perceived contrast, that would mean using a reversible curve producing colors ideally fitted for the human eye ? (So no log space or anything like that, I presume)

Just tried out the idea with a version of Reinhard from here: https://github.com/microsoft/DirectX-Graphics-Samples/blob/master/MiniEngine/Core/Shaders/ToneMappingUtility.hlsli#L58

... and it kinda works ?
I need to do more experiments, but that's promising !
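For reference, the reversible Reinhard pair used there boils down to something like this (a quick Python sketch of the math, not the actual HLSL from the header):

```python
# Reversible per-channel Reinhard (sketch of the math, not the HLSL).
def tonemap(x):
    # Maps [0, inf) into [0, 1)
    return x / (1.0 + x)

def inverse_tonemap(y):
    # Exact inverse of the curve above (requires y < 1.0)
    return y / (1.0 - y)

# Round trip: HDR -> SDR (FXAA would run here) -> back to HDR
hdr = 4.0
sdr = tonemap(hdr)              # 0.8
restored = inverse_tonemap(sdr)
```

The round trip is lossless in math terms; the precision loss comes from storing the SDR result in a low-bit-depth buffer in between.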

Unfortunately the curve in there loses too much range, so specular reflections get really dimmed and the bloom loses intensity.

So I tried out this instead: https://gpuopen.com/learn/optimized-reversible-tonemapper-for-resolve/

That alone wasn't enough either, so I applied the same trick I did for my LUTs to further compress the range, and it seems to be working.

The fog gradient doesn't seem to suffer (it was a good indication of the precision loss previously) and edges are still anti-aliased !
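The GPUOpen variant replaces the per-channel denominator with the brightest channel (max3), which keeps hue intact and stays trivially invertible. A Python sketch of the math:

```python
def tonemap_max3(r, g, b):
    # Scale by the brightest channel instead of per-channel values,
    # so hue is preserved and the inverse stays cheap.
    w = 1.0 / (1.0 + max(r, g, b))
    return r * w, g * w, b * w

def inverse_tonemap_max3(r, g, b):
    # max3 of the compressed color is always < 1, so this is safe.
    w = 1.0 / (1.0 - max(r, g, b))
    return r * w, g * w, b * w
```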

Going to try to move the FXAA back to the end of the pipe now, to compare both modes and see if anything changes in behavior.

Alright, got it working at both ends, depending on a switch. So I was able to compare.

On regular geometry edges, visually almost no differences.

BUT, doing FXAA as the last step produces noticeable differences, because it misses aliasing that has been exaggerated by some effects.

Example with my chromatic aberration effect:

Yesterday I tried once again to optimize my SSAO pass in compute, and still failed. A fragment shader still performs quite a lot better.

So today I decided to play again with my bloom and lens-flare to tinker with other ideas. Like anamorphic shapes.

Not necessarily a success, but I got interesting results just by playing with some buffer sizes or UVs:

Back on Ombre... and I decided to play again with lens-flares (I know 🤪 ).

This time I wanted to try out the little radial projection trick from John Chapman's article (https://john-chapman.github.io/2017/11/05/pseudo-lens-flare.html) to create fake streaks. It's a good start, but I will need to think about how to refine that effect. It looks nice already !

Been tweaking my lens-flare again for the past few days and now reaching a point where I want to try some kind of anamorphic bloom.

Right now I went with a hack where I modify one of the downsample textures when it is fed to the upsample pass. It gives me a rough idea of what to expect, but it's not good enough yet (not sharp enough, and some flickering issues to manage still).

Will likely need to do a proper downsample/upsample process too.

#gamedev #shader #postprocess #bloom

I tweaked a bit more and properly integrated my bloom streak pass in the engine.

Combined with the regular bloom and the lens-flare this is all coming together well ! :)

I couldn't stop at two bloom passes, so I added a third one to fake atmospheric scattering.

So... how much humidity do you want in the air ? 😄

It is based on: https://github.com/OCASM/SSMS
(But I'm planning on improving some things.)

#gamedev #shader #fog

I tweaked that fog blur a bit further and plugged my fog function into it.
This way I can also use it to emulate height fog too. :)

This morning I also quickly tried to add some fake halation effect (light bleeding into darker areas).

It's basically a highpass filter using the bloom downsamples and the current scene color texture, and then isolating the bright parts to make them bleed into the dark areas.

Currently it's an additive blend done with the HDR color, so it adds light. It's low enough to not matter too much. Maybe I should use a lerp instead to be more energy preserving ?
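The two blend modes compare like this (scalar Python sketch; `strength` is a hypothetical blend parameter):

```python
def blend_additive(scene, halation, strength):
    # Adds light on top of the HDR color: total energy increases.
    return scene + halation * strength

def blend_lerp(scene, halation, strength):
    # Crossfades toward the halation color: the result stays within
    # the range of the two inputs, so no extra energy is introduced.
    return scene * (1.0 - strength) + halation * strength
```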

Woops, I had a Saturate() in there when setting up the highpass. Now I get why my halation edges were so sharp ! 🙃

Also switched to a combination of mix/lerp for blending and it works as well as before. So no additional energy, yay !

Turns out the Love framework had a bug for a few months and wasn't loading sRGB textures properly.
Got fixed today after my report, so now colors match properly:

I didn't notice it until today, because I decided to draw a texture straight to the screen for a temporary loading screen.

All fixed, so it looks like this now:

My current struggle.

I'm already doing the firefly attenuation based on Jimenez's slides.

I'm trying to think about possible solutions:
- Clamping max brightness ?
- Reducing emissive intensity based on distance ?
- Doing some temporal stabilization (like TAA but only for bloom/fog downsample) ?
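For context, the firefly attenuation from Jimenez's slides weights each sample of the first downsample by the inverse of its luma, so one hot pixel can't dominate the average. A rough Python sketch of that idea (not my actual shader code):

```python
def luma(r, g, b):
    # Rec. 709 luma weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def downsample_average(samples):
    # Weight each sample by 1 / (1 + luma): a single very bright
    # pixel can no longer dominate the 2x2 average, taming fireflies.
    acc = [0.0, 0.0, 0.0]
    acc_w = 0.0
    for r, g, b in samples:
        w = 1.0 / (1.0 + luma(r, g, b))
        acc[0] += r * w
        acc[1] += g * w
        acc[2] += b * w
        acc_w += w
    return tuple(c / acc_w for c in acc)
```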

I'm open to suggestions.

I gave clamping a try (like @EeroMutka suggested), but as I expected: because I use a non-thresholded and energy-preserving bloom method, clamping kills off the HDR range and bloom becomes non-existent.

Here is with and without clamping:

The current idea I wanna try is doing a copy of the first downsample (full or smaller res) and blending it into the next frame's downsample. Just to see if it helps with the spatial/temporal aliasing.
Will figure out ghosting issues afterward if it becomes promising.
Weeeee !

First of all, this is very framerate dependent when using a fixed blend value.
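A common fix for that framerate dependence would be to derive the blend factor from delta time, so the decay per second stays constant; roughly (Python sketch, `rate` being an arbitrary tuning constant):

```python
import math

def blend_factor(dt, rate=10.0):
    # Fraction of the new frame to take in; derived from the frame's
    # delta time so the exponential decay per second is constant.
    # Two 8ms frames then decay history exactly as much as one 16ms frame.
    return 1.0 - math.exp(-rate * dt)
```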

Secondly, you need to weight the previous frame a lot to make the flicker not visible/disturbing, which favors a lot of ghosting.

Right now it's a stupid blend, so I wonder if re-projection would help a lot now. 🤔

Previous frame reprojection seems to be doing the trick !
(Combined with color clamping to hide disocclusion.)
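The clamping is the usual TAA-style neighborhood clamp: the reprojected history color gets clamped to the min/max of the current frame's local neighborhood before blending. A scalar Python sketch (real code would do this per channel, on something like a 3x3 neighborhood):

```python
def clamp_history(history, neighborhood):
    # Clamp the reprojected history color to the min/max of the
    # current frame's local neighborhood, rejecting stale colors
    # at disocclusions.
    lo, hi = min(neighborhood), max(neighborhood)
    return min(max(history, lo), hi)

def temporal_blend(history, current, blend=0.1):
    # `blend` is the weight of the current frame: 1.0 disables
    # history entirely, 0.1 leans heavily on it.
    return history * (1.0 - blend) + current * blend
```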

Here is a comparison with it off (blend at 1) and on (blend at 0.1). Flickering is almost gone and no ghosting seems to be visible.

It's basically TAA but on a blurry and half-resolution buffer.

So preserving details doesn't really matter. I don't even bother with jittering.

Transparent/emissive surfaces not writing into the depth buffer don't seem to suffer either. That's really cool, because I was afraid of that !

This week I continued with my fog stuff and added local volumes of analytical fog.

It's going to be quite useful to make moody effects in scenes.

So far I got the Sphere and Box shapes working, but I'm thinking about doing cones (for spotlights) and maybe cylinders (for dirty liquid containers or holograms).
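For a constant-density volume the fog amount can be integrated in closed form: it's just density times the length of the ray segment inside the shape. A Python sketch for the sphere case (a generic formulation, not necessarily my exact shader code):

```python
import math

def sphere_fog_amount(ray_origin, ray_dir, center, radius, density):
    # Constant-density analytic fog: the fog amount along a ray is
    # density * length of the ray segment inside the sphere.
    # `ray_dir` is assumed to be normalized.
    ox = [ray_origin[i] - center[i] for i in range(3)]
    b = sum(ox[i] * ray_dir[i] for i in range(3))
    c = sum(o * o for o in ox) - radius * radius
    disc = b * b - c
    if disc <= 0.0:
        return 0.0  # the ray misses the sphere entirely
    s = math.sqrt(disc)
    t0 = max(-b - s, 0.0)  # clip the segment behind the ray origin
    t1 = -b + s
    return max(t1 - t0, 0.0) * density
```

In a shader you would also clamp the far end of the segment to the scene depth so opaque geometry cuts the volume properly.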

#gamedev #screenshotsaturday #shader #fog

@froyok Clamp based on the picture formation limit.
@troy_s What do you mean by that ? I looked around a bit but I'm not sure how it translates in this context.
The root cause of the issue here is aliasing and temporal instability related to that (from one rendered frame to another).
@froyok Clip to the maximal quantised value of your “tone mapper”?

@troy_s If I follow you well, that won't help I'm afraid.

Issue here is not that one pixel is so bright it blows out, but more that on the previous frame it wasn't as bright as on the current one. So when the blur happens, the energy of the pixel spreads further, creating a visibly wider halo/tail. And because it wasn't there during the previous frame, it leads to flickering.

@troy_s If I were to clamp beforehand, on the HDR values directly, it would kill the range and reduce the spread of bright surfaces. Leading to smaller bloom and less bright light (in terms of perception).
@froyok Right. The temporal sampling basically? Tricky!
@froyok Is there a way to attenuate the energy with a temporal mean? I’d think a log encoded version of a slightly lower spatial frequency of the energy of the “frame” might be a reasonable baseline?

@troy_s I was thinking of something similar yeah, incorporating a temporal element.

For example an overall brightness value (maybe based on an average of all the pixels) from the current frame, used in the next frame.

But this gets close to what Temporal Anti-Aliasing does anyway. So I might do that directly.

@froyok I also tried Brian Karis's weighted average method for eliminating fireflies, but it just looked bad IMO. I ended up just doing min(value, 1) in the first downsampling pass and it works pretty well
@EeroMutka Interesting, I will have to give it a try 🤔
@froyok it's wet air day, so go all in!
@froyok For outdoors, heat haze would be a good candidate as well, at the point where depth is being sampled for the simulated atmospheric scattering. Could even put in some extreme-distance fresnel mirror effect for things like roads or sand out on the horizon.
@NOTimothyLottes Too bad that my project is about snowy conditions. But I like those ideas ! :D
@froyok Looks beautiful - remind me again, this won't be open sourced to follow along right? :D
@krisso I don't plan on open-sourcing the engine itself, but I have been thinking about writing articles about some of the effects it is using. :)
@froyok What ever you do, I'm very much looking forward to it! Your past articles have been super in-depth and a joy to read! Carry on! 😅

@froyok IMHO, it looks almost identical :)

I also think that reverting tone mapping sounds like a redundant processing step, and Reinhard is not the best curve out there. Check out this good blog about different tone mapping functions: https://bruop.github.io/tonemapping/
