A 🧵 about my PhD student Zack Misso’s upcoming #SIGGRAPH2023 #research #paper on progressive #nullTracking for #production #volumeRendering (in collaboration with @yiningkarlli and others at #DisneyAnimation).

The project page is now live at:

https://cs.dartmouth.edu/~wjarosz/publications/misso23progressive.html

Progressive null-tracking for volumetric rendering

Null-collision approaches for estimating transmittance and sampling free-flight distances are the current state-of-the-art for u...

Wojciech Jarosz
This project was the result of Zack doing an internship with #WDAS last summer. He explored improvements to volume rendering for the next generation of Disney’s Hyperion renderer, which they have used to render all their movies since Big Hero 6.
Hyperion uses null tracking to render volumes. Variants of these techniques go by many names, like Woodcock tracking, delta scattering, delta tracking, ratio tracking, etc. There has been a lot of work on these techniques over the last decade.
Heterogeneous media (where density and other properties vary across space) are notoriously hard to render using Monte Carlo. Null tracking techniques make this practical by pretending the volumes are actually homogeneous. You can think of this as injecting additional null (or “fictitious”) density where the real density is low, so that the sum of the null and real densities across space is homogeneous. This combined density is often called the “majorant”.
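To make the null-collision idea concrete, here's a minimal (hypothetical) Python sketch of delta-tracking distance sampling: march with exponential steps as if the medium were homogeneous at the majorant density, then probabilistically classify each tentative collision as real or null. The function and parameter names are illustrative, not Hyperion's actual API.

```python
import math
import random

def delta_tracking_distance(density, majorant, t_max, rng):
    """Sample a free-flight distance through a heterogeneous medium
    via null-collision (delta) tracking.

    density  : hypothetical callable t -> real density at distance t
    majorant : assumed to bound density for an unbiased result
    t_max    : distance at which the ray exits the medium
    """
    t = 0.0
    while True:
        # Step as if homogeneous with the majorant density.
        t -= math.log(1.0 - rng.random()) / majorant
        if t >= t_max:
            return t_max  # escaped without a real collision
        # Real collision with probability density/majorant;
        # otherwise it's a null collision and we keep marching.
        if rng.random() < density(t) / majorant:
            return t
```

In the homogeneous special case (density equal to the majorant everywhere), every tentative collision is real and the sampled distances are exactly exponentially distributed, which is a handy sanity check.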
This majorant can theoretically be set to any positive value. Typically we choose a value that is larger than the density of the medium at any location in space (hence the name “majorant”). If we set it higher, it will just take longer to render. So we'd like to keep it low.

Theoretically, we can choose values that are smaller than the maximum density, but this quickly leads to really high variance.

This is one of the core challenges of null tracking methods: we’d like to set the majorant to the maximum density of the medium, and no higher -- so that both variance and render times are low.

If the medium is defined by a voxel grid, this is trivial: just iterate over all the voxels and compute the maximum value.
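For the pure voxel-grid case, the "trivial" computation really is a one-liner over a (here made-up) density grid:

```python
# Hypothetical 2x2x2 density grid stored as nested lists; for a pure
# voxel medium, a tight majorant is just the largest voxel value.
voxels = [[[0.1, 0.7], [0.3, 0.2]],
          [[0.9, 0.0], [0.4, 0.5]]]
majorant = max(v for plane in voxels for row in plane for v in row)
```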

But in production, volumes are often the combination of several voxel grids, with additional procedural modifications (e.g. fractal noise) all authored as a complex node graph. This means we cannot generally find the exact maximum density.

Due to this issue, Disney’s approach so far has been to bake all their (potentially procedural) volumes down to a single voxel grid. This allows finding robust majorants, but is incredibly heavyweight, limits the resolution of volumes, and increases “time to first pixel”.

Our paper solves this problem, and the key idea is extremely simple.

We just take a guess at a majorant. This guess doesn't have to be correct, or even close to correct. It could in fact just be a super tiny value to start off with.

Given the current guess, we *clamp* the density of the medium to this current majorant. By clamping the medium, the majorant becomes bounding, so variance will be low, but we've changed the appearance of the medium (by making it lower density) which gives us a *biased* rendering.
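The clamping step is just a wrapper around the density lookup (a hypothetical sketch, not the actual renderer code):

```python
def clamp_medium(density, majorant_guess):
    """Wrap a medium's density lookup `density(t)` so it never exceeds
    the current majorant guess. The guess then bounds the (modified)
    medium *by construction*: variance stays low, at the cost of bias
    while the guess is still too small."""
    return lambda t: min(density(t), majorant_guess)
```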

That's ok though: while rendering the first image, we end up sampling the medium at various locations, and can incrementally learn a new majorant (e.g. by taking the maximum density over all previous lookups). We can then render another image with an updated majorant. The medium densities are clamped again, but to this new, improved majorant. The result may still be biased, but less so.
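A single tracking pass that both clamps and learns might look roughly like this (again a hypothetical sketch with illustrative names, combining the delta-tracking loop with the clamp and a running maximum):

```python
import math
import random

def clamped_learning_pass(density, majorant, t_max, rng):
    """One delta-tracking distance sample where the medium is clamped
    to the current majorant guess (so the guess bounds by construction),
    while recording the largest *unclamped* density seen so the next
    pass can use a better guess. Returns (distance, learned_majorant)."""
    learned = majorant
    t = 0.0
    while True:
        t -= math.log(1.0 - rng.random()) / majorant
        if t >= t_max:
            return t_max, learned
        d = density(t)
        learned = max(learned, d)                  # learn from every lookup
        if rng.random() < min(d, majorant) / majorant:  # clamp the medium
            return t, learned
```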

We do this repeatedly, averaging all renderings so far (much like in progressive photon mapping). In each pass we may have bias, but it will gradually go down. In the paper we show that if we update the majorant in the right way, we are *guaranteed* to find and maintain a tight bounding majorant within a *finite* number of iterations. Once we find a bounding majorant, each subsequent render pass will be unbiased, and the average of all passes, while biased, is guaranteed to converge to the true answer (the method is consistent).
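The overall progressive loop can be sketched in a few lines. Here `render_pass` is a hypothetical callable taking the current majorant and returning a pass estimate plus the maximum density observed during that pass; the never-shrinking `max` update mirrors the "once bounding, stays bounding" property (the paper's actual update rule has more to it than this toy version):

```python
def progressive_render(render_pass, initial_majorant, num_passes):
    """Render with the current (possibly non-bounding) majorant guess,
    learn an updated majorant from the densities sampled, and average
    all passes -- early ones may be biased, later ones are not."""
    majorant = initial_majorant
    total = 0.0
    for _ in range(num_passes):
        estimate, observed_max = render_pass(majorant)
        total += estimate
        majorant = max(majorant, observed_max)  # never shrink the guess
    return total / num_passes
```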

In practice, we found that a bounding majorant is found in just a few passes, and the bias goes away very quickly.

This method is incredibly simple, and doesn't require changing any existing tracking code (just clamping the medium densities).

We still need to incorporate the technique into the next version of our Hyperion renderer, but we have already released an open-source prototype built on PBRT.

Come see Zack's talk next week at #SIGGRAPH to find out all the details and drop by for questions at his poster.

@wjarosz ha, clever!
@wjarosz to expand on that: I remember a while ago when major conferences would often reject papers on the basis of "this is merely an incremental improvement, not groundbreaking enough". And while you absolutely need some groundbreaking papers, a "merely incremental improvement" that is easily pluggable into an existing system is a *goldmine* for real production systems;

@wjarosz ...since in reality you can only afford a "groundbreaking rewrite" of your rendering stack every decade or so. In the meantime, "incremental improvements" are the only thing you can afford to plug in, since both time and budget (and potential disruptions to workflow/pipeline) are very hard constraints.

Yay for incremental improvements!

@aras @wjarosz

We spent the chip for that "groundbreaking rewrite" of our volume rendering stack about 5 years ago, so yup, yay for incremental improvements! Although as far as incremental improvements go, IMO this is a pretty big one since it lifts one of the longstanding fundamental restrictions of null tracking techniques. The real brilliance in Zack's work here is in how it removes this foundational restriction with a very simple, straightforward, drop-in technique.

@yiningkarlli @wjarosz oh yeah, the point I was trying to make is that in the past, "oh but it's simple" was sometimes a reason for paper rejection. But in reality, "oh but it's simple" is a *most excellent* quality!
@wjarosz Nice, and much appreciated breakdown!
@wjarosz thank you for the thread. Brilliant stuff!
@wjarosz thanks for a super detailed thread! I was planning on catching up on things like delta tracking on my upcoming vacation (my experience with volumetrics was only with old, biased methods), and this is a great, approachable intro - while covering novel research! :)
@BartWronski glad to hear it was approachable and informative!