I am excited to finally share our recent paper "Filtering After Shading With Stochastic Texture Filtering" (with @mattpharr @marcosalvi and Marcos Fajardo), published at ACM I3D'24 / PACM CGIT, where we won the best paper award! 1/N
"Everyone" knows blending and filtering do not commute with non-linear functions.
However, this is how texture filtering is taught and applied - we filter textures, then "shade" (apply non-linear functions). This introduces bias and error and often destroys the appearance. 2/N
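A tiny numeric sketch (mine, not from the paper) of why the two orders disagree — blend two texels and then apply a non-linear "shading" function, versus shading each texel first and blending after:

```python
# Two texel values and a 50/50 blend weight (toy numbers).
a, b = 0.1, 0.9
w = 0.5

def shade(x):
    # A simple non-linear "shading" function; squaring stands in
    # for a BRDF evaluation or a gamma curve.
    return x * x

# Current practice: filter the texture, then shade the result.
filter_then_shade = shade(w * a + (1 - w) * b)          # ≈ 0.25

# "Filtering after shading": shade each texel, then filter.
shade_then_filter = w * shade(a) + (1 - w) * shade(b)   # ≈ 0.41
```

The two results differ substantially even in this trivial case; for linear `shade` they would be identical.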
We reviewed 40 years of graphics literature and unified the theory to propose "filtering after shading".
To make it practical and fast, we realize it through stochastic filtering and propose unbiased Monte Carlo estimators, together with two families of low-variance methods. 3/N
Many practitioners have used stochastic filters before; we generalize them, extend them to negative-lobe filters and infinite kernels, and propose an efficient way of sampling B-spline kernels.
We discuss the limitations of those techniques and cases where we do not recommend FAS. 4/N
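Here is a minimal sketch (my own toy version, with hypothetical names — not the paper's implementation) of the stochastic-filtering idea for the bilinear case: instead of blending four texels and then shading, pick one texel with probability equal to its bilinear weight, shade that single texel, and let pixel sampling or temporal accumulation average the results:

```python
import random

def stochastic_bilinear_sample(texels, fx, fy, rng=random.random):
    """texels: 2x2 grid [[t00, t10], [t01, t11]]; fx, fy: fractional coords.

    Picks ONE texel; the joint pick probability equals its bilinear weight,
    so averaging many samples is an unbiased estimate of bilinear filtering.
    """
    i = 0 if rng() < (1 - fx) else 1  # column, P(i=0) = 1 - fx
    j = 0 if rng() < (1 - fy) else 1  # row,    P(j=0) = 1 - fy
    return texels[j][i]

# Averaging many stochastic samples converges to the bilinear result:
texels = [[0.0, 1.0], [2.0, 3.0]]
fx, fy = 0.25, 0.5
random.seed(123)
n = 200_000
mean = sum(stochastic_bilinear_sample(texels, fx, fy) for _ in range(n)) / n
bilinear = ((1 - fy) * ((1 - fx) * texels[0][0] + fx * texels[0][1])
            + fy * ((1 - fx) * texels[1][0] + fx * texels[1][1]))
# mean ≈ bilinear (1.25), up to Monte Carlo noise
```

The point of doing it this way is that each sample shades a raw, unfiltered texel value — so any non-linear shading is applied before the (stochastic) filter, not after.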
Those limitations exist, but we are excited for the possibilities our framework unlocks - not just "correctness" and appearance preservation, but better filters (no more ugly bilinear!), application to blending, novel texture compression formats, and pipeline simplifications! 5/N
I think it's time we change how we teach and approach filtering textures.
Curious?
Check our paper and presentation slides: https://research.nvidia.com/labs/rtr/publication/pharr2024stochtex/ .
We also made shadertoys demonstrating two families of stochastic techniques: https://www.shadertoy.com/view/clXXDs https://www.shadertoy.com/view/MfyXzV 6/6
Filtering After Shading with Stochastic Texture Filtering | NVIDIA Real-Time Graphics Research

2D texture maps and 3D voxel arrays are widely used to add rich detail to the surfaces and volumes of rendered scenes, and filtered texture lookups are integral to producing high-quality imagery. We show that applying the texture filter after evaluating shading generally gives more accurate imagery than filtering textures before BSDF evaluation, as is current practice. These benefits are not merely theoretical, but are apparent in common cases. We demonstrate that practical and efficient filtering after shading is possible through the use of stochastic sampling of texture filters.

Stochastic texture filtering offers additional benefits, including efficient implementation of high-quality texture filters and efficient filtering of textures stored in compressed and sparse data structures, including neural representations. We demonstrate applications in both real-time and offline rendering and show that the additional error from stochastic filtering is minimal. We find that this error is handled well by either spatiotemporal denoising or moderate pixel sampling rates.


PS. If you read the earlier tech report version of the paper - I recommend reading the new one. We improved it substantially - turning one small conference "rejection" into an ACM conference "best paper" - and discovered new theory, limitations, and practical advice. :)

PS.2. While writing this paper, we discovered a DSP explanation of something that had puzzled me for a decade - why do literature and practice recommend upsampling in sRGB/gamma, not linear? See the paper for details! I might blog about it as well. :)

somehow tagging @mattpharr didn't work in the first toot, fixing it now 😅
@BartWronski great job on the come back story!
@demofox is the review process random, and do I think 2 out of 3 reviewers misjudged our paper (one clearly had not read it, the other one only skimmed it)? Yes.
Did it motivate us to make it better and clearer by understanding why we were mis-judged? Also yes. :)
@BartWronski haha - that feeling when you have two reviewer #2s.
@demofox if someone does not understand me and my claims of contributions, it can be their fault (not much effort put into trying), but most of the time, it's also my fault for not communicating it properly and an opportunity to improve :)
In hindsight, I am happy about it, as I believe the paper has a much higher chance of being impactful now.

@BartWronski just finished reading this today after your excellent presentation last week at i3d.

Great work! The paper is very clear and well written and the ideas are really exciting.

@BartWronski haven't dug in yet but this correlates an intuition I've had for a while about source art assets: the line quality of illustrations is preserved better if it's authored at a higher res and lower color depth because the shape information isn't decimated by AA across successive edits.
@BartWronski does this also unify min/max filtering?
@lritter I don't think it can, as the min/max operator cannot be estimated through MC estimators or sample averaging. You could store something in some history buffer, maybe, but then it'd be a new, different technique :)
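A toy illustration (mine) of why sample averaging cannot realize a min filter: the average of stochastic texel picks converges to the mean of the texels, while min filtering needs their minimum — no reweighting of an averaging estimator can fix that.

```python
import random

# Four texel values under a min filter's footprint (toy numbers).
texels = [0.2, 0.4, 0.9, 0.7]

random.seed(1)
n = 100_000
avg = sum(random.choice(texels) for _ in range(n)) / n

# avg converges to the MEAN, 0.55 - nowhere near min(texels) == 0.2.
```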
@BartWronski I'm gonna be honest I didn't know that
@aeva @BartWronski Didn't know either! I wonder why this isn't communicated *anywhere*, seems like a pretty important detail.
@jesta88 @aeva @BartWronski I agree that this point has been under-emphasized/not-mentioned (including in other things I've written myself!). I think it's just one of those things where much of the time it doesn't cause any trouble, and when it does it's easy to think "well, there's nothing that can be done about it" or "it's probably not too much error anyway". I definitely agree with Bart that going through another cycle to clarify these observations was a really good thing for the paper...

@mattpharr @jesta88 @aeva I put this "everyone" in quotation marks as a tongue-in-cheek suggestion that it's not really everyone, and it's not that obvious - because of the way we teach and practice it. :)
"Bilinear is free? Sure, I'll use it! Everyone uses it? It must be the right thing to do!"

Once it's not free (our motivation - filtering under a new compression format) and you see a different outcome, you start to question yourself and everything you know - "which way is correct?!" :)

@BartWronski @mattpharr @jesta88 oh yeah I figured the scare quotes were because it referred only to the set of people who knew this particular thing
@BartWronski Congratulations :-) Going to check this out today, thank you for sharing the work and for your research!
@BartWronski So nice! congratulations 😊
@BartWronski ah, I remember a year or two ago you were asking around about Unreal jittered sampling and other “strange” texture filtering approaches. Now we know where all that went! Really nice!

@aras yes, we had a tech report with our initial findings, and a ton of folks reported some great precedents in old games. We knew of all the academic literature, but game developers just use these techniques and often don't even report them. :)
The coolest examples were this old Star Trek game and the first Unreal - we had no idea! This helped us a lot to contextualize our research. :)

Game developers, please report your findings and even "hacks"! :)

@aras Even if writing a full paper might seem intimidating and a ton of work (plus sometimes dealing with gatekeeping reviewers), GDC or Siggraph "Advances" presentations, blog posts, JCGT articles or arXiv tech reports are good enough to find and reference and much easier to write. :)
@BartWronski there’s at least 10x effort (and prestige?) difference between a blog post and a GDC/Siggraph talk, but yes even a blog post is 1000x better than nothing. From personal experience though, “hey I found a gross hack!” the first instinct is to *not* write about it :) But of course you have no idea if your “gross hack” is actually a sensible application of a theory that has not been formulated yet.
@aras @BartWronski "a sensible application of a theory that has not been formulated yet" is pretty much all particle physics since 1950s 😄
@aras reminds me of one of the talks at Stephen's and Steve's course, where someone from the film industry explained how in the first Toy Story, artists requested a "hack" for diffuse power remapping to make lighting look more natural. Later it turned out that artists had intuitively compensated for the lack of gamma correction. :) @self_shadow do you remember which talk it was?
@aras also, I don't think most gamedevs find a "real" paper in some journal or a more "academic" conference any more prestigious than a GDC talk, from my anecdotal experience - it's the other way around. :)
@BartWronski true from gamedev side, but feels like complete opposite from “science” side. Like half a year ago, I toyed with Gaussian Splats wrt data compression, but these were not “papers”. How many of the 100+ new 3DGS papers mentioned it? About one, out of like 20 that were about data compression ;)

@aras yeah, and some of them had even worse compression ratios than yours :( The visibility of blog posts is close to zero for non-academics. And I even had one academic explicitly refuse to cite my blog post about a method similar to their paper's, "because it's not peer reviewed". :(

FWIW, I think if you wrote a LaTeX version of your post and just put it on arXiv with references, you'd get a ton of citations. And at many conferences, a chance of a "best paper" award. :)

@aras arXiv (even not peer reviewed, just a "preprint") is better than a blog post, as it will show in Google Scholar, including back-references from citations. Academics often look at some paper, look who cited it, and read and cite those.
@BartWronski oh yeah I’m very aware of how and why that happens. My wife’s a professor, and “citation indexes” are a real thing there that can determine your employment status etc. Citing blog posts is not incentivized in any way, and because you get what you incentivize… well that’s what you get. It’s just funny from the outside :)
@aras @BartWronski it sure is. I've tried to cite posts in both STBN and now FAST noise papers, and got a lot of push back on both papers. It's interesting cause the "real competitors" we ought to compare against live in blog posts IMO. (not as much push back at i3d btw Bart! that was nice.)
@aras @BartWronski like i have it on my personal blog todo list that i need to compare FAST vs these 2 types of noise.
https://tellusim.com/improved-blue-noise/
https://acko.net/blog/stable-fiddusion/

@aras sometimes it's incentives (your reviewers will be academics, so you better cite them all to not offend any potential one 😅), but discoverability is a real issue as well...
@BartWronski @aras Yeah, a lot of people’s literature search is just crawling up and down the citations tree, which blog posts typically aren’t a part of
@tomfinnigan @aras yeah, and putting something on arXiv makes you part of this citation tree. 90% of new CS/CV/ML papers submit there before final publication and already get cited.
Putting something together in LaTeX and submitting there is not much work (if you need "vouching" before submission to the arXiv cs.GR group, I am happy to recommend you! :) ), especially since Overleaf got really good recently (including some WYSIWYG editing!) and it's free for a single user and small projects.
@BartWronski @aras I’m fairly certain that was “Art Direction within Pixar’s Physically Based Lighting System” (Ian Megibben & Farhez Rayani). Sadly we never got permission to post final slides at the time. I should really ping Ian again.
@BartWronski your paper is a perfect blend of “proper literature” and “documenting gamedev practices” by the way. The latter is very often not well documented or even understood (I’m sure you are aware of a million reasons why :)). But it is curious that production environment sometimes stumbles upon actually sound theory by accident, without realizing it.
@aras I had this observation in an older blog post of mine that artists manually sharpening mipmaps (which seemed like a gross hack) is actually an intuitive compensation for ugly bilinear filter and correct, best-fit optimization-based solution gives similar results :) https://bartwronski.com/2021/07/20/processing-aware-image-filtering-compensating-for-the-upsampling/
@BartWronski @aras artists are smarter than we give them credit for. I'm pretty sure incorrect lighting falloff was making up for the renders not being sRGB correct :P
@demofox @BartWronski definitely. But my point is, in gamedev (or generally outside of "research"), many of these things are not because someone wanted to find a theory; they are because someone wanted to save half a millisecond. I'm 99% sure stochastic mip sampling happened in gamedev only because, in a virtual texturing system, manually doing full trilinear is very costly. Someone had the idea of a random mip choice, went "hey, that does not look too bad!", and so it shipped.
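The random-mip-choice trick is easy to sketch (my own toy version, hypothetical names): instead of fetching and blending two mip levels as trilinear filtering does, pick one level with probability equal to the trilinear blend factor, so the levels are chosen in the right proportions on average.

```python
import random

def stochastic_mip_level(lod, rng=random.random):
    """Pick one mip level instead of blending two (toy sketch).

    lod: continuous level of detail, e.g. 2.3 means blend 70% of
    level 2 with 30% of level 3 under ordinary trilinear filtering.
    """
    base = int(lod)      # lower (sharper) mip level
    frac = lod - base    # trilinear blend factor toward the next level
    return base + 1 if rng() < frac else base

# For lod = 2.3, level 3 should be chosen ~30% of the time:
random.seed(42)
n = 100_000
hits = sum(stochastic_mip_level(2.3) == 3 for _ in range(n))
# hits / n ≈ 0.3
```

Each sample then costs a single mip fetch instead of two, which is exactly the saving that matters when fetches are expensive (e.g. virtual texturing).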
@demofox @BartWronski which again is why I'm very happy for paper like this (and a handful of others) that "bring industries together". I think Bart mentioned it too, but many graphics people are blissfully unaware of most of signal processing things done by audio people, for example. It might be useful! (or it might not, lol)
@demofox @BartWronski I'm totally fanboying Bart here though -- since you have experience in gamedev *and* research *and* music -- 🤯 -- excellent! ❤️
@aras @demofox thank you, it means a ton coming from you :) I am nerdy about some niche and mostly useless things, but sometimes, they turn out to be useful after all 😅
@aras @BartWronski Yeah, Bart is a mensch haha. & I love that the best research comes from people that are well versed in both worlds.
@demofox @aras switching fields for almost 5y helped me a lot :) I would kind of recommend it to everyone (assuming they are ok with re-starting almost from scratch...). The terminology difference was funny ("what the hell is optical flow? oh, you mean motion vectors?"), but I contributed knowhow from games and graphics to some CV/ML research and camera products. :)
"What do you mean games already do robust temporal multi-frame super-resolution???"
@BartWronski @demofox "What do you mean games already do robust temporal multi-frame super-resolution???" is funny. It goes the other way too though, like half of all the ECS stuffs within gamedev are along the lines of "ok so you've invented an extremely limited relational database, right" and they go "what's a relational database?" :)
@BartWronski @demofox complete topic jump, but thanks Bart and Alan for this "discussion" right here. Since 2020 I've been pretty much sitting at home and working in isolation, and right now I'm on the brink of spiralling into some depression episode and/or realizing how much I miss in-person discussions. This one is not in-person, but still. Thanks.
@aras @demofox I work from home, but my team is scattered across various timezones, so totally relatable. I get great discussions with them, but still sometimes feel alone (I'm extraverted). Discussions here help - and Aras, please come to some conference if you can! :)

@BartWronski @aras @demofox yep, I've done two work from home stints (one in 2010, another in 2020), and I burned out both times.

Aras, take care of yourself, it's easy to end up in a bad spot

About all the folks saying "come visit!" around here, every rendering team on the planet would love to have you show up at their office for a week and have random discussions over lunch.

You could really start the nerdiest world tour ever. And we'd all love it 😄

@jon_valdes @BartWronski @aras @demofox indeed, that's a pretty awesome idea. And I'm sure it could even be crowd-funded!

For now, maybe saving the date for https://www.graphicsprogrammingconference.nl/ is a good step? 🙂


@TheIneQuation @jon_valdes @BartWronski @aras @demofox Digital Dragons, too, not far from Aras, but I think in 2 days so maybe not enough heads up :) It's a cool little conference though