New dithering method dropped

I call it Surface-Stable Fractal Dithering and I've released it as open source along with this explainer video of how it works.

Explainer video:
https://www.youtube.com/watch?v=HPqGaIMVuLs

Source repository:
https://github.com/runevision/Dither3D

#gamedev #vfx

Surface-Stable Fractal Dithering Explained

Several people have suggested I write a paper / white paper / SIGGRAPH submission about Surface-Stable Fractal Dithering. But I haven't written in academic style since my Master's thesis 15 years ago, and don't really want to either.

In case anyone with experience writing such papers might be interested in writing a paper about Surface-Stable Fractal Dithering with me as consultant and co-author (if such an arrangement is a thing), I'd be up for discussing that.

@runevision It's not my scene, but I'll tag @topher_batty, he might know someone who could benefit from being part of a publication like that.
@runevision You don't actually have to write in academic style, as long as you clearly explain what it is you're solving, how you're solving it, and what's interesting about your particular approach.
@StompyRobot Are there any papers you could point me to as reference/examples of what you mean by not having to write in an academic style? I thought for example that it's mandatory to have a survey of existing work, which means reading and referencing other academic papers, which in itself is a very academic writing thing to do.

@runevision @StompyRobot just watched the video and came here to suggest you publish this, only to find others had suggested the same thing!

I would suggest you look into submitting to #JCGT. JCGT (https://jcgt.org/index.html) is aimed to be much more of a practitioners' journal than an academic one. It's meant for modest, practical contributions that clearly work. It values reproducibility highly, but unlike SIGGRAPH, it doesn’t expect comparisons to 10 different techniques (just the most relevant baseline). Think tech blog post++ in archival form. See bullet points here: https://jcgt.org/write.html.

This is the review form we use, which gives reviewers (and authors) a sense of how JCGT papers are evaluated for publication. https://jcgt.org/files/review-form.txt
@wjarosz Thanks for the suggestion. It looks nice, but one snag is that my source code may not qualify under their submission criteria: "Any source code and data provided must be under a non-restrictive Open Source license (e.g., MIT, BSD) so that it is directly usable by readers. Papers with usable source code are much more likely to be accepted." Mine is licensed under the Mozilla Public License 2.0, which has some minor restrictions.
@wjarosz I use the MPL license because it's permissive enough that people can use the work in their commercial and proprietary products if they want, but if they make improvements to the specific thing I shared, they just need to share their improvements too. To me, this is a modest and fair ask. Is there anyone from the JCGT editorial board or advisory board I could ask about whether the MPL license would be acceptable for the source code of a submission? Is it something you could answer?
@runevision @wjarosz I think you should stick with your principles. If JCGT doesn’t like it, it’s their loss.
@castano @wjarosz Cheers, I will. I just want to hear if it could still be acceptable for them, or not.
@runevision @wjarosz you are pre-supposing they don't like the MPL, which seems like a kind of self-sabotage?
MPL is plenty free and unrestricted enough IMO, so you could just submit. What's the worst that could happen? You already did all the work...

@StompyRobot @wjarosz I’m not pre-supposing. MPL is open but not *non-restrictive* so does not live up to their stated criteria. It would require a dispensation, or they should clarify their criteria otherwise.

I’ve written neither paper nor abstract. I’m actually not clear on whether the expectation is that a paper has already been written when submitting an abstract.

@runevision @wjarosz They didn't say it was a hard rule, either, only a preference. If it were me I would submit first, and discuss later if they make an issue of it. Their stated reason is compatible with MPL: anyone can drop the code into their game without worry of "infection."

@runevision @wjarosz but, you are not me :-)

On a related note, did you look into whether MIP map hardware could help implement this mechanism? It feels tantalizingly close.

@StompyRobot @wjarosz There’s a few notes on mipmaps near the bottom in an FAQ I wrote here:

https://github.com/runevision/Dither3D/discussions/12

Frequently asked questions · runevision Dither3D · Discussion #12

Please see this FAQ before asking new questions about the technique, and before making suggestions, thanks! What exactly is meant by surface-stable? Here is how I define surface-stable: A specific ...

GitHub

@runevision @wjarosz hmm. Not sure I agree with "Mipmaps cannot be used to produce fractal-like detail that keeps being crisp at any zoom level."

I'll reread the original derivation and see if I can find something I'm missing.

@wjarosz @runevision @StompyRobot seconded for JCGT! It tends to have very "short and to the point, practical" papers. I have reviewed several in the past, and the review process is very much oriented to "does this make sense, and is it practical?", instead of "bbut is this Proper Science and not Merely Engineering" that some other venues employ.

@runevision anything from practitioners' reports to collection books like the "gems" series.
Or trade websites!
The post-dead-tree world is more fragmented, but there's plenty of opportunity to let more people see your work.

Btw, as I asked in the YouTube comments: I wonder how much of this could be done with MIP map hardware? It already does slice selection and gradient computation for you.

I wrote an FAQ here. Mostly to handle the barrage of questions and suggestions in YouTube comments, but others might find it interesting too.
https://github.com/runevision/Dither3D/discussions/12
@runevision By the way, Hi-Fi Rush did successfully use triplanar mapping for their surface-stable half-tone dots shader but it's only used for parts of the environment and so most surfaces are already naturally aligned to the three planes and the transitions therefore aren't very noticeable or distracting. Obviously a different problem than yours since they're not trying to use it as a wholesale rendering technique, just as an effect.
@pervognsen Thanks, just watched the talk. Right, seems like they don't even do triplanar blending but just pick the closest aligned axis. I can definitely see that working for well-controlled use cases. It's also not fractal of course - quite a different problem, as you say.
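The axis-picking variant described above is simple enough to sketch. This is a toy illustration of the general idea, not Hi-Fi Rush's actual shader; the function name and conventions are made up:

```python
import numpy as np

def pick_triplanar_uv(position, normal):
    """Pick UVs from the single projection plane whose axis best
    matches the surface normal, with no blending between planes."""
    axis = int(np.argmax(np.abs(np.asarray(normal))))
    p = np.asarray(position, dtype=float)
    if axis == 0:        # X-dominant normal: project onto the YZ plane
        return p[1], p[2]
    if axis == 1:        # Y-dominant normal: project onto the XZ plane
        return p[0], p[2]
    return p[0], p[1]    # Z-dominant normal: project onto the XY plane
```

Because only one plane is ever sampled, there are no blend regions, only hard switches at the 45-degree boundaries, which is why it works best when most surfaces are already axis-aligned.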
@runevision Oh excellent, this looks so good! I tried something similar with hierarchical blue noise but couldn’t quite get it working right. Mine was just curiosity toy though.
@slembcke Thanks! Yeah I’ve been wondering too if it could work for blue noise, but it seems like an insanely over-constrained problem.
@runevision It's so straightforward, I don't know how the Obra Dinn devs didn't think of it. 😅

@Craigp @runevision

Well, the question would go to Lucas Pope @dukope

@scyzoryk @Craigp @dukope Well to get serious, like I say in the video, I don’t actually think my dithering would have been better to use in Obra Dinn. But anyway, hi Lucas! Thanks for inspiring me and hope you like the video and how it covers Obra Dinn dithering.

@runevision @Craigp @dukope

Yeah i don’t see Obra Dinn working as well with this technique. Seems too deliberate a choice for the author

@runevision Incredible work! Looks absolutely fantastic and “Johansen Fractal Dither” is a great name.
@runevision @scyzoryk @Craigp @dukope I haven't watched your video yet, but my guess is this is object space/texture space versus screen space for Obra Dinn. I like the way Obra Dinn looks because it is a proper screen space dithering where every dither point has the same size. Looks really like an old Mac or Hercules graphics game.
@root42 Yep, all that is covered in the video :)
@runevision good to know! Will put it on my watch later list...
@runevision @scyzoryk @Craigp @dukope It would be interesting to know if surface-stable fractal dithering allows recognisable faces (as in Obra Dinn), or if different dithering patterns can be used for different materials!
@runevision Wow, I bet it looks awesome in stereo and VR
@PatHightree I haven’t tried. At first I thought it might look weird at shallow viewing angles on vertical surfaces, where the UV frequency differs per eye, which would cause more dots to be shown in one eye than the other. But actually, for anisotropic frequencies I use the lowest frequency, which should be the same for both eyes, so it should be fine.
@runevision Ah yes, I hadn't thought about whether they'd be coherent for left and right eye. If so, it would produce a LOT of stereoscopic cues which are spatially coherent. Just like how particle systems are major spatial eye candy in VR
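The lowest-frequency argument above can be illustrated with made-up numbers: if the pattern frequency is taken as the smaller of the two screen-space UV derivative magnitudes, the per-eye difference in the compressed direction drops out. A toy sketch, not the actual Dither3D code (a full treatment would use the singular values of the UV Jacobian rather than the screen-axis derivatives):

```python
import numpy as np

def pattern_frequency(ddx_uv, ddy_uv):
    # Use the LOWEST directional frequency: the smaller of the two
    # screen-space UV derivative magnitudes.
    return min(np.linalg.norm(ddx_uv), np.linalg.norm(ddy_uv))

# Vertical wall seen at a shallow angle: the horizontal UV compression
# differs per eye, the vertical derivative does not (invented numbers).
left_eye  = pattern_frequency(np.array([0.50, 0.0]), np.array([0.0, 0.08]))
right_eye = pattern_frequency(np.array([0.35, 0.0]), np.array([0.0, 0.08]))
# left_eye == right_eye, so both eyes pick the same dot density.
```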
@runevision Gaming for e-ink displays!
@aegir @runevision I tested watching the videos on my Boox color tablet. The ghosting is a bit of an issue! Would be pretty interesting to try viewing actual rendered graphics in native resolution on the eink tablet.
@runevision I wonder if it would be possible to implement this on the #Playdate. I've seen some limited 3D games on it before, but it really pushes the limits of the hardware.
@arnelson Right, I don’t have a Playdate and don’t know much about its capabilities, but if others want to test it I’d definitely love to see the results.
@runevision @arnelson my guess: it probably would not fit within the CPU budget (the Playdate does not have a GPU). You can barely do 3D to begin with, and then almost everyone slaps a screen-fixed blue noise texture on it and calls it a day. I did so too here https://aras-p.info/blog/2024/05/20/Crank-the-World-Playdate-demo/
Crank the World: Playdate demo · Aras' website

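The screen-fixed approach mentioned above boils down to thresholding each pixel against a tiled threshold texture. A minimal sketch, with a 4×4 Bayer matrix standing in for a real blue-noise texture so the example is self-contained (function names are made up):

```python
import numpy as np

# Screen-fixed ordered dithering: threshold every grayscale pixel
# against a threshold texture tiled across the screen. A real
# implementation would use a precomputed blue-noise texture; the
# 4x4 Bayer matrix below is a stand-in.
BAYER4 = (1 + np.array([
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
])) / 17.0  # thresholds strictly inside (0, 1)

def screen_dither(gray):
    """gray: 2D array of brightness in [0, 1]; returns 0/1 pixels."""
    h, w = gray.shape
    thresholds = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (gray >= thresholds).astype(np.uint8)
```

Because the threshold texture is fixed in screen space, a mid-gray region always turns half its pixels on, but the pattern "swims" over moving surfaces, which is exactly the artifact surface-stable dithering is meant to avoid.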
@aras @runevision @arnelson yep, there are some concepts in this project that would be useful for making the playdate graphics look nice but the whole shader isn't going to work.
@runevision great explanation, thanks! Love the short two year pause :)
@runevision Woah, that looks incredible! 🤩
@runevision Really nice effect, and excellent explainer video!
@runevision as a variation - you can apply this same concept to draw lines along u and v contours for a surface stable hatching/crosshatched effect
@flightreflex Yeah I've seen some effects a bit like this. Probably my shader could do this just by sampling with the u coordinate set to zero and then the v coordinate set to zero, and combine the results.
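The zeroed-coordinate idea above can be sketched outside a shader. This is a toy Python illustration of surface-stable hatching, not the Dither3D shader; the stripe math and names are invented:

```python
import numpy as np

def hatch(coord, brightness, period=8.0):
    """Stripes perpendicular to one UV axis: darker pixels get
    thicker lines. coord is a surface coordinate (u or v)."""
    # Distance to the nearest stripe center, in units of one period.
    d = np.abs((coord / period) % 1.0 - 0.5)
    return (d < (1.0 - brightness) * 0.5).astype(np.uint8)

def crosshatch(u, v, brightness):
    # Union of the two stripe directions gives a crosshatched look.
    return np.maximum(hatch(u, brightness), hatch(v, brightness))
```

Because the stripes are driven by surface coordinates rather than screen coordinates, they stay glued to the geometry, the same property the dot pattern has.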
@runevision this is super cool omg
@runevision Thanks for making this video and presenting the technique so elegantly! Definitely earned my subscription, and I will reference this if I ever need to write a dithering effect.
@runevision Wow, this is incredibly cool! Vildt fantastisk arbejde!
Teenage Mutant Ninja Turtles Logo Generator

Create your own Teenage Mutant Ninja Turtles-style logo

@vfig Hah, fun! I guess the syllables could line up better if we go with just "Dither"? I could never figure out when to use "dither" vs "dithering" anyway.
http://glench.com/tmnt/#Surface-Stable_Fractal_Dither

@runevision yeah. i had “surface stable fractal dither” running through my head for a bit until i realised what its rhythm reminded me of—but then i checked and saw your title had “dithering”, so went with that for, well, ‘correctness’ i suppose.